1. Sourcefire Snort and Suricata, packet decoding, packet captures (PCAPs)
2. JavaScript, Java, Apache Pig, Python, web applications and CSS (Cascading Style Sheets), web front-end design
3. Cloudera/Apache Hadoop / MapReduce / HDFS / Amazon EC2 / EMR / Oozie / Hive / HBase
Brief: Build a graphical, CSS-styled web page packet capture analyzer tool with the functionality of a network forensic analysis tool.
Details: Design and build a web front-end graphical interface that includes widgets, graphics, charts, and bar graphs that are resizable and zoomable.
The front end will consist of 20 pages that launch various Java, PHP, and Pig scripts; these must run against and interact with a Hadoop cluster, with the data stored in HDFS or on Amazon EMR.
Program and write specific scripts to analyze data stored in packet captures: decode packet captures (PCAPs, i.e., raw packet captures), parse out specific parts of the data, correlate the data, and display specific components in a specific manner (a drawing of the required page is included).
Scripts must decode packet captures, take each field of a packet and calculate various metrics, and provide the ability to group PCAPs into projects or string them together.
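For illustration only, here is a minimal Python sketch of that kind of per-field decoding; it assumes the dpkt library and a placeholder file name, and the real decoders, field list, and metrics will come from the provided scripts and page outlines. It walks one PCAP and tallies simple counts by IP protocol and by source/destination pair:

# Minimal sketch, assuming the dpkt library (not one of the provided scripts).
# Decodes a PCAP and tallies simple per-field metrics: packets per IP
# protocol number and per (source IP, destination IP) pair.
import collections
import socket

import dpkt

proto_counts = collections.Counter()
endpoint_counts = collections.Counter()

with open("sample.pcap", "rb") as f:        # placeholder file name
    for ts, buf in dpkt.pcap.Reader(f):     # iterate raw packet records
        eth = dpkt.ethernet.Ethernet(buf)
        ip = eth.data
        if not isinstance(ip, dpkt.ip.IP):
            continue                        # skip non-IP frames
        proto_counts[ip.p] += 1
        endpoint_counts[(socket.inet_ntoa(ip.src),
                         socket.inet_ntoa(ip.dst))] += 1

print(proto_counts.most_common(10))
print(endpoint_counts.most_common(10))

The same loop could be extended to per-port, per-MAC, or per-stream metrics once the exact field list is supplied.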
All widgets on the web pages will operate scripts that run against various open-source projects such as Snort, Suricata, p0f, GraphGL, EMR, Trigram, Cube, Xplico, and a few others; contact me for a complete list.
The programmer will be responsible for completing all aspects of "Phase I" before payment is made. The programmer must also sign an NDA and provide all source code, scripts, and works to me on or before completion of the project. I require an initial Skype conversation prior to awarding this project.
Definitions and Tasks:
1. Data is defined as "packet captures": raw PCAPs (these will be provided to you for testing). The system you design must be able to process a packet capture (PCAP) from 100 MB to 100 TB in size.
2. Set up a Hadoop cluster: a Cloudera 4 (CDH4) cluster (4-6 nodes) on VM systems that will be provided (access via the internet).
3. Install and configure the following: Suricata, Snort, p0f, dns ubigraph, unigram, Choropleth, Pig, and various other packages and tools (as defined*) as part of the package installation. *Specific packages will be provided (by name/source) with exact requirements for proper execution and operation.
4. Advanced web front end: start with 20 pages tied to 20+ Pig/Java scripts that will run against the data, where the data resides on a Hadoop HDFS cluster. These scripts will be run via MapReduce, Hive, HBase, Oozie, or another required function to interact with the "data" stored on a large Hadoop cluster.
5. Rewrite twenty (20) provided open-source Python, Pig, PHP, and Java scripts to perform additional functionality and features (as defined). They must present data from specific PCAPs in a specific manner on each of these CSS-styled web pages. All page outlines will be provided, and the quality, functionality, flow, features, and usability must match exactly what I am requesting; I can provide you with the exact pages I want.
6. Create 10 new Pig/Java/PHP scripts to perform specific functionality and interactions between Hadoop (the PCAP data) and display the results as data output on the web pages: a script to add data (PCAPs) to the Hadoop cluster, and a tracking script to search and display certain fields from each packet capture consisting of hundreds of thousands of packets. Build traffic analytics to identify trends and patterns within the PCAPs, including source/destination, IP, ports, MAC, protocols, and streams, and be able to extract all data (files) located in the packet capture for reassembly (graphics, scripts, web pages); a rough sketch of such an analytics script is included after this list.
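For illustration only, the hypothetical Hadoop Streaming mapper below sketches the traffic-analytics piece of item 6. It assumes a prior decode step has already written one tab-separated packet record per line to HDFS (timestamp, source IP, destination IP, protocol, source port, destination port, length); that record layout, the field order, and all paths are placeholders rather than part of the provided scripts. The mapper emits a per-flow key with a count of 1 so a summing reducer can report packet totals by source/destination, protocol, and port:

#!/usr/bin/env python
# Hypothetical Hadoop Streaming mapper (assumed record layout, not one of
# the provided scripts). Reads tab-separated packet records from stdin and
# emits "<src_ip>|<dst_ip>|<proto>|<dst_port>	1" for a summing reducer.
import sys

for line in sys.stdin:
    fields = line.rstrip("\n").split("\t")
    if len(fields) < 6:
        continue  # skip malformed records
    _, src_ip, dst_ip, proto, _, dst_port = fields[:6]
    key = "|".join((src_ip, dst_ip, proto, dst_port))
    print("%s\t1" % key)

PCAPs themselves could be staged onto the cluster with a command such as hdfs dfs -put capture.pcap /data/pcaps/ (the path is a placeholder) before the decode and analytics jobs run.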
The awarded winner will be provided with the graphical pages, the initial installation details, the initial scripts, and some specific sites (showing examples) of exactly how this system should function and feel.