Logging & Reporting - Postfix/Amavisd-new - Consolidation, Graphing, Searching


We have a number of mail servers running Postfix and amavisd-new. Each email that comes in generates anywhere from x to x lines of logging data (more if a destination relay is down and we have to periodically retry relaying the email).

We need to support ad-hoc searching of the coalesced data, efficient generation of reports on a per-domain and per-user basis, and storage of historical patterns (i.e. email volume on a per-user and per-domain basis).

We need to gather the logs from the various servers to a central location using something like syslog forwarding (e.g. rsyslog, syslog-ng, Fluentd, or Flume).
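As one possible shape for the gathering step (the brief leaves the exact tool open), here is a minimal rsyslog sketch; `loghost.example.com` and port 514 are placeholders:

```
# On each mail server: forward the mail facility to the central host over TCP
# ("@@" means TCP in rsyslog; a single "@" would be UDP)
mail.*  @@loghost.example.com:514

# On the central host: load the TCP input module and listen
module(load="imtcp")
input(type="imtcp" port="514")
```

syslog-ng, Fluentd, or Flume could fill the same role with equivalent configuration.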
The data should be coalesced and stored in a database. We probably don't want every single line logged, and we need to relate log events that happen over a timespan, so some sort of coalescing or log consolidation has to be done. We are currently considering MongoDB with the Toku storage engine, but are open to other options. We also need something like statsd + Graphite for easy graphing of things such as email traffic and types of messages on a per-domain and per-user basis.
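As a rough illustration of the coalescing step, the sketch below groups Postfix log lines by queue ID into one document per message. The regexes and field names are simplifying assumptions, not a full Postfix log parser, and would need hardening against the real log formats:

```python
import re
from collections import defaultdict

# A Postfix per-message line looks like:
#   ... postfix/smtp[124]: ABC123DEF0: to=<bob@example.org>, status=sent (250 OK)
# The queue ID (ABC123DEF0 here) ties all lines for one message together.
QUEUE_RE = re.compile(r'postfix/[\w-]+\[\d+\]: (?P<qid>[0-9A-F]{6,}): (?P<rest>.*)')
FIELD_RE = re.compile(r'\b(?P<key>from|to|status)=<?(?P<val>[^>,]*)>?')

def coalesce(lines):
    """Group raw syslog lines by Postfix queue ID and merge key=value
    fields into a single dict per message (one document per email)."""
    messages = defaultdict(lambda: {'raw': []})
    for line in lines:
        m = QUEUE_RE.search(line)
        if not m:
            continue  # not a per-message Postfix line; skip
        doc = messages[m.group('qid')]
        doc['raw'].append(line)  # keep originals for audit/debugging
        for f in FIELD_RE.finditer(m.group('rest')):
            key, val = f.group('key'), f.group('val')
            if key == 'to':
                doc.setdefault('to', []).append(val)  # may be many recipients
            else:
                doc[key] = val
    return dict(messages)
```

Each resulting document can then be inserted into the database and counted into statsd as it is emitted.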

We then need to be able to search for messages by sender, recipient, date, subject, message-id, etc.

We will provide some logging data to work with. We need someone to configure the logging software to properly parse the data, store it in a database, and provide us with a number of "ready to go" queries so that we can integrate this into a tool for our users, letting them view their email logs, search for information, and view near-real-time and historical data about their email usage patterns.
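One of the "ready to go" queries could be a per-domain volume report built on MongoDB's aggregation framework. A sketch, again assuming a hypothetical schema with `recipient_domain` and `ts` fields:

```python
def per_domain_daily_volume(domain):
    """MongoDB aggregation pipeline: message count per day for one
    recipient domain. Assumes (hypothetically) that each coalesced
    document carries a 'recipient_domain' string and a 'ts' datetime."""
    return [
        # restrict to the requested domain first so the index can be used
        {'$match': {'recipient_domain': domain}},
        # bucket by calendar day
        {'$group': {
            '_id': {
                'year': {'$year': '$ts'},
                'month': {'$month': '$ts'},
                'day': {'$dayOfMonth': '$ts'},
            },
            'messages': {'$sum': 1},
        }},
        # chronological order
        {'$sort': {'_id': 1}},
    ]
```

Usage would be `db.messages.aggregate(per_domain_daily_volume('example.com'))`; a per-user variant is the same pipeline matched on a recipient field instead.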

We will give great preference to someone who has already worked on large-scale logging systems. We currently generate about 40 million lines of logging data a day and expect that to grow substantially.

Skills: NoSQL Couch & Mongo, Python, Ruby, System Admin


Project ID: #5059267

6 freelancers are bidding an average of $1297 for this job


Hi. We are a group of experienced programmers who have worked with large-scale systems. We have worked with the Zenoss monitoring system (community edition) and have developed plugins (ZenPacks) to extend its monitoring c…

$1250 USD in 15 days
(11 Reviews)

Dear Sir or Madam, I am Reda NOUSHI, Director of Sales in the EMEA region at LogZilla Corporation. LogZilla is the leading log management solution with Big Data support and instant ad-hoc searches. LogZilla is…

$1111 USD in 3 days
(2 Reviews)

Hello, I am a Linux admin and programmer with 10+ years of experience. I also have a master's degree in AI, and my thesis was about Big Data problems, so I know how to handle big databases. Here is a paper of m…

$1263 USD in 20 days
(18 Reviews)

Hi, greetings from N-Tech Technologies Pvt. Ltd.! Thank you for posting this job. We have gone through your requirement specification and are confident we can deliver the best Python solution, as we have experts in…

$1082 USD in 30 days
(1 Review)

Hi, we are a small team with experts familiar with your proposal, and we are already working on similar projects. If you're interested, I'm ready to show you samples. If you chat with me for 2 minutes, I'll provide a soli…

$1300 USD in 14 days
(0 Reviews)

Hello, we are an experienced Ruby on Rails team from Ukraine. Our senior Ruby developer has experience with a large-scale project with 1 million users: http://www.ideeli.com/. So we can collect data with some log gathe…

$1777 USD in 10 days
(0 Reviews)