We offer a CRM that includes a duplicate check for addresses. This duplicate search works correctly, but we are looking for a way to accelerate it.
The acceleration should mainly be achieved by modifying the DB/SQL queries. If necessary, we can also modify the Postgres configuration and discuss other options. The target is a factor-20 speedup for all of the following DB sizes:
10,000 Addresses compared with 100
10,000 Addresses compared with 10,000 (itself)
100,000 Addresses compared with 500
100,000 Addresses compared with 100,000 (itself)
500,000 Addresses compared with 1,000
500,000 Addresses compared with 500,000 (itself)
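For context on the kind of query-level change we have in mind: a naive duplicate check computes a similarity function for every address pair, which no index can assist. One common Postgres technique is the pg_trgm extension with a trigram GIN index, whose % operator can prune candidates via the index. This is a minimal sketch only; the table name "addresses", the candidate set "candidates", the column "addr_norm", and the 0.8 threshold are assumptions, not our actual schema:

```sql
-- Naive pattern (for comparison): computes similarity() for every pair,
-- so it cannot use an index and scales as O(n * m):
--   SELECT a.id, b.id
--   FROM addresses a, candidates b
--   WHERE similarity(a.addr_norm, b.addr_norm) > 0.8;

CREATE EXTENSION IF NOT EXISTS pg_trgm;

-- A trigram GIN index on the normalized address column:
CREATE INDEX IF NOT EXISTS addresses_addr_trgm_idx
    ON addresses USING gin (addr_norm gin_trgm_ops);

-- The % operator matches pairs whose similarity exceeds this threshold:
SET pg_trgm.similarity_threshold = 0.8;

-- % is index-assisted: each candidate probes the GIN index
-- instead of being compared against every stored address.
SELECT a.id, b.id
FROM candidates b
JOIN addresses a ON a.addr_norm % b.addr_norm;
```

Whether this particular approach fits our duplicate-check logic is part of the job; it is shown only to illustrate the level of change we expect.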
If the speedup exceeds a factor of 100, I will double the pay, and I hope to work with you on further projects.
Please bid on this only if you are highly experienced in Postgres/SQL query acceleration and server configuration tuning.
You will get access to a test server with differently sized test DBs and the current Postgres configuration, so that you can suggest a different configuration. Please let me know which information you need in order to calculate your bid.
And again: please only bid if you are highly experienced in this topic.