We deal with a huge Solr dataset: a very large row count (10 billion rows) but a modest total size (about 200 GB). We are trying to index it into Solr as fast as possible from a MySQL database.
The Solr importer is custom and works very well, and MySQL is not the bottleneck. The issue is that Solr does not commit the documents fast enough: it hangs and then stops after about 100 million rows.
We can easily sustain 100,000 rows per second through every part of the pipeline except the actual submit/index step into Solr. More information can be found here:
[url removed, login to view]:changehistory-tabpanel
We need urgent help and advice from an expert who has used Solr to index data at this sort of scale and speed before.
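As an illustration of the client-side pattern usually recommended for bulk loads like this, here is a minimal sketch that sends documents in large batches and avoids committing per batch. It assumes the third-party `pysolr` client and a hypothetical core URL; the batch size and field names are illustrative only:

```python
import itertools

def chunked(rows, size):
    """Yield lists of up to `size` rows from any iterable."""
    it = iter(rows)
    while True:
        batch = list(itertools.islice(it, size))
        if not batch:
            return
        yield batch

def index_all(solr, rows, batch_size=20_000):
    """Stream rows into Solr in large batches with no per-batch commit.

    A single hard commit at the end (or Solr's own autoCommit) is far
    cheaper than committing on every batch, which is a common cause of
    bulk loads stalling.
    """
    for batch in chunked(rows, batch_size):
        solr.add(batch, commit=False)  # buffer on the Solr side
    solr.commit()                      # one explicit commit at the end

# Usage sketch (assumes pysolr is installed and a core exists):
#   import pysolr
#   solr = pysolr.Solr("http://localhost:8983/solr/mycore", timeout=120)
#   rows = ({"id": str(i), "value_s": "x"} for i in range(100_000))
#   index_all(solr, rows)
```

The key design choice is that commit frequency is controlled in exactly one place, rather than implicitly on every request.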
8 freelancers are bidding on average $1434 for this job
Hi, I hope you are fine. You might have heap and transaction log problems. For an explanation of my approach to slow SolrCloud runs, kindly see Personal Message. Have a nice day. Kind Regards, ProgrammedFuture
Are you using autocommit for some particular reason? In your case it would make more sense to take ownership of this important step, and tune it according to your needs (Solr offers many ways to do that).
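For reference, that tuning is normally done in `solrconfig.xml`. A minimal sketch follows; the interval values are illustrative starting points, not recommendations for this workload:

```xml
<updateHandler class="solr.DirectUpdateHandler2">
  <!-- Hard commit: flush segments to disk periodically so the
       transaction log stays bounded, but do not open a new
       searcher each time (expensive during bulk indexing). -->
  <autoCommit>
    <maxTime>60000</maxTime>          <!-- ms -->
    <openSearcher>false</openSearcher>
  </autoCommit>
  <!-- Soft commit: controls how soon new documents become
       visible to searches; keep this infrequent during a load. -->
  <autoSoftCommit>
    <maxTime>300000</maxTime>         <!-- ms -->
  </autoSoftCommit>
</updateHandler>
```

With `openSearcher=false` on hard commits, durability and search visibility are decoupled, which is usually what you want when pushing billions of documents.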