PART A ($70)
We need to import a very large JSON data file containing millions of records. Apparently mongoimport has a limit of 100,000 records per batch, according to this (see the "Batches" section): [login to view URL]
Is there an easy way to import millions of records into a collection, and what is the fastest way to do it? Write a small, simple program that reads a JSON-format text file and loads millions of records (garbage test data) into a collection.
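For reference, a minimal sketch of the kind of loader we have in mind, in Python. It assumes the input file is newline-delimited JSON (one document per line), that pymongo is installed, and that mongod is reachable on localhost:27017; the database/collection names "test"/"records" and the batch size are placeholders to be adjusted:

```python
import json
from itertools import islice

def read_batches(path, batch_size=10_000):
    """Yield lists of parsed JSON documents, batch_size at a time,
    without ever holding the whole file in memory."""
    with open(path, "r", encoding="utf-8") as fh:
        while True:
            batch = [json.loads(line) for line in islice(fh, batch_size)]
            if not batch:
                return
            yield batch

def load_into_mongo(path, uri="mongodb://localhost:27017",
                    db="test", coll="records", batch_size=10_000):
    """Stream the file into MongoDB with insert_many.

    ordered=False lets the server continue past individual document
    errors (e.g. duplicate keys) instead of aborting the whole batch,
    which is also generally faster for bulk loads.
    """
    from pymongo import MongoClient  # pip install pymongo
    collection = MongoClient(uri)[db][coll]
    total = 0
    for batch in read_batches(path, batch_size):
        collection.insert_many(batch, ordered=False)
        total += len(batch)
    return total
```

A program along these lines has no fixed record limit: it streams the file and issues repeated insert_many calls, so millions of records only cost time, not memory.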
PART B ($30)
The machine has a second disc that serves as a data disc, and we want all MongoDB data to live on it. In a past install we were forced to keep "$ sudo mongod -f /etc/[login to view URL]" running in a terminal window at all times for MongoDB to use the data disc.
We want the data disc to be the default data location, without having to run this command by hand every time. Is there a way of doing this? How can this work? Make this work.
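For orientation, a rough sketch of the usual approach, assuming the data disc is mounted at /data (adjust to the real mount point) and MongoDB was installed from the official packages, which ship a systemd service unit:

```shell
# 1. Create a data directory on the data disc and give the mongod
#    service user ownership of it (the user is "mongodb" on
#    Debian/Ubuntu packages, "mongod" on RHEL-family packages).
sudo mkdir -p /data/mongodb
sudo chown -R mongodb:mongodb /data/mongodb

# 2. Point mongod at it in its config file (/etc/mongod.conf),
#    in the storage section:
#
#      storage:
#        dbPath: /data/mongodb

# 3. Run mongod as a service so no terminal has to stay open,
#    and start it automatically at boot.
sudo systemctl enable --now mongod
sudo systemctl status mongod   # verify it is running
```

Once the service reads its own config file with dbPath set to the data disc, the manual "sudo mongod -f ..." command is no longer needed.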