Daily Product QTY Web Scraper
Project Budget: $30 - $250 USD
This project is for the development of a Visual Basic (2010 Express) application that scrapes the QTY of products on a website each day, for the purpose of tracking how many products are sold. The websites this will be used on share a common HTML template, which allows us to use search parameters to "drill down" to the product QTY field. See the attachment "example" for an example of the div used for the product QTY section of the HTML.
If all the whitespace and line breaks were stripped, you could create a filter that works on every product URL by using the logic shown in attachment "logic" (Freelancer strips HTML tags if I write it here).
With this in mind, I need some configuration settings:
- Configure the search strings for finding the product QTY, as shown in the logic example, but I need to be able to search for up to 3 strings, using the end position of each match as the starting point for the next search. If search string 1 or 2 is blank, it is skipped.
- I also need a checkbox setting, "QTY is After Search", on the above configuration. This toggles whether the QTY is found after the last (3rd) search string or before it. For example: "After46" (QTY after) versus "46Before" (QTY before).
- Configure a list of products to track (product_sku, url, retail price, raw_price)
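Since I can't paste the "logic" attachment here, a language-agnostic sketch of the drill-down described above may help (Python for illustration; the actual app is VB, and the search strings shown are made-up examples, not the real template values):

```python
def find_qty(html, searches, qty_after=True):
    """Locate the QTY value by chaining up to three search strings.

    Blank search strings (1 or 2) are skipped; each successful match
    advances the start position for the next search, mirroring the
    "pointer of the last" behaviour in the spec. qty_after corresponds
    to the "QTY is After Search" checkbox. Returns the integer QTY,
    or None if a search string or QTY value is not found.
    """
    # Strip all whitespace and line breaks so one filter works on every URL
    text = "".join(html.split())
    pos = 0
    match_start = 0
    for s in searches:
        if not s:                       # blank search string -> skipped
            continue
        match_start = text.find(s, pos)
        if match_start == -1:
            return None                 # "Search string not found"
        pos = match_start + len(s)
    if qty_after:
        # Collect the digits immediately after the last match
        end = pos
        while end < len(text) and text[end].isdigit():
            end += 1
        digits = text[pos:end]
    else:
        # Collect the digits immediately before the last match
        start = match_start
        while start > 0 and text[start - 1].isdigit():
            start -= 1
        digits = text[start:match_start]
    return int(digits) if digits else None  # None -> "No QTY value found"
```

For example, `find_qty('<div class="qty"> In stock: 46 </div>', ['class="qty">', '', 'stock:'])` first strips whitespace, then chains the two non-blank searches and reads the digits after the final match.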
The application needs to fetch data when manually initiated by a button and apply this logic:
- If QTY decreased from one scrape to the next, then record the QTY of products sold
- If the QTY remains the same from one scrape to the next then record 0 sold
- If the QTY has increased from one scrape to the next, disregard the result (the data is not valid, as the product has been replenished and we can't determine the qty sold). The application must be able to recover from this, though: if on the third scrape the qty has decreased compared to the second scrape, record the qty sold.
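The three recording rules above reduce to comparing each scrape against the previous one, with a replenished scrape becoming the new baseline so the next comparison recovers. A minimal sketch (Python for illustration; the real app is VB and names are mine):

```python
def record_scrape(prev_qty, new_qty):
    """Apply the recording rules for one scrape.

    prev_qty is the QTY from the previous scrape (None on the very
    first scrape, or after "Clear Data"). Returns (qty_sold, baseline):
      - decrease  -> qty sold is the difference
      - unchanged -> 0 sold
      - increase  -> None (replenished, data not valid), but new_qty
        becomes the baseline so the next scrape can recover
    """
    if prev_qty is None:
        return None, new_qty            # first scrape: nothing to compare
    if new_qty < prev_qty:
        return prev_qty - new_qty, new_qty
    if new_qty == prev_qty:
        return 0, new_qty
    return None, new_qty                # replenished: disregard this result
```

For example, scrapes of 50, 47, 60, 55 record: nothing (first scrape), 3 sold, disregarded (replenished), then 5 sold on the third comparison, which is exactly the recovery behaviour described above.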
There needs to be basic error logging. After each scrape, a log page needs to show whether any unusual issues occurred, such as:
- URL not found (for SKU abc)
- Search string not found (for SKU abc)
- No QTY value found (for SKU abc)
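One way the log page could work is to accumulate per-SKU messages during a scrape and keep the most recent one per SKU for the report's "Error" column. A sketch (Python for illustration; message wording taken from the list above, class and field names are mine):

```python
class ScrapeLog:
    """Collects per-SKU issues during a scrape for the post-scrape log page."""

    def __init__(self):
        self.entries = []       # every message, in order, for the log page
        self.last_error = {}    # SKU -> last message, for the report's Error column

    def error(self, sku, message):
        line = f"{message} (for SKU {sku})"
        self.entries.append(line)
        self.last_error[sku] = line

# Example: two issues logged against the same SKU during one scrape
log = ScrapeLog()
log.error("abc", "URL not found")
log.error("abc", "No QTY value found")
```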
There needs to be a "Clear Data" command button to begin collecting fresh data. This only clears the QTY data; all product SKU/URL data is preserved.
Last but not least, the application needs a button "Create Report" which outputs a CSV with the following info:
Product SKU, QTY Sold, Price Per Unit, Total Sales, Raw Price Per Unit, Total Profit, Error
Where QTY Sold is the total qty sold over the sample range
Where Total Profit = (Price Per Unit × QTY Sold) - (Raw Price Per Unit × QTY Sold)
If an error occurred on a product, the last error logged for that product needs to be shown in the "Error" column
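As a sanity check on the report arithmetic, here is a sketch of one report row and the CSV output (Python for illustration; the app is VB. Note the spec does not define Total Sales, so I am assuming Total Sales = Price Per Unit × QTY Sold):

```python
import csv
import io

def report_row(sku, qty_sold, price, raw_price, last_error=""):
    """Build one row for the "Create Report" CSV.

    Total Sales  = Price Per Unit x QTY Sold               (assumed)
    Total Profit = (Price Per Unit x QTY Sold)
                 - (Raw Price Per Unit x QTY Sold)         (per the spec)
    """
    total_sales = price * qty_sold
    total_profit = (price * qty_sold) - (raw_price * qty_sold)
    return [sku, qty_sold, price, total_sales, raw_price, total_profit, last_error]

def write_report(rows, fh):
    """Write the header from the spec, then one row per tracked product."""
    writer = csv.writer(fh)
    writer.writerow(["Product SKU", "QTY Sold", "Price Per Unit", "Total Sales",
                     "Raw Price Per Unit", "Total Profit", "Error"])
    writer.writerows(rows)

# Example: 8 units sold at $20.00 retail, $12.50 raw -> $60.00 profit
buf = io.StringIO()
write_report([report_row("abc", 8, 20.0, 12.5)], buf)
```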