I am looking for someone to complete a partially built web query that downloads a series of tables from various pages of a website. I have used VBA to create the query, but I am struggling to link it to a list of URLs on the front sheet so it can loop through them and collect the data. I already have it set up with the URLs hard-coded into each query, and it works, but each time the URLs change it is a long process of re-setting every query. I can send the current .xlsm spreadsheet before quoting if required.
Instructions for use:
1. Open Data > Connections > Connection99 > Edit web query.
2. This will take you to the login page of the website (an additional job is to create an auto-login, but this can be priced separately).
3. U/name: [login to view URL] (AT) hikebikeandride (DOT) com (I have used this format because Freelancer blocks email addresses). P/W: forsRS2000
4. Close the webpage and the Connections dialog.
5. Click the button on the front page and the macro will collect the data into sheets that it creates itself.
6. Once all the URLs have been scraped, it combines the data into one sheet and saves it as a .csv file.
7. The .xlsm closes without saving, leaving it empty and ready for the next use.
I want to use the URLs listed on the front page rather than having to hard-code them into the VBA (which is a pain), so we can simply update the front page as the site adds or deletes pages.
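For reference, the loop I have in mind could look roughly like this. It is only a minimal sketch with assumed names: the front sheet is called "Front", the URLs start in cell A2 and run down column A, and each page's tables land on a freshly created sheet. The real workbook's sheet names, login handling, and combine-to-CSV step would need to be wired in.

```vba
' Hypothetical sketch: loop through the URLs listed in column A of the
' front sheet and pull each page's tables into its own new sheet.
' Sheet name "Front" and range A2:A... are assumptions, not the real layout.
Sub CollectAllUrls()
    Dim front As Worksheet, dest As Worksheet
    Dim cell As Range, url As String

    Set front = ThisWorkbook.Worksheets("Front")

    ' URLs assumed to start in A2 and run down to the last filled cell
    For Each cell In front.Range("A2", front.Cells(front.Rows.Count, 1).End(xlUp))
        url = Trim(cell.Value)
        If Len(url) > 0 Then
            Set dest = ThisWorkbook.Worksheets.Add
            With dest.QueryTables.Add(Connection:="URL;" & url, _
                                      Destination:=dest.Range("A1"))
                .WebSelectionType = xlAllTables   ' grab every table on the page
                .Refresh BackgroundQuery:=False   ' wait for each query to finish
            End With
        End If
    Next cell
End Sub
```

With something like this in place, changing the scraped pages is just a matter of editing the URL list on the front sheet, with no VBA edits required.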
Looping through a list of URLs is a simple thing to do. If that is all, I'd do it right now. However, please send me the file so I can verify. Regards, Svet.