I need a script to crawl a very structured website and extract data elements from each page. The data are categorized by US state and then by city within each state. The script will need to crawl each state page, then each city page, and extract the required data to a file. I'd like the file structured so it can be imported into Microsoft Excel, etc. (e.g., CSV).
The script can be written in any language, but I'd prefer Perl since I'm familiar with it and could make future modifications myself if need be... this is not a hard requirement, though.
I know there are lots of packages out there that do web crawling, so it might just be a matter of configuring some existing code to extract the data needed...
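For whoever picks this up, here is a minimal sketch of the state → city → CSV flow, in Python for brevity (a Perl version with `LWP::UserAgent` would follow the same shape). Everything here is hypothetical: the URL layout, the page markup, and the "population" field all stand in for the real site's structure, and the `PAGES` dict stands in for live HTTP fetches.

```python
import csv
import io
import re
from html.parser import HTMLParser

# Stand-in for live pages: a real script would fetch these over HTTP.
# All URLs and markup are hypothetical placeholders for the actual site.
PAGES = {
    "/states/": '<a href="/states/tx/">Texas</a><a href="/states/ca/">California</a>',
    "/states/tx/": '<a href="/states/tx/austin/">Austin</a>',
    "/states/ca/": '<a href="/states/ca/fresno/">Fresno</a>',
    "/states/tx/austin/": '<span class="pop">961855</span>',
    "/states/ca/fresno/": '<span class="pop">542107</span>',
}

class LinkParser(HTMLParser):
    """Collect (href, link text) pairs for links nested under a URL prefix."""
    def __init__(self, prefix):
        super().__init__()
        self.prefix = prefix
        self.links = []
        self._href = None

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href", "")
            if href.startswith(self.prefix) and href != self.prefix:
                self._href = href

    def handle_data(self, data):
        if self._href:  # text inside the matched <a> tag
            self.links.append((self._href, data.strip()))
            self._href = None

def links_under(url):
    parser = LinkParser(url)
    parser.feed(PAGES[url])
    return parser.links

def crawl(out):
    writer = csv.writer(out)
    writer.writerow(["state", "city", "population"])  # header row for Excel
    for state_url, state in links_under("/states/"):      # each state page
        for city_url, city in links_under(state_url):     # each city page
            # Extract the data element; the real site needs its own pattern
            m = re.search(r'class="pop">(\d+)<', PAGES[city_url])
            writer.writerow([state, city, m.group(1) if m else ""])

buf = io.StringIO()
crawl(buf)
print(buf.getvalue())
```

Writing CSV with a header row is what makes the output double-clickable in Excel; the crawl itself is just two nested loops over the state and city link lists.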
Thanks,
Bryan
## Deliverables
I'd like to get the output data as part of the final delivery...