I need a Windows application that starts with a base URL, crawls all links from that page, and then checks each of those links in turn.
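For the per-page fetch, a minimal Python sketch (Python runs on Windows 7) could collect most of the report fields described below using only the standard library; fetch_page and the field names are placeholders of mine, not part of the requirement:

    import time
    import urllib.error
    import urllib.request
    from html.parser import HTMLParser

    class LinkAndTitleParser(HTMLParser):
        """Collects href values from <a> tags and the <title> text."""
        def __init__(self):
            super().__init__()
            self.links = []
            self.title = ""
            self._in_title = False

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                for name, value in attrs:
                    if name == "href" and value:
                        self.links.append(value)
            elif tag == "title":
                self._in_title = True

        def handle_endtag(self, tag):
            if tag == "title":
                self._in_title = False

        def handle_data(self, data):
            if self._in_title:
                self.title += data

    def fetch_page(url):
        """Fetch one page and return its row of report fields."""
        start = time.time()
        try:
            with urllib.request.urlopen(url, timeout=30) as resp:
                status = resp.status
                source = resp.read().decode("utf-8", errors="replace")
        except urllib.error.HTTPError as err:
            status, source = err.code, ""
        elapsed = time.time() - start

        parser = LinkAndTitleParser()
        parser.feed(source)
        return {
            "url": url,
            "status": status,
            "title": parser.title.strip(),
            "length": len(source),
            "links_out": len(parser.links),
            "time_to_fetch": elapsed,
            "source": source,       # caller writes this to a local temp file
            "links": parser.links,  # frontier for the crawl loop
        }

Note that the number of links in to a page can only be tallied after the whole crawl finishes, by aggregating the out-links collected here across all pages.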
It should store the source of each page in a local temp file and build a report that contains, for each page: the HTTP status code, the title, the page length, the number of links in, the number of links out, and the time to fetch. The crawl will start with the base URL and work outward from there; it will follow all links within the internal site, but will not follow more than one link outside the base URL. After all links are crawled and the source of each page is stored in local text files, an SEO check will be performed to see whether each page has a TITLE, whether it has description tags, and whether the title exceeds a certain length. There are some other tests specific to our organization that will be detailed later, but they can be accomplished with regular expressions. The app needs to run on Windows 7. Similar applications are AnalogX and Xenu.
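The crawl-boundary rule and the regex-based SEO pass might look like the sketch below; the 65-character title limit and the helper names are my assumptions, to be swapped for the organization's actual values and tests:

    import re
    from urllib.parse import urlparse

    MAX_TITLE_LEN = 65  # placeholder; substitute the real limit

    def is_internal(url, base_url):
        """True if url lives on the same host as the base URL."""
        return urlparse(url).netloc == urlparse(base_url).netloc

    def should_follow(url, found_on_internal_page, base_url):
        """Follow every internal link; follow an external link only
        when it was found on an internal page, i.e. never go more
        than one hop outside the base URL."""
        return is_internal(url, base_url) or found_on_internal_page

    # Deliberately naive patterns; refine as needed.
    TITLE_RE = re.compile(r"<title[^>]*>(.*?)</title>",
                          re.IGNORECASE | re.DOTALL)
    DESCRIPTION_RE = re.compile(
        r'<meta[^>]+name=["\']description["\'][^>]*>', re.IGNORECASE)

    def seo_checks(source):
        """Run the regex-based SEO tests on one page's stored source."""
        title_match = TITLE_RE.search(source)
        title = title_match.group(1).strip() if title_match else ""
        return {
            "has_title": bool(title),
            "has_description": bool(DESCRIPTION_RE.search(source)),
            "title_too_long": len(title) > MAX_TITLE_LEN,
        }

The organization-specific tests mentioned above would slot in as additional compiled patterns checked inside seo_checks.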
It should also export a list of the results, including all of the data collected.
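Assuming a CSV export (the format isn't specified, but CSV opens directly in Excel), the collected rows could be written out like this sketch:

    import csv

    REPORT_FIELDS = ["url", "status", "title", "length",
                     "links_in", "links_out", "time_to_fetch",
                     "has_title", "has_description", "title_too_long"]

    def export_report(rows, path="crawl_report.csv"):
        """Write every collected data point for every page to a CSV file."""
        with open(path, "w", newline="", encoding="utf-8") as f:
            writer = csv.DictWriter(f, fieldnames=REPORT_FIELDS,
                                    extrasaction="ignore")
            writer.writeheader()
            writer.writerows(rows)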