The following message was sent to me by Google:
Googlebot encountered problems while crawling your site.
Googlebot encountered extremely large numbers of links on your site. This may indicate a problem with your site's URL structure. Googlebot may unnecessarily be crawling a large number of distinct URLs that point to identical or similar content, or crawling parts of your site that are not intended to be crawled by Googlebot. As a result, Googlebot may consume much more bandwidth than necessary, or may be unable to completely index all of the content on your site.
Common causes of this problem:
Problematic parameters in the URL. Session IDs or sorting methods, for example, can create massive amounts of duplication and a greater number of URLs. Similarly, a dynamically generated calendar might generate links to future and previous dates with no restrictions on start or end dates.
Additive filtering of a set of items. Many sites provide different views of the same set of items or search results. Combining filters (for example, show me hotels that are on the beach, are dog-friendly AND have a fitness center) can result in a huge number of mostly redundant URLs.
Dynamic generation of documents as a result of counters, timestamps, or advertisements.
Broken relative links. Broken relative links can often cause infinite spaces. Frequently, this problem arises because of repeated path elements. For example:
[url removed, login to view]
/html/category/community/070413/html/[url removed, login to view]
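The additive-filtering cause above can be sketched numerically: with n independent filters, every non-empty combination can become its own URL, so the URL count grows as 2^n. A minimal sketch, assuming a hypothetical hotel-listing page with a made-up /hotels?f= parameter:

```python
from itertools import combinations

# Hypothetical filters for a hotel-listing page (names are illustrative).
filters = ["beach", "dog-friendly", "fitness-center", "pool", "free-wifi"]

# Every non-empty subset of filters becomes its own URL,
# e.g. /hotels?f=beach,dog-friendly — that is 2**n - 1 combinations.
urls = [
    "/hotels?f=" + ",".join(combo)
    for r in range(1, len(filters) + 1)
    for combo in combinations(filters, r)
]
print(len(urls))  # 31 distinct URLs from just 5 filters
```

Five filters already yield 31 crawlable URLs for what is essentially one page of content; ten filters would yield 1,023.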
Steps to resolve this problem
To avoid potential problems with URL structure, we recommend the following:
Whenever possible, shorten URLs by trimming unnecessary parameters. Use the Parameter Handling tool to indicate which URL parameters Google can safely ignore. Make sure to use these cleaner URLs for all internal links. Consider redirecting unnecessarily long URLs to their cleaner versions or using the rel="canonical" link element to specify the preferred, shorter canonical URL.
Wherever possible, avoid the use of session IDs in URLs. Consider using cookies instead. Check our Webmaster Guidelines for additional information.
If your site has an infinite calendar, add a nofollow attribute to links to dynamically created future calendar pages.
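The nofollow rule for future calendar pages can be sketched as a link-rendering helper. This is an illustrative sketch only; the /calendar/ URL pattern and the function name are assumptions, not the site's actual template code.

```python
import datetime

def calendar_link(day, today=None):
    """Render a calendar day link, marking links beyond today as
    rel="nofollow" so crawlers do not walk into an infinite future."""
    today = today or datetime.date.today()
    rel = ' rel="nofollow"' if day > today else ""
    return f'<a href="/calendar/{day.isoformat()}"{rel}>{day.day}</a>'

today = datetime.date(2009, 7, 1)
print(calendar_link(datetime.date(2009, 7, 2), today))
# <a href="/calendar/2009-07-02" rel="nofollow">2</a>
print(calendar_link(datetime.date(2009, 6, 30), today))
# <a href="/calendar/2009-06-30">30</a>
```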
Check your site for broken relative links.
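One way to check for the broken-relative-link pattern described above is to resolve each href against its page URL and flag paths where a segment repeats, which is the telltale sign of a relative link spiralling into an ever-deeper path. A minimal sketch; the example URLs are hypothetical:

```python
from urllib.parse import urljoin, urlsplit

def looks_like_link_loop(page_url, href, limit=2):
    """Resolve href against the page URL and flag the result if any
    path segment occurs more than `limit` times — a common symptom
    of a broken relative link creating an infinite space."""
    resolved = urljoin(page_url, href)
    segments = [s for s in urlsplit(resolved).path.split("/") if s]
    return any(segments.count(s) > limit for s in set(segments))

# A relative href that should have been absolute keeps re-appending
# the same directory chain on every hop:
page = "http://example.com/html/category/community/070413/html/index.htm"
print(looks_like_link_loop(page, "html/category/community/070413/html/index.htm"))  # True
print(looks_like_link_loop(page, "/contact.htm"))  # False
```

Run over a crawl log or sitemap, this flags the runaway paths so the offending relative links can be made root-relative or absolute.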
If none of the above is possible, consider using a robots.txt file to block Googlebot's access to problematic URLs. Typically, you should consider blocking dynamic URLs, such as URLs that generate search results, or URLs that can create infinite spaces, such as calendars. Using wildcards in your robots.txt file can allow you to easily block large numbers of URLs.
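A wildcard-based robots.txt along those lines might look like the fragment below. The paths are hypothetical placeholders; the real rules would have to match this site's actual URL patterns, and Googlebot supports the * wildcard in Disallow paths.

```
User-agent: Googlebot
# Block internal search results and sorted duplicates (paths are examples)
Disallow: /search
Disallow: /*?sort=
# Block the infinite calendar space
Disallow: /calendar/
```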
I need someone to find and fix all of the URL issues on the site.
30 freelancers are bidding an average of $432 for this job
Dear Sir, We would love the opportunity to work with you. We can guarantee that you will not regret your decision if you select us to execute this project. Regards, NetzPro