Fix the URL issues on a dynamic site

Project Budget (USD): $250 - $750

Project Description:
The following message was sent to me by Google:

Googlebot encountered problems while crawling your site.

Googlebot encountered extremely large numbers of links on your site. This may indicate a problem with your site's URL structure. Googlebot may unnecessarily be crawling a large number of distinct URLs that point to identical or similar content, or crawling parts of your site that are not intended to be crawled by Googlebot. As a result Googlebot may consume much more bandwidth than necessary, or may be unable to completely index all of the content on your site.

Common causes of this problem:

Problematic parameters in the URL. Session IDs or sorting methods, for example, can create massive amounts of duplication and a greater number of URLs. Similarly, a dynamically generated calendar might generate links to future and previous dates with no restrictions on start or end dates.
Additive filtering of a set of items. Many sites provide different views of the same set of items or search results. Combining filters (for example, show me hotels that are on the beach, are dog-friendly, AND have a fitness center) can result in a huge number of mostly redundant URLs.
Dynamic generation of documents as a result of counters, timestamps, or advertisements.
Broken relative links. Broken relative links can often cause infinite spaces. Frequently, this problem arises because of repeated path elements, where a broken link resolves to ever-deeper paths such as /category/category/category/page.html.
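A quick way to spot the repeated-path-element pattern in a list of crawled URLs is a short script along these lines (a sketch only, not part of the project; the example URLs are hypothetical):

```python
# Flags URLs whose paths repeat the same segment several times in a row,
# a common symptom of broken relative links creating an infinite space.
from urllib.parse import urlparse

def has_repeated_segments(url, min_repeats=3):
    """Return True if any path segment repeats min_repeats or more times consecutively."""
    segments = [s for s in urlparse(url).path.split("/") if s]
    run = 1
    for prev, cur in zip(segments, segments[1:]):
        run = run + 1 if cur == prev else 1
        if run >= min_repeats:
            return True
    return False

# Illustrative checks (hypothetical URLs):
print(has_repeated_segments("http://example.com/discuss/category/category/category/page.html"))  # True
print(has_repeated_segments("http://example.com/blog/2009/05/post.html"))  # False
```

Running this over the site's access log or a crawl export would quickly surface any infinite spaces before the link structure is fixed.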
Steps to resolve this problem
To avoid potential problems with URL structure, we recommend the following:

Whenever possible, shorten URLs by trimming unnecessary parameters. Use the Parameter Handling tool to indicate which URL parameters Google can safely ignore. Make sure to use these cleaner URLs for all internal links. Consider redirecting unnecessarily long URLs to their cleaner versions or using the rel="canonical" link element to specify the preferred, shorter canonical URL.
Wherever possible, avoid the use of session IDs in URLs. Consider using cookies instead. Check our Webmaster Guidelines for additional information.
If your site has an infinite calendar, add a nofollow attribute to links to dynamically created future calendar pages.
Check your site for broken relative links.
If none of the above is possible, consider using a robots.txt file to block Googlebot's access to problematic URLs. Typically, you should consider blocking dynamic URLs, such as URLs that generate search results, or URLs that can create infinite spaces, such as calendars. Using wildcards in your robots.txt file can allow you to easily block large numbers of URLs.
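As an illustration of the wildcard blocking mentioned in the last step, a robots.txt along these lines could keep Googlebot out of search-result, session-ID, and calendar URLs (the paths and the parameter name are hypothetical, not taken from the actual site):

```
User-agent: Googlebot
# Block dynamically generated search-result pages (hypothetical path)
Disallow: /search
# Block any URL carrying a session-ID parameter (hypothetical parameter name)
Disallow: /*?sessionid=
# Block the infinite calendar (hypothetical path)
Disallow: /calendar/
```

Googlebot supports the * wildcard in robots.txt Disallow rules, which is what makes blocking whole classes of parameterized URLs practical here.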

I need someone to find and fix all of the URL issues on the site.

Skills required:
HTML, JavaScript, MySQL, PHP, SEO

Bids received:
$750 in 15 days
$1000 in 8 days
$250 in 4 days
$300 in 7 days
$888 in 7 days
$400 in 30 days
$250 in 4 days
$400 in 999 days
$300 in 2 days
$800 in 10 days