robots.txt file for ecommerce site ASAP
- Status Closed
- Budget N/A
- Total Bids 9
I need a robots.txt file for my ecommerce website.
The website is creating duplicate meta description and title tag issues: URLs are generated for currency conversion on every page, and for pagination within each product category. We need to block those URLs while keeping the category and regular product pages accessible.
We are being penalised by Google for duplicate content.
There are 237 URLs we need to block
169 are like this:
/process/shop/[url removed, login to view]
The rest look like this:
/shop/category/cat3425/[url removed, login to view]
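For reference, a robots.txt along these lines would block the currency-conversion and pagination URLs while leaving the category and product pages crawlable. The exact filenames were removed from the posting, so the path patterns below (everything after `/process/shop/`, and the `page-` prefix) are hypothetical placeholders:

```
# Sketch only: the 237 real URLs were removed from the posting,
# so the patterns below are illustrative placeholders.
User-agent: *

# Block the currency-conversion URLs generated under /process/shop/
Disallow: /process/shop/

# Block pagination URLs inside each category
# (assumes paginated pages share a "page-" filename prefix)
Disallow: /shop/category/*/page-

# Category landing pages and regular product pages remain crawlable.
```

Note that Disallow only stops crawling; it does not guarantee removal of already-indexed duplicates. For duplicate-content penalties, canonical tags or noindex are commonly recommended alongside, though the posting asks for robots.txt only.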
Need this done today, hope you can help.