robots.txt file for ecommerce site ASAP

  • Status Closed
  • Budget N/A
  • Total Bids 9

Project Description

I need a robots.txt file for my ecommerce website.

The website is creating duplicate meta description and title tag issues: extra URLs are generated for currency conversion on each page, and for pagination within each product category. We need to block those URLs while keeping the category and regular product pages accessible.

We are being penalised by Google for duplicate content.

There are 237 URLs we need to block.

169 are like this:

/process/shop/[url removed, login to view]

The rest look like this:

/shop/category/cat3425/[url removed, login to view]
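A minimal robots.txt covering both patterns might look like the sketch below. The actual file names were removed from the posting, so the `Disallow` prefixes here are placeholders, assuming the currency and pagination URLs share a recognisable path prefix:

```
User-agent: *
# Placeholder prefixes -- substitute the real URL patterns
Disallow: /process/shop/currency
Disallow: /shop/category/cat3425/page
```

If the 237 URLs are instead distinguished by a query parameter, Google's robots.txt parser also honours wildcards, e.g. `Disallow: /*?currency=`.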

I need this done today; hope you can help.
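Before deploying, the rules can be sanity-checked so that the blocked patterns really are disallowed while plain category pages stay crawlable. A short sketch using Python's standard-library `urllib.robotparser` (note it matches plain path prefixes only, not wildcards); the `Disallow` prefixes and the `example.com` URLs are hypothetical stand-ins for the removed file names:

```python
from urllib import robotparser

# Hypothetical rules: the real file names were removed from the
# posting, so these Disallow prefixes are placeholders.
ROBOTS_TXT = """\
User-agent: *
Disallow: /process/shop/currency
Disallow: /shop/category/cat3425/page
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# Currency-conversion and pagination URLs are blocked...
print(rp.can_fetch("*", "https://example.com/process/shop/currency-usd"))    # False
print(rp.can_fetch("*", "https://example.com/shop/category/cat3425/page2"))  # False
# ...while the plain category page stays crawlable.
print(rp.can_fetch("*", "https://example.com/shop/category/cat3425/"))       # True
```

Running a check like this over all 237 URLs (plus a sample of pages that must stay indexed) catches an over-broad prefix before Google sees it.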
