I need a robots.txt file for my ecommerce website.
The site is generating duplicate meta description and title tag issues, because separate URLs are created for currency conversion on each page and for pagination within each product category. We need to block those URLs while keeping the category and regular product pages accessible.
We are being penalised by Google for duplicate content.
There are 237 URLs we need to block
169 look like this:
/process/shop/[url removed, login to view]
The rest look like this:
/shop/category/cat3425/[url removed, login to view]
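Since the actual filenames are redacted above, here is only a rough sketch of the kind of robots.txt I have in mind; the Disallow patterns are hypothetical placeholders and would need to be replaced with the real URL structures:

```
# Sketch only: the real filenames were redacted ("[url removed, login to view]"),
# so these Disallow paths are hypothetical placeholders.
User-agent: *

# Hypothetical: currency-conversion URLs generated under /process/shop/
Disallow: /process/shop/

# Hypothetical: pagination URLs under each category directory
# (the * wildcard is supported by Googlebot, though not by every crawler)
Disallow: /shop/category/*/page

# Category and regular product pages stay crawlable by default,
# since nothing above matches them.
```

The exact patterns depend on how the currency and pagination URLs are actually formed, which is why I've listed the two URL shapes above.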
This needs to be done today, so I hope you can help.