We are looking to design and develop a bespoke, content-managed website that enables us to add properties, property images, testimonials, etc. through the back end. There will be a Contact Us form integrated with Google Maps so that enquiries come straight to the back end. The website will be an 8-10 page one, which
Get the details by using the URLs in the Google sheet. There are about 566 URLs in the sheet and you have to get the information as given below. URL : [login to view URL] First Name, Last Name, Middle Name, Agency, Organization, Job Title, Duty Station, Phone, Fax, Email. Easy
I need a list of website URLs along with contact information (phone number preferably, but if there's no phone number then email; if no email then the URL of the contact page with the submit form). It would be provided to me as an Excel file: column 1 is the URL, column 2 is the contact information as outlined above, column 3 IF POSSIBLE I want ones that are in the
We need some minor SEO changes made to our existing site, [login to view URL] For example: 1) Page titles 2) Meta titles 3) Descriptions The keywords we would be looking to improve on include: 'Gyms belfast', 'Gym day pass belfast', 'Gym near me belfast'. We aren't looking for huge changes, just to make sure there are no glaring SEO issues for our google ...
Looking to collect a list of URLs with public pricing for US hospitals (which are required to post them). This is not a complete list of hospitals, but it's a good start: [login to view URL] There are 4,793 records in this list with varying fields. For this project what I need done is to
...below (52 in total) • Grab a list of all of the services they offer - this information might be on the Homepage, “Services” page, “What we do” page, or anywhere else on the website. • Take full-page screenshots, at different screen sizes, of the following pages if available (or if they have a similar name): Main/homepage, Service...
I want a Chrome extension developed which will match the current URL I am visiting in Chrome against a list of participating retailers on the following Virgin Australia frequent flyer shopping portal ([login to view URL]) and alert me that I can earn FF points by going to the portal and clicking through their tracking link. I would also like the extension to tell me how many FF points I will earn wi...
Hello, I want only URLs. More details will be shared with the selected freelancer. For this easy task I will pay 100 INR. Only Indian freelancers should bid. Thank you.
Get all URLs of Custom sizes images in WordPress using WP all Export.
I am looking for a freelancer who is an expert in PHP. I have a problem with generating URLs using a hash: all the URLs come out the same when each of them should be different. I want to generate the URLs from the URL and email address. If you can do it for me for $15 that would be great. When you are done, please upload
I need an on-page SEO evaluation of 2 URLs of a website, assessing all the on-page SEO points a URL should have for one or several keywords. Among the points to evaluate are the following: - Metadata (Title, Description) - HTTP Status Codes - Robots Meta Directives - Page content (density, keyword stuffing, duplicate, thin content, etc.)
I need a very simple UI in Python which gets rates from different websites, then compares all the rates and does some data manipulation in parallel. I will describe the project further in chat. What you need to have: expertise in Python; good hands-on experience with WebSockets, threading and async methods; the ability to deliver the project in a few days; a good understanding of the project.
I need a Python 3.6 script which will: 1) work through a large .txt file list of Google URLs, e.g. [login to view URL] [login to view URL] [login to view URL] 2) Extract the URLs (including SIG= elements) for every page within the above IDs, including the pages before
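The extraction step of the script above can be sketched as follows. The file name and the exact shape of the Google URLs are assumptions, since the real list is only available via the posting; the regex simply keeps the full query string (so any sig= element survives):

```python
import re

# Match a URL up to the first whitespace or quote, keeping query parameters.
URL_RE = re.compile(r"https?://[^\s\"'<>]+")

def extract_urls(text):
    """Return every URL found in the text, query string (e.g. sig=) included."""
    return URL_RE.findall(text)

def extract_urls_from_file(path):
    """Read a large .txt list and pull out all URLs it contains."""
    with open(path, encoding="utf-8") as fh:
        return extract_urls(fh.read())
```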
Needs to be done in 3 days. URLs are provided; you have to put the information on the sheet, 3-6 columns for a single record. It's a very simple and easy task. There will be more work similar to this for whoever completes it on time.
Script or program to record redirected URLs into Excel/CSV and capture basic web CSS elements. I would like to have it running locally on a Win10 machine with internet access, without any additional software installed. It should also be capable of doing the same with Google Sheets online (a nice-to-have feature, but not a must). If you have good ideas, like an app or so, let me
Move websites from one server to another. Some are WordPress and some custom PHP. You should have fast internet for download and upload. Please add "Move website" to your bid so I know you read the project. Budget: 700 for 7 websites.
...proxy setup to point subdomains to horrible URLs with port numbers. I have a box I use to run a Plex server, but it has no Apache, nginx, LiteSpeed or ANY web server software, so the URLs are unmanageable. Example: [login to view URL] (this is fake). I have a VPS that can be used with Virtualmin/Webmin to set up DNS for the domains
[login to view URL]?pagina=2&paginazione=SI [login to view URL] Fields: school name; full address; the city name in a separate column; phone number; website URL; an email address column. For the 1st URL I need the file today. Easy sites, no page limitation or any blocks.
Get headless Chrome (with chromedriver, Selenium, Python) to automatically continue through iframe redirect URLs. Currently headless Chrome stops at a page if there is an iframe; this may be normal behavior. I want an option where it takes the first iframe src on a page (if it exists) and continues on to that URL.
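The "follow the first iframe" step above can be sketched as a pure helper plus a Selenium loop. The parser below is plain stdlib so the logic can be shown without a browser; the Selenium calls are shown only in comments:

```python
from html.parser import HTMLParser

class FirstIframe(HTMLParser):
    """Records the src attribute of the first <iframe> encountered."""
    def __init__(self):
        super().__init__()
        self.src = None

    def handle_starttag(self, tag, attrs):
        if tag == "iframe" and self.src is None:
            self.src = dict(attrs).get("src")

def first_iframe_src(page_source):
    """Return the first iframe's src in the HTML, or None if there is none."""
    parser = FirstIframe()
    parser.feed(page_source)
    return parser.src

# With Selenium (not run here), the continue-through behaviour would be:
#   driver.get(url)
#   src = first_iframe_src(driver.page_source)
#   while src:                 # keep following iframe redirects
#       driver.get(src)
#       src = first_iframe_src(driver.page_source)
```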
Hi, my website does not have SEO-friendly URLs; it is built in PHP by another developer who does not know how to do it. So I need someone to correct both the Product and Search pages to be SEO-friendly. If you have more questions, let me know. To prove you read the description, include the code 1023. Here is a sample of the product page: http://indiceimoveis
I have a spreadsheet with 12,058 names on it and 96,464 [login to view URL] URLs that take you to search pages. You will create a spreadsheet in which the URLs are replaced with the number of search results that appear on that URL's page. For example, cell B2's URL is [login to view URL] . The number of search results on that page is 18382
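The per-page step of that task can be sketched as a small extractor. The real phrasing of the hit count depends on the (unnamed) site, so the pattern below is a placeholder to adapt, assuming text such as "18,382 results":

```python
import re

# Hypothetical pattern: a comma-grouped number followed by the word "results".
COUNT_RE = re.compile(r"([\d,]+)\s+results", re.IGNORECASE)

def result_count(page_text):
    """Return the number of search results announced in the page text, or None."""
    m = COUNT_RE.search(page_text)
    return int(m.group(1).replace(",", "")) if m else None
```

Each cell's URL would then be fetched, its page text passed through `result_count`, and the URL replaced by the returned number.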
You do not need to write any tags; the tags are already written for these webpages. I will send you a list of 400 URLs in Excel, and what I need from you is to send me back an Excel sheet with the title and meta tag for each URL.
...automatically detect and remove URLs from the meta description (before it gets generated). There is one file ([login to view URL]) that is responsible for the meta description generation. You would just need to create a PHP function that detects URLs such as this one: [login to view URL] In the
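A hedged sketch of the requested helper: strip any URLs out of a string before it is used as a meta description. The original task is PHP; the same regex idea ports directly to `preg_replace`:

```python
import re

# Remove http(s):// and www. URLs, together with the whitespace before them.
URL_RE = re.compile(r"\s*(?:https?://|www\.)\S+")

def strip_urls(description):
    """Remove URLs from a meta-description string and tidy the spacing."""
    cleaned = URL_RE.sub("", description)
    return re.sub(r"\s{2,}", " ", cleaned).strip()
```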
Hello, I have a list of images URLs in an excel file, I would like to have a macro file that I can use often to download the images and then in next column put his path that the image has been download, then in the next cell, the path of the image renamed. Thanks to contact me to discuss about it
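The bookkeeping part of that macro can be sketched as follows (shown in Python rather than VBA): given an image URL and a target folder, work out the local path the download would be saved to, plus a renamed copy. The renaming scheme (a row-number prefix) is an assumption:

```python
import os
from urllib.parse import urlsplit

def local_path(url, folder):
    """Path the image would be downloaded to, keeping its original filename."""
    name = os.path.basename(urlsplit(url).path) or "image"
    return os.path.join(folder, name)

def renamed_path(url, folder, row):
    """Path for the renamed copy, numbered by spreadsheet row (hypothetical scheme)."""
    _, ext = os.path.splitext(urlsplit(url).path)
    return os.path.join(folder, "img_{:04d}{}".format(row, ext or ".jpg"))

# The download itself would be e.g.:
#   urllib.request.urlretrieve(url, local_path(url, folder))
```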
I have a simple PHP crawler for a single URL; it crawls and saves records into a DB. Now we need a freelancer who has skills in PHP crawling work, to update the source code to crawl multiple URLs.
We have a list of 4000 companies (names, URLs) in which there are several duplicates. Your task: find the duplicates and group them next to each other, and remove the non-duplicates.
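A sketch of that de-duplication pass: group companies by a normalised domain so duplicates end up next to each other, then keep only the groups that actually contain more than one row. The (name, url) tuple shape is an assumption about the spreadsheet layout:

```python
from collections import defaultdict
from urllib.parse import urlparse

def normalise(url):
    """Lower-case the hostname and drop a leading www. so variants match."""
    host = urlparse(url if "//" in url else "//" + url).hostname or ""
    return host.lower().removeprefix("www.")

def duplicate_groups(companies):
    """companies: list of (name, url) tuples -> list of duplicate groups."""
    groups = defaultdict(list)
    for name, url in companies:
        groups[normalise(url)].append((name, url))
    return [rows for rows in groups.values() if len(rows) > 1]
```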
...remove URLs from the meta tag? I know this is a problem with the vBulletin engine, but I've already asked them about sorting this issue out and they said it was outside their remit. By the way, the 2 PHP files responsible for the meta description tags include it, and they can be seen here: [login to view URL]
I need a PHP crawler for multiple URLs. I need a PHP expert with good knowledge of nested loops and crawling URLs. I need it at a LOW budget.
...PHP files or in .htaccess, rewrite the search results URLs to get an SEO-friendly URL structure on [login to view URL] Example: instead of (before click): [login to view URL]=1&data_location_1[country]=189 after click [login to view URL] I would prefer to get nice friendly URLs such as: (before and after users click on the link
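A minimal mod_rewrite sketch of the idea, assuming Apache and a hypothetical `search.php` taking `page` and `data_location_1[country]` parameters; the real file and parameter names would come from the site itself:

```apache
RewriteEngine On

# Map a friendly path like /search/country-189/page-1
# onto the real query-string URL (internal rewrite, address bar unchanged).
RewriteRule ^search/country-(\d+)/page-(\d+)/?$ search.php?page=$2&data_location_1[country]=$1 [L,QSA]

# Send anyone hitting the old query-string form to the friendly path.
RewriteCond %{QUERY_STRING} (?:^|&)page=(\d+)&data_location_1\[country\]=(\d+)
RewriteRule ^search\.php$ /search/country-%2/page-%1/? [R=301,L]
```

The trailing `?` in the redirect target drops the old query string so the friendly URL is what search engines index.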
... Under the box will be a button called "Get Title Tag Links". The box will allow for up to 5000 URLs to be entered. When someone enters URLs and then clicks the button, the tool will go to all the URLs listed, grab the title tag of each URL, and make the title tag the link to the page. See the attachment. I would like this tool done
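The core of that tool can be sketched as follows: given a page's HTML, pull out the `<title>` text and use it as the link text for that URL. Fetching the up-to-5000 URLs themselves (and any politeness delays) is left out:

```python
from html.parser import HTMLParser

class TitleGrabber(HTMLParser):
    """Collects the text inside the page's <title> element."""
    def __init__(self):
        super().__init__()
        self._in_title = False
        self.title = ""

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self._in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

def title_tag(html):
    """Return the <title> text of an HTML document, stripped of whitespace."""
    parser = TitleGrabber()
    parser.feed(html)
    return parser.title.strip()

def title_link(url, html):
    """Return an <a> element whose link text is the page's title tag."""
    return '<a href="{}">{}</a>'.format(url, title_tag(html) or url)
```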
I've got a couple of postback URLs. We need to post their parameters into a MySQL database when a hit comes in.
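The core of such a postback handler can be sketched as: parse the query parameters off an incoming hit and insert them as one row. SQLite stands in for MySQL here so the snippet is self-contained; with MySQL the INSERT and the placeholder style are the same idea. The column and parameter names (`click_id`, `payout`, `status`) are assumptions:

```python
import sqlite3
from urllib.parse import urlsplit, parse_qsl

def parse_postback(url):
    """Return the postback URL's query parameters as a dict."""
    return dict(parse_qsl(urlsplit(url).query))

def make_db():
    """In-memory stand-in for the MySQL table (hypothetical schema)."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE postbacks (click_id TEXT, payout TEXT, status TEXT)")
    return conn

def store_hit(conn, url):
    """Insert one incoming hit's parameters as a row."""
    p = parse_postback(url)
    conn.execute(
        "INSERT INTO postbacks (click_id, payout, status) VALUES (?, ?, ?)",
        (p.get("click_id"), p.get("payout"), p.get("status")),
    )
    conn.commit()
```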
Looking for someone to rewrite the search results URLs to get an SEO-friendly URL structure. Example: instead of (before click): [login to view URL]=1&data_location_1[country]=189 after click [login to view URL] I would prefer to get nice friendly URLs such as: (before and after users click on the link.) matrimo.
Looking for someone to rewrite some URLs. Example: instead of (before click): [login to view URL]=1&data_location_1[country]=189 after click [login to view URL] I would prefer to get nice friendly URLs such as: (before and after users click on the link.) [login to view URL] Another example: instead
...a dynamic sitemap that is updated and sent to search engines. The correct structure of the site is index/category/product. Redirect all other URLs to this structure (not product?=tags, product?=search, index/product or anything like that). Only SEO-friendly URLs with the structure index/category/product; the rest redirected and canonicalised. In the sitemap
10-12 hours. Create a copy of the customer's website on your server, debug, and fix the PrestaShop functionality: 1) all pages working with SSL and 2) friendly URLs working. 1) is priority 1 and 2) is priority 2.
Hi, I have a PrestaShop instance at 1.5.5. The "friendly URLs" feature will not work, and I need all pages to redirect to https:// . There is a valid cert in place. I need someone to sort these two issues out. Please show experience and credentials. I am happy to agree an hourly rate. I am an IT professional but just do not have