...ONE, OR MAKE A NEW ONE - AND YOU WILL BE PAID PROMPTLY. LET ME KNOW ASAP PLEASE, THANK YOU! SEE OLD POSTING BELOW: Quick, simple project - looking for a person with an idea to scrape phone numbers from [login to view URL] or [login to view URL] etc. Addresses would be in Excel format - "Street, City, State, ZIP" - and would be used to search for phone numbers by address
Hi. If you have a good writer and a list of websites to post on, share it with me. The list should be an Excel file with columns for URL, DA, traffic, IP, niche, and price, shared on Google Drive. I'm planning to get 500 backlinks in the next 12 months at a fixed price per site, with at least 5 backlinks a week.
We are a well-branded news company looking for someone who is familiar with selling banner advertising for our network of news websites. Job description: reaching out to our inbound client database; a delicate sales process of contacting clients, following up, and providing relevant information. Requirements: a good email pitch and prior experience in digital
The objective is to create a program that identifies all the (allowed) webpages on each website in a given list, using Scrapy or any other tool. Note: if a website restricts crawling of certain pages, the program must not traverse those pages. The page count for each website should appear in a summary list.
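A minimal sketch of the two pieces this posting asks for - respecting crawl restrictions and producing a per-website summary count. With Scrapy the restriction is a single setting (`ROBOTSTXT_OBEY = True`); the standard-library helpers below show the same check and the summary for any other tool. The example robots.txt rules in the usage note are hypothetical.

```python
from collections import Counter
from urllib import robotparser
from urllib.parse import urlparse

# With Scrapy, restriction handling is one line in settings.py:
#   ROBOTSTXT_OBEY = True   # disallowed pages are skipped automatically
# The stdlib equivalent of that check, for any other crawler:

def allowed(rp: robotparser.RobotFileParser, url: str, agent: str = "*") -> bool:
    """True if the site's robots.txt permits fetching `url` for `agent`."""
    return rp.can_fetch(agent, url)

def summarize(crawled_urls):
    """Per-website page counts for the summary list, keyed by domain."""
    return Counter(urlparse(u).netloc for u in crawled_urls)
```

For example, a robots.txt containing `Disallow: /private/` would make `allowed()` return False for any URL under `/private/`, and `summarize()` collapses the crawled URLs into one count per domain.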
I want a [login to view URL]-like site, integrated with a user/social community style, so that we can rate news from the staff's and users' viewpoints using criteria - similar to product ratings, but for news articles, papers, and videos - using WordPress.
I have an easy scrape script I need written in Python. I don't know much coding myself, but I'd like to have a script written for me in Python so that I can run it myself in Sublime whenever I wish to grab the data. - All the screen scrape needs to do is count the number of merchants on a company's website. - There are 5 links I need a "merchant count" for.
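A sketch of what such a counting script might look like, using only the standard library. The page structure is unknown, so the `class="merchant"` selector below is purely an assumption to be replaced after inspecting the real pages.

```python
import urllib.request
from html.parser import HTMLParser

class MerchantCounter(HTMLParser):
    """Counts elements whose class list contains 'merchant'.
    The class name is an assumption - inspect the actual page
    markup to find the element that wraps each merchant entry."""
    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        classes = dict(attrs).get("class", "")
        if "merchant" in classes.split():
            self.count += 1

def count_merchants(html: str) -> int:
    parser = MerchantCounter()
    parser.feed(html)
    return parser.count

def merchant_count_for(url: str) -> int:
    """Fetch one of the 5 links and return its merchant count."""
    html = urllib.request.urlopen(url).read().decode("utf-8", "replace")
    return count_merchants(html)
```

Running `merchant_count_for()` over the 5 links would then give one count per page.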
I will give you an Excel file with links to 100 websites and require you to scrape the required data based on the categories listed in the file. For example: [login to view URL] - product page - scrape the product name, product ID, image, colour, fabrics, descriptions, and so on for all the websites we provide.
Needs to hire 3 Freelancers. We are looking to bring aboard someone who can help us build a news website that aggregates many other sources of news and presents them nicely on our site. The idea is to build something similar to this: [login to view URL]
The candidate should promote our website and our trending daily posts on social media platforms (Facebook, Twitter, etc.).
I am looking for an experienced writer with good writing skills - someone who can research and write quality, creative articles that will attract my readers' attention. Most of the topics are about health, food, automotive, real estate, art & entertainment, and sports. This is an ongoing project, so I need someone who can work with us continuously.
Hi, I would like a web app that does the following: - Add product URLs linked to Amazon ASINs to the web app via CSV upload. - Scrape the product URLs of the different websites into a CSV file (product price, shipping, product title, image, in stock/out of stock). - Export everything to a CSV file. - Convert the CSV file to TXT. - Upload the TXT into
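The export and conversion steps of this pipeline can be sketched with the standard `csv` module. The field names and the tab separator for the TXT output are assumptions; the posting does not specify either.

```python
import csv

# Assumed field order, taken from the posting's list of scraped values.
FIELDS = ["product_price", "shipping", "product_title", "image", "stock_status"]

def write_csv(rows, path):
    """Export scraped rows (list of dicts keyed by FIELDS) to a CSV file."""
    with open(path, "w", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        writer.writeheader()
        writer.writerows(rows)

def csv_to_txt(csv_path, txt_path, sep="\t"):
    """Convert the CSV to a plain TXT file, one tab-separated row per line."""
    with open(csv_path, newline="", encoding="utf-8") as f, \
         open(txt_path, "w", encoding="utf-8") as out:
        for row in csv.reader(f):
            out.write(sep.join(row) + "\n")
```

The final upload step would depend on where the TXT file is meant to go, which the posting leaves unspecified.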
... Once all the pasting is done, the tool must run a macro which is in Sheet 2 and then output the Excel workbook under a particular name, as per the project name. The good news is I have already written the VB code to pull the data from a locally saved Excel sheet and a locally saved Word doc. It just needs to be soft-coded so that it can take up any
We wish to scrape product stock status from a supplier's webshop. Every day we need the stock levels of all products scraped. The stock level is indicated by a red cross image or a green check image; we only need "yes" when there is a green check icon and "no" when the product has a red cross icon. We can provide a list of SKUs that can be searched on.
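The icon-to-answer mapping at the core of this task can be sketched by classifying the icon's image source. The filename patterns below are hypothetical - the real webshop's icon names must be confirmed by inspecting a product page.

```python
import re

# Hypothetical icon filename patterns - inspect the supplier's
# pages to find the real src values for the two icons.
GREEN = re.compile(r"check|green|in[-_]?stock", re.I)
RED = re.compile(r"cross|red|out[-_]?of[-_]?stock", re.I)

def stock_status(img_src: str) -> str:
    """Map an icon's image src to 'yes' (green check) / 'no' (red cross)."""
    if GREEN.search(img_src):
        return "yes"
    if RED.search(img_src):
        return "no"
    return "unknown"  # flag for manual review rather than guessing
```

The daily job would then look up each supplied SKU, find the stock icon's `src`, and record the yes/no answer.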
...management, file management, and other CRM functionality. - Basic social networking features for some users beyond Consumer-to-Business (chat between "business users", message board, news feed, internal promotions, dashboard, events, jobs). - Ability for a digital "file box" to file items such as images, files, "items", notes, manuals, bills, invoices, etc. - M...
I'm looking to be able to import a list of Whoscored URLs into some sort of tool that will extract the match commentary. The data can be exported into either one workbook or one workbook per match. I've attached a workbook with an example of what the data should look like (it doesn't have to be exactly the same). Raw output in another worksheet would be good as well. Please see below ur...
... I'm looking for a professional stock market ghostwriter to work with me on a daily basis to write articles with a big focus on the Australian stock market, plus news articles topical to the Australian stock market and worldwide finance/stock market news. Someone who is able to write about a wide variety of finance topics in a factual way and create
We are looking to have a program created to scrape sales data from the various e-commerce platforms we use and place that data into an Access database.
I have 15,000 news articles that I need categorized into one (or more) of ~250 categories. The timeline to complete this is 2-3 weeks, so this will likely require multiple people to complete in that timeframe. I will provide the articles, the categories, and definitions of those categories. I will also be available to help understand what each category
I need a website scraped on a weekly basis, with the resulting data stored in a MySQL database. The website is [login to view URL]. I need someone who is an expert at building scrapers - there is a lot of scraping involved.
I need a certain real estate website ([login to view URL]) scraped on a weekly basis. Please be aware that we expect around 2 million results per scrape, so please bid only if you are able to handle this amount of data. You will get 128 query strings from me where you start scraping down. The resulting data should be stored in a MySQL database. This is
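At roughly 2 million rows per weekly scrape, the storage side benefits from batched inserts with a commit per batch rather than one row at a time. A sketch of that pattern, using sqlite3 as a stand-in so it runs anywhere; for production, swap in a MySQL driver such as PyMySQL with the same `executemany` call. The table schema is an assumption - the posting does not say which fields are scraped.

```python
import sqlite3

# Assumed schema - adjust columns to the fields actually scraped.
DDL = """CREATE TABLE IF NOT EXISTS listings (
    scraped_at TEXT, url TEXT, price TEXT, address TEXT
)"""

def store(conn, rows, batch=10_000):
    """Insert scraped rows in batches, committing per batch so a
    ~2M-row run never holds one giant transaction open."""
    conn.execute(DDL)
    cur = conn.cursor()
    for i in range(0, len(rows), batch):
        cur.executemany("INSERT INTO listings VALUES (?, ?, ?, ?)",
                        rows[i:i + batch])
        conn.commit()
```

Each of the 128 query strings would feed its results into `store()` as the crawl of that slice completes.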
...sources on the internet and write news articles to be used on our blog. Category: WWII/History/Vintage news. If you have an interest in vintage topics and World War 2, that is a big bonus, as all our articles have to do with that. We need 2-6 articles rewritten every day (7 days per week), but feel free to do more per day. Articles should be 600 words long. No ar...