I need to scrape emails from [url removed, login to view], street by street. Example: [url removed, login to view],18.041584&z=14&q=%22Torsgatan,%20STOCKHOLM%22;232156865;geo Here you have 765 companies. You have to click through to each company's website, find the company's email address, and put it in an Excel sheet. For the companies that do not have a link, you have to find the website on Google and get the email address from there. The first person to deliver the emails for these streets wins the contest and then works for me daily. I will pay 7 USD/day for 3 new streets per day. Street number 1: [url removed, login to view],18.050795&z=14&q=%22Vegagatan,%20STOCKHOLM%22;232144126;geo Street number 2: [url removed, login to view],18.083496&z=14&q=%22Tyskbagargatan,%20STOCKHOLM%22;231963352;geo Street number 3: [url removed, login to view],18.100834&z=14&q=%22Oxenstiernsgatan,%20STOCKHOLM%22;231980732;geo
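To help bidders scope the work, here is a minimal sketch of the email-extraction step in Python. The function name and regex are my own illustration, not part of the brief; the pattern is deliberately conservative and will miss obfuscated addresses:

```python
import re

# Conservative pattern for common email addresses.
EMAIL_RE = re.compile(r"[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}")

def extract_emails(page_text):
    """Return unique email addresses found in a page's text, in order of appearance."""
    seen, out = set(), []
    for match in EMAIL_RE.findall(page_text):
        addr = match.lower()
        if addr not in seen:
            seen.add(addr)
            out.append(addr)
    return out
```

In practice each company website would be fetched (e.g. with `urllib.request`) and the result written to an Excel/CSV row alongside the company name.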
I need an automated bot that can forward messages from one Discord server to another. The bot should run on a web server and relay data from Discord server A to Discord server B. Discord server A is a third-party server, while Discord server B is under our control. As we cannot run a bot on the third-party server, the bot has to collect the data as a normal Discord user (unless you find a better solution). On the back-end side we need a simple control panel showing information such as running status. I am open to any questions and ideas.
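One caution for bidders: automating a normal Discord user account ("self-botting") is against Discord's Terms of Service, so the reading side needs careful discussion. The posting side into server B, however, is straightforward with a webhook. A minimal sketch (the webhook URL is a placeholder, and the function names are my own):

```python
import json
import urllib.request

# Placeholder webhook URL for Discord server B (the server under our control).
WEBHOOK_URL = "https://discord.com/api/webhooks/<id>/<token>"

def build_payload(author, content):
    """Build a Discord webhook payload that preserves the original author's name."""
    return {
        "username": f"relay: {author}",  # shown as the sender in server B
        "content": content[:2000],       # Discord caps message content at 2000 chars
    }

def forward(author, content):
    """POST one relayed message to server B's webhook; returns the HTTP status."""
    data = json.dumps(build_payload(author, content)).encode("utf-8")
    req = urllib.request.Request(
        WEBHOOK_URL, data=data,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status
```

The control panel could then simply report the timestamp and status of the last successful `forward` call.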
Query Generation: This is a one-sentence English command-writing assignment with a questionnaire survey at the end. The worker will complete this task in an Excel sheet, generating verbal (spoken) English. The task requires you to think creatively and write conversational English as if you were actually speaking (whether to a friend, an AI, or a home-automation device). Details:
- The designated candidate will generate what we call 'queries': natural English commands (similar to Alexa, Siri, etc., but without the wake phrase 'hey Siri' or 'hey Alexa').
- He or she will only populate data in columns E and F (Platform and Query).
- He or she will randomly select a platform from the 'list of platforms' (column D) and add it to cells in column E. In addition, the candidate will write queries in column F based on the platform in column E. (The name of the platform must appear in the query to indicate which platform the command should run on.)
- Being meticulous is crucial in this work. (Excel does not flag spelling errors and anomalies, so mistakes are easy to miss.)
- After the query generation is done, the candidate will answer questions in a Google questionnaire about the queries he or she generated, based on the Actions given.
- Lastly, this assignment requires practical data: write real-world commands/queries that could plausibly be issued in daily life.
We are looking for a developer to build the fastest Shopify monitor/scraper. The monitor must run 24/7 on a web server and must be able to handle 70+ Shopify sites simultaneously. The scraper must then push each restock or newly added product to a Discord webhook. Each newly added product or restock must be pushed in a format that contains a direct add-to-cart link for each size, an image, and the stock quantity (more details can be explained). I'm looking to have the best and fastest monitor coded. After the product is developed, you must also provide support for new Shopify sites and an easy way to update the site list in case a site needs to be added to the monitor.
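For context, most Shopify monitors poll each shop's public `/products.json` endpoint and diff successive snapshots, and `/cart/add?id=<variant_id>` is the standard Shopify add-to-cart permalink. A minimal sketch of the diff step (function names and snapshot shape are my own simplification):

```python
def diff_catalog(prev, curr):
    """Compare two snapshots of a shop's catalog.

    Each snapshot maps product id -> {"title": ..., "variants": {variant_id: available}}.
    Returns (new_product_ids, restocked_variant_ids).
    """
    new_products = [pid for pid in curr if pid not in prev]
    restocks = []
    for pid, prod in curr.items():
        if pid not in prev:
            continue  # already reported as a new product
        for vid, available in prod["variants"].items():
            was_available = prev[pid]["variants"].get(vid, False)
            if available and not was_available:
                restocks.append(vid)
    return new_products, restocks

def atc_link(shop_domain, variant_id, qty=1):
    """Standard Shopify add-to-cart permalink for one variant (size)."""
    return f"https://{shop_domain}/cart/add?id={variant_id}&quantity={qty}"
```

The Discord embed would then list one `atc_link` per size, plus the product image and stock count pulled from the same JSON.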
Your job is to get my projects done. You will HIRE freelancers to get them done. There are various types of projects I may ask you to handle. I will reimburse all money spent on hiring, plus pay you a monthly salary and bonuses. Once we build trust, you will earn a stable income every month. You need to work 10 hours a day, including weekends. You need to be at least somewhat good with technology, and you need to write correct English. Answer the following questions when you bid; otherwise, your bid will be removed immediately. What number comes next? 4, 6, 9, 6, 14, 6, ? What number comes next? 1, 3, 7, 11, 13, ? What number comes next? 0, 9, 36, 81, 144, ?
I’m looking for someone who has experience in web scraping/data mining, with tools and access to LinkedIn, Vrbo, etc., to start ASAP. Please send your resume to [removed by freelancer.com admin] with your rate. Thanks!
A system including RFID-card controls to monitor students' access to a school, issuing alerts in certain situations related to their attendance (within the school and the classroom). It will also include options for detecting specific behaviors of interest to teachers and administrators.
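One of the attendance alerts described above can be sketched very simply: flag students whose card was scanned at the school gate but never scanned into their classroom. The function name and record shape are hypothetical, not from the brief:

```python
def missing_from_class(gate_scans, classroom_scans):
    """Students whose RFID card was seen at the school gate but not in class.

    Both arguments are iterables of student ids; returns a sorted list of
    students who should trigger an alert to teachers/administrators.
    """
    return sorted(set(gate_scans) - set(classroom_scans))
```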
This project is about generating a Big Data strategy for Business Analytics. The phases we want to cover are the review and audit of the data sources (structured and unstructured) and technical support in the creation of a Data Lake. Specific tasks:
• Identify the client's strategic segments and objectives and translate them into KPIs.
• Understand, document, and review the available technology ecosystem.
• Adapt the objectives to the available ecosystem and propose improvements.
• Audit, compile, and integrate various databases.
• Configure the tools for analysis and for tracking progress against the overall objectives.
• Analyze progress, propose improvements, and generate reports.
Required skills and knowledge:
• Extensive experience with cloud solutions (Google, Amazon, Azure) and databases.
• Administration of operating systems and networks.
• Data-center architecture and Data Lake creation (Cloudera, HortonWorks, MapR).
• Familiarity with modern large-scale (Big Data) and/or real-time processing environments: Hadoop/MapReduce, HBase, Scala/Spark, Dataflow, Storm, Flume.
• Knowledge of the Salesforce environment.
• Ability to analyze and present results.
I want to see all the products from the given Instagram accounts, sorted by likes and comments from high to low. The number of likes, the number of comments, and the posting time should be visible, and clicking a picture should open its link. Should it be built as standalone software or as a web page? Thank you!
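The ordering requested above is a simple two-key sort, assuming each scraped post record carries its like and comment counts (the field names here are my own assumption):

```python
def rank_posts(posts):
    """Order post records by likes, then comments, highest first."""
    return sorted(posts, key=lambda p: (p["likes"], p["comments"]), reverse=True)
```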
The target of this small project is a specific webpage hosting a public registry. The registry is searchable by two methods (company name, or company number + location). The goal of this project is to scrape and collect all data from this public registry. Two deliverables are required at the end: 1- A script that can be run on demand (anytime) from a Windows PC to scrape the data off the registry whenever the owner requires. The script should output the data in txt, Excel, or CSV format. A sample of the required file will be shared. 2- The output file itself, so I can check that the script works and the data is correct. The website will be shared with interested freelancers on request. The registry has two search methods to return the data: - by entering a company number (a 4-digit number) and selecting a location from a drop-down menu - by company name (a minimum of 3 characters starts a search) So the script needs a few steps: first, run a search by systematically working through all the search possibilities one by one; second, go to each of the results and scrape the data; third, create the file; and lastly, remove duplicates.
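The enumeration and de-duplication steps above can be sketched as follows; the function names and the `company_number` record key are illustrative assumptions, since the actual registry schema has not been shared:

```python
from itertools import product
from string import ascii_lowercase

def number_searches(locations):
    """Yield every (company_number, location) pair: 0000-9999 per location."""
    for loc in locations:
        for n in range(10000):
            yield f"{n:04d}", loc

def name_prefixes():
    """Yield every 3-letter name prefix (the minimum search length)."""
    for combo in product(ascii_lowercase, repeat=3):
        yield "".join(combo)

def dedupe(records, key="company_number"):
    """Drop duplicate records, keeping the first hit for each company number."""
    seen, out = set(), []
    for rec in records:
        if rec[key] not in seen:
            seen.add(rec[key])
            out.append(rec)
    return out
```

Each generated search term would be submitted to the site, every result page scraped, and the merged records passed through `dedupe` before writing the final CSV/Excel file.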
Hello, I'm looking for a developer to create a rotating proxy API for me. The API needs to be supplied with fresh, tested proxies. Here's an example of what I want: [url removed, login to view] The API also needs to be private, for my personal use only.
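The core of such a service is a pool that hands out tested proxies in rotation, gated by a key so the API stays private. A minimal sketch under those assumptions (class and method names are mine; the real service would wrap this in an HTTP endpoint and refresh/re-test the proxy list periodically):

```python
import itertools

class ProxyPool:
    """Round-robin over a list of pre-tested proxies, gated by an API key."""

    def __init__(self, proxies, api_key):
        self._cycle = itertools.cycle(proxies)
        self._api_key = api_key  # keeps the API private to one user

    def next(self, api_key):
        """Return the next proxy, or raise if the key does not match."""
        if api_key != self._api_key:
            raise PermissionError("invalid API key")
        return next(self._cycle)
```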
I am looking for a skilled developer/team who can create a complete crawling environment for crawling around 400 different webpages 4-6 times per day (1-2 pages per domain), plus validate the data and send it via API to the client. These are mobile sites, insurance sites, etc. Each page is different, so a separate script is needed for each. An ongoing monitoring and maintenance agreement follows this task.
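With one dedicated parser per page layout, a registry pattern keeps the 400 scripts manageable: each parser registers itself under its domain, and a shared validation step runs before anything is sent to the client API. A minimal sketch (the decorator, the required fields, and the example domain are my own assumptions):

```python
PARSERS = {}

def parser(domain):
    """Register a site-specific parse function under its domain."""
    def register(fn):
        PARSERS[domain] = fn
        return fn
    return register

REQUIRED_FIELDS = ("title", "price", "url")  # hypothetical schema

def validate(record):
    """True only if a parsed record carries every required, non-empty field."""
    return all(record.get(f) for f in REQUIRED_FIELDS)

@parser("example-insurance.com")
def parse_example(html):
    # One dedicated parser per page layout, as the brief requires;
    # a real parser would extract these fields from the fetched HTML.
    return {"title": "Policy A", "price": "99", "url": "https://example-insurance.com/a"}
```

The scheduler would then loop over `PARSERS` 4-6 times per day, fetch each page, and forward only records that pass `validate` to the client's API.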