Your guide to getting data entry done for your business
Data entry is an important task, but choosing the wrong solution can seriously harm your company's productivity.
Data Extraction is the process of extracting data from a variety of sources for further analysis. A Data Extractor is someone who helps businesses and organizations gain insight from their data and create descriptive and predictive models. They specialize in finding patterns and relationships that guide decisions and uncover meaningful information. Through carefully crafted queries and processes, our Data Extractors can transform raw data into a useful format that can be used for reporting, analytics, machine learning and more.
Here are some projects that our expert Data Extractors have made real:
When you partner with an experienced team of Freelancer's Data Extractors you can access valuable insights from your data that can guide decisions, uncover opportunities and create predictive models with new data sources. Our experts can help you unlock deeper insights with advanced filtering methods and complex coding. Explore the full range of possibilities with our talented community of professionals, capable of delivering comprehensive solutions tailored to your needs.
Ready to launch your very own project on Freelancer.com? We invite you to try us out and hire our experienced Data Extractors to make your design goals a reality. Let their creativity, skill, and proficiency bring something special to your project!
From 132,819 reviews, clients rate our Data Extractors 4.9 out of 5 stars.
I need a robust script that automatically collects structured data from a set of public-facing websites I will share once the work begins. The goal is to pull the required fields, normalise them, and hand them back to me in clean CSV or JSON files while keeping the code easy to maintain and rerun. Fields required: Category, Item Name, Size (ML/L), Quantity, ABV, Price, Bottle/Can Deposit, Image URL. Only 600 records. Budget: $5
I have a set of existing spreadsheets filled with plain text records that now need to live in clean, well-structured CSV files. Your task is to pull every row from those sheets, check that the headings stay consistent, and export the results as UTF-8 CSV without introducing any hidden characters or broken line breaks. The spreadsheets are already organised, so no deep data cleansing is required—just tidy up obvious spacing issues, preserve punctuation, and make sure every cell ends up in the correct column position in the final CSV. I will share the sheets in either Google Sheets or Excel; work in whichever environment you prefer as long as the finished deliverable is a set of ready-to-import CSV files. Deliverables • One CSV per source sheet, correctly named and enc...
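The requirements above — tidy obvious spacing, preserve punctuation, no hidden characters, UTF-8 output — can be sketched with nothing but the Python standard library. A minimal sketch; the function names and the set of "hidden" codepoints are illustrative assumptions, not part of the brief:

```python
import csv
import re

# Codepoints that commonly leak out of spreadsheet exports: non-breaking
# space, zero-width space, and a stray BOM. Each is mapped to a plain space,
# then runs of whitespace are collapsed.
HIDDEN = {0x00A0: " ", 0x200B: " ", 0xFEFF: " "}

def tidy_cell(value: str) -> str:
    """Collapse whitespace runs and strip hidden characters, keeping punctuation."""
    value = str(value).translate(HIDDEN)
    return re.sub(r"\s+", " ", value).strip()

def export_rows(rows, path):
    """Write rows as UTF-8 CSV (no BOM); the csv module handles quoting and line breaks."""
    with open(path, "w", encoding="utf-8", newline="") as fh:
        writer = csv.writer(fh)
        for row in rows:
            writer.writerow([tidy_cell(cell) for cell in row])
```

Opening the file with `newline=""` is what prevents the broken line breaks the brief warns about, since the csv module then controls line endings itself.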
I have a newly developed AI-powered document and identity verification platform that is ready for functional testing. The core focus of this testing will be validating the end-to-end workflow, including document upload, user authentication, data extraction, verification processes, and secure document retrieval. Users will be interacting with the system by uploading documents, completing identity verification steps, accessing restricted environments, and retrieving verified outputs. Your role is to design realistic test scenarios that reflect real-world use cases (e.g., visa applicants, students, compliance officers), execute them thoroughly, and ensure that each workflow behaves as expected. You will need to validate that: Documents are uploaded and processed correctly Identity verific...
More details:
• Is this project for business or personal use? For an existing business.
• What information should successful freelancers include in their application? Detailed project proposals.
• How soon do you need your project completed? ASAP.
I am looking for an experienced Apify specialist to execute a highly targeted, small-scale data extraction project. The goal is to collect 200–300 high-signal records from specific professional discussion forums and community platforms. This is not a complex web development project. I have already created an execution plan. Your job is to configure the tools in my workspace, execute the runs cleanly, and normalize the exported data. Scope of Work: Task Setup: Configure specific Apify actors (primarily Web Scraper and Reddit scrapers) directly within my Apify workspace so I retain the assets. Execution: Run pre-defined search queries (which will be provided upon hire) across multiple platforms. Strict Filtering: Apply post-run filters to the dataset. For example, on certain pla...
I need a robust web-scraping solution that automatically collects product information from several e-commerce websites. The focus is on two key data points: • Product name and full description • Customer reviews and ratings Price and availability are not required this time, so the crawler can ignore any endpoints related to stock or cost. Please build the script so I can run it on demand and easily point it at new store URLs in the future. Python with BeautifulSoup, Scrapy, or a similar framework suits me fine, as long as the code is clean, well-commented, and leverages polite scraping practices (respectful delays, user-agent rotation, handling captchas when possible). Deliverables: 1. Working scraper code with clear setup instructions 2. Sample CSV or JSON export c...
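The polite-scraping practices named above (respectful delays, user-agent rotation) might look like this standard-library sketch; the user-agent strings and function names are placeholders, and captcha handling is out of scope here:

```python
import itertools
import time
import urllib.request

# Hypothetical user-agent pool — swap in whichever agents you maintain.
USER_AGENTS = [
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64) ExampleScraper/1.0",
    "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15) ExampleScraper/1.0",
]
_ua_pool = itertools.cycle(USER_AGENTS)

def next_headers() -> dict:
    """Rotate through the user-agent pool, one agent per request."""
    return {"User-Agent": next(_ua_pool)}

def polite_get(url: str, delay: float = 2.0) -> str:
    """Fetch a page, sleeping before every request to stay respectful."""
    time.sleep(delay)
    request = urllib.request.Request(url, headers=next_headers())
    with urllib.request.urlopen(request, timeout=30) as response:
        return response.read().decode("utf-8", errors="replace")
```

In a Scrapy or BeautifulSoup pipeline the same two ideas appear as `DOWNLOAD_DELAY` and a rotating-middleware setting; the sketch just makes them explicit.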
More details:
• What type of data do you need to extract from the PDF? Text.
• What information should successful freelancers include in their application? Experience.
• What format do you want the extracted text in? Excel sheet.
I need an automation solution to streamline several tasks involving data extraction, document generation, and data transfer. Requirements: - Extract Quantities, Prices, and Descriptions from an MS Spreadsheet BOQ. - Populate Cost Breakdown Sheets with the extracted data. - Generate Word documents: Work Order Forms and Installation Guides. - Transfer all relevant data to GoldVision CRM and MS Business Central. - Save all files in a newly created SharePoint folder. Ideal Skills: - Proficiency in MS Excel and Word. - Experience with automation tools (e.g., Power Automate, Zapier). - Familiarity with GoldVision CRM and MS Business Central. - Strong organizational skills for managing file storage in SharePoint. Looking for someone with proven experience in similar automation tasks. Please p...
I am targeting government portals that publish public records but do not offer bulk-download options. I need an automated solution that can search by number on the page and download each file in its native PDF form. Here is what I am after: • A repeatable Python scraper capable of searching within a specific domain, following pagination, and collecting every accessible PDF link. • The script should save the PDFs locally in a clear folder structure (site / year / category). • A simple log or CSV report listing the URL, document title, and download status for every file processed. Acceptance criteria 1. All public records published in the specified date span are present as intact PDFs. 2. The log matches the count of files actually downloaded. Please make sure the code is well c...
We are looking for a detail-oriented freelancer to extract and enter structured data from scientific papers into a standardized Excel workbook. This is not basic data entry. You will be reading research papers, identifying relevant quantitative data, and recording it accurately in a multi-sheet template. Scope of Work Follow a defined workflow to: Review and screen scientific papers Extract quantitative data (e.g. duration, intensity, prevalence, efficacy) Enter data into a structured Excel file, including: Screening decisions Paper metadata Detailed parameter extraction tables Verify all values against source PDFs (numbers, units, timepoints) Clearly document decisions and flag uncertainties Deliverables Completed Excel workbook per module Accurate, consistent, and audit-ready data Cle...
I have a collection of Word documents that hold nothing but numbers—mostly tables of sales figures, inventory counts, and a few one-column lists. I need every single value lifted out of those digital files and placed accurately into a clean, well-structured Excel workbook. The job is straightforward but detail-sensitive: • Open each Word document and extract the numbers exactly as they appear. • Paste or type them into the matching rows and columns of the Excel template I will provide. • Keep original number formatting (decimals, commas, negatives) intact. • Double-check totals with simple SUM formulas so we can spot any discrepancies instantly. When you are done, I expect one Excel file that mirrors the order of the source documents, plus a brief note of any ir...
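The SUM cross-check idea above can be sketched as a small checksum helper. Treating parentheses as accounting-style negatives is an assumption about how the Word tables format negative numbers, not something the brief states:

```python
from decimal import Decimal

def to_decimal(cell: str) -> Decimal:
    """Read a display-formatted number — '1,234.56', '(45)', '-7' — as a Decimal.
    Parentheses are assumed to mean accounting-style negatives."""
    text = cell.strip()
    negative = text.startswith("(") and text.endswith(")")
    text = text.strip("()").replace(",", "")
    value = Decimal(text)
    return -value if negative else value

def column_total(cells) -> Decimal:
    """Checksum a column so the workbook's SUM formula can be cross-checked
    against the source document without rounding surprises."""
    return sum((to_decimal(c) for c in cells), Decimal("0"))
```

Using `Decimal` rather than `float` keeps the checksum exact, so any mismatch against the Excel SUM points to a transcription error rather than floating-point noise.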
My Excel workbook already contains a VBA macro that opens a PDF, extracts targeted numeric values from certain columns, aggregates them, and drops the results straight into specific cells. Functionally it works, yet it generates errors when it comes across different number formats. I need a fast, tidy rewrite (or smart port) that does the same three core steps—read, parse & aggregate, write—within roughly three hours of coding time. You can choose the approach that lets you move fastest: streamline the existing VBA, replace it entirely with a Python routine built around pdfplumber, or create a hybrid where Python performs the heavy lifting and VBA simply updates the sheet. I’m comfortable with any of those paths as long as the final workbook remains a one-click solu...
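One way to harden the parse step against the differing number formats mentioned is a heuristic that treats the last separator in the string as the decimal point. This is a sketch, not the author's macro; ambiguous inputs such as a bare "1.234" are resolved in favour of a decimal dot, which may need flipping for a European-only source:

```python
def parse_amount(raw: str) -> float:
    """Normalise European ('1.234,56') and US ('1,234.56') formats to float.

    Heuristic: whichever separator appears last is the decimal point;
    the other separator is assumed to group thousands.
    """
    s = raw.strip().replace(" ", "")
    last_dot, last_comma = s.rfind("."), s.rfind(",")
    if last_comma > last_dot:
        # Comma is the decimal separator: drop dots, swap comma for dot.
        s = s.replace(".", "").replace(",", ".")
    else:
        # Dot (or nothing) is the decimal separator: drop grouping commas.
        s = s.replace(",", "")
    return float(s)
```

Whether this lives in Python or is ported back into VBA, making the format rule explicit is what stops the "different number formats" errors the macro currently throws.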
I need someone to take my monthly purchase history from my Home Depot Pro account, currently in PDF format, and upload it into my business Excel spreadsheets. The data needs to be organized as follows: - Dollar amount - Credit card number - Date - Brief description Ideal skills and experience: - Proficient in Excel - Experience handling and entering data from PDFs - Attention to detail and data accuracy - Trustworthy with sensitive information
I need to compile a clear, reliable picture of how female freelancers are distributed, what skills they market, typical earnings ranges, and any notable growth patterns across major regions. The job starts with data extraction: pull publicly available information from leading freelancer platforms, professional networking sites, and other open repositories. After cleaning and de-duplicating the records, the next step is to analyse the dataset—producing descriptive statistics, trend plots, and concise written commentary that highlights regional hot-spots, in-demand skill sets, and any gaps or opportunity areas. Please deliver: • A CSV or Excel file containing the raw and cleaned datasets, accompanied by a short data-dictionary. • An analytic report (PDF or slide deck) that...
I have a continuous flow of text-based records sitting in Excel spreadsheets that need to be moved into our proprietary app and reorganised exactly to spec. The task is straightforward: copy each value from the sheet, paste it into the matching field inside the program, and carry out the simple “Restructure Data into our App” step that appears after every paste. I provide a clear, click-through guide—no prior technical experience is necessary as long as you are comfortable using a computer and can follow written instructions with care. What matters most is accuracy. Each batch is 5,000 entries, and I review them for consistency before releasing payment. You will earn $60 for every fully completed, error-free batch, and there is always another file waiting if your results...
I need a skilled developer to extract data from our field service software, Eworks, using their API. The extracted data will be used in Excel. Requirements: - Data Types: Service reports, customer information, and fieldworker schedules. - Structure: Data should be organized in Excel using pivot tables for aggregated views. - Update Frequency: Data needs to be refreshed and updated daily. Ideal Skills and Experience: - Experience working with APIs, particularly Eworks. - Proficiency in Excel, especially in creating and managing pivot tables. - Ability to set up automated data extraction and updates on a daily basis.
We have a web research project: starting from a list of university study programs, we have to locate each study program on the university's website and collect its contact information.
I have a set of blood-test reports that arrive as PDFs, and I need an accurate, repeatable way to extract only the test result section from each file. The patient demographics and doctor’s notes can be ignored; my focus is strictly on the numerical results, reference ranges, and units. Here’s what I’m looking for: • A lightweight script or small desktop tool (.NET Core + Tesseract, AWS Textract, or any engine you prefer) that ingests multi-page PDF blood panels and returns structured data—CSV or JSON is fine. • Clear mapping of the extracted fields to their respective test names as they appear in the PDF. • Reliability across differing lab layouts; most follow similar tables, but spacing and fonts vary. Acceptance criteria 1. Feed a sample...
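Once text is out of the PDF (via Tesseract, Textract, or similar), pulling only the result rows could look like this regex sketch. The line layout it assumes — "TestName value unit low - high" — is hypothetical and would need adjusting per lab template:

```python
import re

# Matches result lines such as "Hemoglobin 13.5 g/dL 12.0 - 16.0".
# Demographics and doctor's notes fail the pattern and are skipped.
ROW = re.compile(
    r"^(?P<test>[A-Za-z][A-Za-z ()/-]*?)\s+"      # test name
    r"(?P<value>-?\d+(?:\.\d+)?)\s+"              # numeric result
    r"(?P<unit>\S+)\s+"                           # unit, e.g. g/dL
    r"(?P<low>\d+(?:\.\d+)?)\s*-\s*(?P<high>\d+(?:\.\d+)?)\s*$"  # reference range
)

def parse_results(text: str) -> list[dict]:
    """Return structured rows for every line matching the assumed layout."""
    rows = []
    for line in text.splitlines():
        match = ROW.match(line.strip())
        if match:
            rows.append({k: v.strip() for k, v in match.groupdict().items()})
    return rows
```

Because lab layouts vary in spacing and fonts, a production version would likely keep one pattern per known layout and fall back to manual review when none match.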
We need a robust yet lightweight script that can automatically pull business details from a publicly accessible government website. The information to capture centres on registration numbers, business names, registration dates, addresses, etc. The workflow should: • Navigate every relevant section of the site (pagination, search filters, subsidiary pages). • Extract the required fields accurately. • Export clean, structured data to CSV and JSON. A Python solution leveraging requests/BeautifulSoup or Scrapy is preferred, but I’m open to other dependable stacks if they handle rate limits, retries, and potential CAPTCHAs gracefully. The script must be easy to rerun on demand, with clear instructions for environment setup and any dependencies...
I need support with custom ABAP development focused solely on building ALV Reports inside our S/4 HANA system. The functional specs are ready; what I’m missing is a clean, well-structured program that: • pulls data from both standard and custom tables, • follows best practice performance techniques (field symbols, hashed tables, proper buffering), • presents the output in an interactive ALV grid with sorting, filtering, subtotaling and user-specific layouts, and • comes with inline documentation and basic test data so I can run it immediately after transport. All coding must comply with SAP naming conventions, be fully transportable, and avoid obsolete statements. I work with Eclipse ADT and SE80, so please develop in a way that runs flawlessly in either e...
I need a skilled web scraper to gather phone contact information for sales inquiries. This data will be used primarily for leads generation. Key requirements: - Scrape phone contacts from specified sources - Ensure data accuracy and up-to-date information - Deliver data in a structured format (e.g., CSV, Excel) Ideal skills and experience: - Proficiency in web scraping tools and techniques - Experience with data validation and cleaning - Attention to detail and ability to meet deadlines Looking forward to your proposals!
I have a batch of PDFs that contain pure text—no complex tables or images— and I need every line transferred accurately into an Excel spreadsheet. The task is straightforward: open each PDF, copy the text exactly as it appears, and paste it into the corresponding rows or columns I’ll specify. Consistency in spacing, punctuation, and line breaks is important because the spreadsheet will feed directly into another system once complete. Deliverable: • One clean, well-formatted Excel file containing all copied text from the supplied PDFs. I’m ready to send the first set of files as soon as you confirm you can start, and I’ll be on hand to answer any layout or formatting questions along the way.
- Data Processing Associate
- Backend Executive
- Sports Data Updating
- Data Feed Analyst
- Account Executive (Tally+GST+Excel)
I have around 7,000 Aliexpress products that I need fully harvested for content-creation purposes. From each listing I only require the official product photos and any product videos—no customer review photos. Everything should come back to me in ready-to-use JPEG for images and MP4 for videos, preserved at the highest resolution Aliexpress serves. Please pull all files directly from the product gallery and video carousel, avoiding watermarks or compression wherever possible. A light file-name convention that ties each asset to its product URL or SKU will make downstream editing much easier for me. Deliverables: • Folder structure or archive segmented by product (one folder per listing). • Inside each folder: all JPEG images and any MP4 videos found. • A simple C...
I have several thousand real estate listings. I would like a spreadsheet that takes a link to each listing. You should then put the address into Google Maps, search for the nearest locations to the property (e.g., the nearest supermarket), and record how far away each is in the spreadsheet. I will provide a URL to the results (which are not in English, so you will need to translate them). From this URL there will be several thousand results. You must check the address for each, search for 5 nearby locations, and record the distances in Google Sheets. You should also paste 2-3 pieces of information from the listing description, for a total of around 8-10 columns. It is fine to do this by hand, or use an AI agent (so long as you verify the results). I would also be happy to pay someone to t...
I need help gathering the latest batch of survey responses for my existing business so they can be stored and analysed in Excel. You will access our online questionnaire backend, download each completed response, and ensure every answer is captured accurately in the master spreadsheet I provide. The focus is squarely on data collection—no content rewriting or analysis required—just precise transfer of each response into the correct row and column. A quick eye for detail is essential because the survey contains both multiple-choice items and a few short text fields that must remain exactly as submitted. Once all entries are in place, I will spot-check a sample for accuracy before signing off. If you are meticulous with spreadsheets and comfortable handling confidential cus...
A 15-page paper survey (well over 500 pages in total) needs to be transferred into the Excel template I have already set up. The template’s columns are clearly defined for each field, separating numeric answers from open-ended text responses, so the structure you will follow is fixed. You will: • Read each survey page in its original, non-English language and capture every response exactly as written—accents, special characters, punctuation and all. • Enter numeric values in the numeric columns and paste or type verbatim text answers into the designated text columns. • Keep the row order identical to the order of the paper forms so cross-checking remains simple. Quality matters more than speed. I will spot-check the first batch you return; an overall accur...
I am looking for a developer who can write a script that fills in a website form and submits it instantly. It is a normal details form used to book a playground, and the first person to submit gets the booking. Others are already automating their submissions, but we want to submit before them. I can pay whatever amount is suitable for this work.
I need every garden centre listed on the site pulled into a clean spreadsheet. Please visit each profile and capture its phone number, email address and full physical address exactly as shown on the site. Create one row per centre with separate columns for: • Garden-centre name • Phone • Email • Street address • Town/City • Postcode If an item is missing, leave the cell blank rather than guessing. Deliver the file as an Excel workbook so I can quickly filter and sort the data later. Accuracy and consistency matter more than speed, so double-check spellings and number formatting before handing it over.
I have two nearly identical Excel workbooks—each with approximately 300 rows and columns A through AE—that need to be merged into a single, clean master sheet, which is already established. The goal is to streamline our data management by ensuring every record is transferred accurately and all duplicates are eliminated using an automated duplicate-checking tool, rather than manual review. The finished file must reach me by Wednesday, 16 April 2026, midday AET.
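The automated duplicate check described above can be sketched as an order-preserving merge keyed on whichever columns define a duplicate. The key columns and row-dict shape here are placeholders; with pandas the same idea is `concat` followed by `drop_duplicates`:

```python
def merge_dedupe(rows_a, rows_b, key_cols):
    """Merge two lists of row dicts, dropping records whose key columns repeat.

    The first occurrence wins and input order is preserved. Keys are
    compared case-insensitively with surrounding whitespace stripped, so
    'Beta' and ' BETA ' count as the same record.
    """
    seen, merged = set(), []
    for row in rows_a + rows_b:
        key = tuple(str(row.get(col, "")).strip().lower() for col in key_cols)
        if key not in seen:
            seen.add(key)
            merged.append(row)
    return merged
```

Normalising the key before comparison is the part manual review tends to miss: two rows that differ only in casing or trailing spaces are almost always the same record.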
Please reply. I have a Propstream account. I need a clean, ready-to-use lead file built from two sources—Propstream and Apollo.io. From Propstream, the focus is commercial properties and apartment assets; every record must include full property details plus up-to-date owner contact information. Apollo.io will serve as a second source so the final list contains additional prospects pulled from that platform (matching or related to the same asset classes). All data must be scraped, deduped, and formatted in a single spreadsheet so I can sort, filter, and launch campaigns immediately. Use whatever stack you prefer—Python, Selenium, BeautifulSoup, Apify, or similar—but the workflow has to respect each site’s TOS and deliver reliable results. Deliverables • CSV/Excel...
I have a list of roughly 1500 URLs—each coming from the same automotive website—that together cover the top 100 makes, models, grades and variants sold in Australia. I need every data point the site makes available for each of those vehicles, from the obvious specs such as year, make, model and variant right through to driveway prices, engine type, transmission, drive configuration, warranty details, fuel-economy figures, in-car technology features, seating layouts and any other attributes exposed on the page. The end goal is a clean, analysis-ready Excel workbook that lets me run market-wide comparisons, so consistency is critical: headings must be standardised, units normalised and categorical values written the same way across the entire sheet. I am happy for you to use P...
I need a reliable script that can pull live pricing details for car rentals from both the rental companies’ own sites and the big aggregator platforms in one pass. The goal is to feed it pickup / drop-off locations, dates, and driver age, then receive a clean CSV or JSON that lists vehicle class, daily and total price, currency, taxes & fees, and the URL it was scraped from. The scraper has to: • Navigate each target site automatically, including date pickers and location selectors. • Rotate user-agents / proxies or apply any other anti-bot tactics necessary to stay undetected. • Capture and log errors so a failed request never silently drops a row. • Be easy for me to rerun on demand—command-line or small web UI is fine, as long as setup is s...
Job Title: Freelance Data Extractor / Virtual Assistant for Educational Materials Project Description: I am currently organizing extensive study materials and need a meticulous freelancer to help extract and map specific information from large documents. Your Responsibilities: Review Materials: You will be provided with comprehensive subject-wise PDFs and a corresponding "blueprint" document for each subject. Keyword Matching: Use the specific keywords listed in the blueprints to search through the PDFs. Data Extraction: Extract the exact topics, paragraphs, or sections from the PDFs that align with those keywords. Formatting: Compile the extracted information into a structured, easy-to-read format directly into my Notion workspace (or a standard Google Doc/Word file), organized ...
I have a spreadsheet (~5,000 product rows) where each row contains a full HTML eBay listing template. Each row includes: ID SKU Description Short description The Description field contains a large block of HTML (decorative listing template), but the actual product description is embedded inside it. Your job is to extract the correct text. 1. Extract the Correct Description In every row, the real product description is located inside this HTML block: <div class="desc-rd desc-text"> Requirements: Extract only the content inside this div Ignore all other HTML content in the row (menus, images, headers, shipping info, footer, etc.) Do not use the rest of the HTML outside this block 2. Clean the Extracted Text The content inside the div typically contains HTML such as...
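Extracting only the content of the `<div class="desc-rd desc-text">` block can be done with the standard-library HTML parser — a sketch that assumes the class attribute appears exactly as quoted in the brief (BeautifulSoup's `find` with `class_` would be an equivalent choice):

```python
from html.parser import HTMLParser

class DescExtractor(HTMLParser):
    """Collect text inside <div class="desc-rd desc-text">, ignoring the rest."""

    TARGET_CLASS = "desc-rd desc-text"  # exact-match assumption from the brief

    def __init__(self):
        super().__init__()
        self.depth = 0      # nesting depth while inside the target div
        self.chunks = []

    def handle_starttag(self, tag, attrs):
        if self.depth:
            if tag == "div":            # nested div inside the target block
                self.depth += 1
        elif tag == "div" and dict(attrs).get("class") == self.TARGET_CLASS:
            self.depth = 1              # entered the target block

    def handle_endtag(self, tag):
        if self.depth and tag == "div":
            self.depth -= 1

    def handle_data(self, data):
        if self.depth:                  # keep text only while inside the block
            self.chunks.append(data)

def extract_description(html: str) -> str:
    parser = DescExtractor()
    parser.feed(html)
    return " ".join("".join(parser.chunks).split())
```

Running this per Description cell and writing the result to a new column covers step 1; the whitespace normalisation in the last line is a head start on the cleaning in step 2.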
I need you to collect 1,000 product images, titles, and links from Temu. Just these three items. I think this should be pretty straightforward. If you do a good job, I’ll hire you on a long-term basis, as I’ll need someone to help me collect product information from Temu on an ongoing basis.
I’m at the start of a performance-focused data-migration effort that moves our current datasets into Snowflake, with Geneva and straight SQL powering the pipelines. To keep momentum, I need someone who can help with the hands-on analyst work while thinking like a Business Data Analyst. What I still have to finish centres on two areas: • Data extraction and transformation – mapping existing schemas, writing efficient SQL, and using Geneva/Snowflake utilities to cleanse and reshape data so downstream analytics run faster. • Data loading and validation – building repeatable load jobs into Snowflake, designing row-level and aggregate checks, and documenting reconciliation so stakeholders can trust the numbers. Acceptance criteria • Clean, reusable ETL scripts ...
Verified Review Platform – Product & Technical Specification
1. Product Overview
Platform Type: Verified Review Marketplace
Core Idea: Users submit reviews with proof of purchase, which are manually verified before contributing to brand ratings.
Key Differentiator:
• Reviews without proof have limited credibility
• Verified reviews contribute to ratings and user credibility
2. User Roles
Visitor (Not Logged In): browse brands; view reviews; cannot post reviews.
Registered User: submit reviews; upload proof of purchase; track review status (Pending / Approved / Rejected); build profile and verified status.
Admin: approve/reject reviews; verify proof authenticity; approve/reject brand submissions; manage users and reports.
3. Core Features
A. Authentication System
• Google OAuth login
• Phone number l...
I need help collecting text data from a number of documents or reports that are already available in Plain Text format. My goals are: • Identify the key pieces of information according to the criteria I provide (e.g. title, date, summary, important quotes). • Copy or extract that content into a structured spreadsheet or CSV file so it can easily be analysed further. • Make sure nothing is truncated, mistyped, or lost during the copying process. All documents are in a cloud folder I have already organised by sub-category; I will share access as soon as the collaboration begins. There are currently around 150 files in total, with the possibility of small additional batches later. The skills I...
Learn how to hire and collaborate with a freelance Typeform Specialist to create impactful forms for your business.
A complete guide to finding, hiring, and working with a skilled freelance typist for your typing projects.