
Parse Wikipedia Database Dump

$100-300 USD

In Progress
Posted over 16 years ago


Paid on delivery
Wikipedia Database Dump project:

1. Parse [login to view URL] files and extract only unique domain names. The domain should not be wikipedia.org.
2. The script should work with a file from [login to view URL].
3. Two parameters: filename -> [login to view URL], and database settings -> the SQL table to insert the URLs into (id, domain). All parameters should be at the beginning of the file so we can customize them ourselves.
4. The software should run on Linux and use a regex or another parsing technique.

Example domain extraction (see the sketch after this description):
[login to view URL] -> extract [login to view URL]
[login to view URL] -> extract [login to view URL]
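The brief hides the actual dump URL, example URLs, and database details behind "[login to view URL]", so the following is only a minimal sketch of the approach it describes: stream the dump file, pull hostnames out with a regex, skip wikipedia.org, and insert the unique domains into an (id, domain) table, with all parameters kept at the top of the file. The dump filename, the SQLite backend, and the table layout below are assumptions, not requirements from the post.

```python
#!/usr/bin/env python3
"""Sketch: extract unique external domains from a Wikipedia dump file
and insert them into a SQL table. Filenames, the SQLite backend, and
the table layout are assumptions; the brief's own URLs are hidden."""

import gzip
import re
import sqlite3  # stand-in for whatever database the client actually uses

# --- Parameters (kept at the top of the file, as the brief requests) ----
DUMP_FILE = "enwiki-latest-externallinks.sql.gz"  # assumed dump filename
DB_FILE = "domains.db"                            # assumed database file
TABLE_NAME = "domains"                            # assumed table: (id, domain)

# Matches the host part of http/https URLs.
URL_RE = re.compile(r"https?://([A-Za-z0-9.-]+)", re.IGNORECASE)


def extract_domains(path):
    """Yield unique, non-wikipedia.org domains found in the dump file."""
    seen = set()
    opener = gzip.open if path.endswith(".gz") else open
    with opener(path, "rt", encoding="utf-8", errors="replace") as fh:
        for line in fh:
            for host in URL_RE.findall(line):
                domain = host.lower().rstrip(".")
                if domain.endswith("wikipedia.org"):
                    continue  # exclude wikipedia.org, per the brief
                if domain not in seen:
                    seen.add(domain)
                    yield domain


def main():
    conn = sqlite3.connect(DB_FILE)
    conn.execute(
        f"CREATE TABLE IF NOT EXISTS {TABLE_NAME} "
        "(id INTEGER PRIMARY KEY AUTOINCREMENT, domain TEXT UNIQUE)"
    )
    with conn:
        conn.executemany(
            f"INSERT OR IGNORE INTO {TABLE_NAME} (domain) VALUES (?)",
            ((d,) for d in extract_domains(DUMP_FILE)),
        )
    conn.close()


if __name__ == "__main__":
    main()
```

Swapping the sqlite3 calls for a MySQL client (or whatever the client's "database settings" point at) would keep the same structure; only the connection block and the INSERT statement would change.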
Project ID: 213474

About the project

3 proposals
Remote project
Active 16 yrs ago

Awarded to:
Please check PMB.
$100 USD in 5 days
5.0 (165 reviews)
6.3
3 freelancers are bidding an average of $133 USD for this job
Will be done quickly and according to the requirements. Thanks!
$200 USD in 2 days
4.9 (399 reviews)
7.5
Hi, I have previous experience with this and can write the program you need. Regards, Stefan
$100 USD in 5 days
4.7 (2 reviews)
2.0

About the client

Mahe, Bulgaria
5.0
1016
Payment method verified
Member since Feb 9, 2007
