I need a web scraper written for the following URL:
[login to view URL]
All information needed is available on the main page. The number of rows will vary.
Only scrape data that includes the word "Pending" in the "LOAD DETAILS" column.
The output should be a pipe (|) delimited file with the following column mappings:
origin_city --> data located in the "PICKUP CITY" column
origin_state --> abbreviation located in the "PICKUP STATE" column
ship_date --> the date from the "PICKUP DATE" column changed to the YYYY-MM-DD format
destination_city --> data located in the "DEST CITY" column
destination_state --> abbreviation located in the "DEST STATE" column
receive_date --> leave blank
trailer_type --> data located in the "TRAILER TYPE" column
load_size --> put text "Full" in the column
weight --> data in the "LOAD WEIGHT" column, if 0 or blank then leave blank
length --> leave blank
width --> leave blank
height --> leave blank
trip_miles --> leave blank
pay_rate --> leave blank
contact_phone --> leave blank
contact_name --> leave blank
tarp_required --> leave blank
comment --> data located in the "DIMENSIONS" column
load_number --> data located in the "LOAD #" column
commodity --> data located in the "COMMODITY" column
The first line of the output should contain all of the column headers.
Any field that contains no data should be left blank.
Please do not use words like "null" or "blank" in empty columns.
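The field rules above can be sketched in Perl. This is only an illustrative fragment, not the deliverable: the header list follows the column order given above, the date helper assumes the site shows dates as M/D/YYYY (unverified), and the weight helper applies the "0 or blank becomes blank" rule.

```perl
use Modern::Perl;

# Output columns, in the order specified above.
my @headers = qw(
    origin_city origin_state ship_date destination_city destination_state
    receive_date trailer_type load_size weight length width height
    trip_miles pay_rate contact_phone contact_name tarp_required
    comment load_number commodity
);

# Reformat a date such as "3/7/2024" to "2024-03-07".
# Assumes M/D/YYYY input; returns an empty field if the date does not parse.
sub to_iso_date {
    my ($d) = @_;
    my ($m, $day, $y) = ($d // '') =~ m{^(\d{1,2})/(\d{1,2})/(\d{4})$}
        or return '';
    return sprintf '%04d-%02d-%02d', $y, $m, $day;
}

# Weight rule: 0, blank, or non-numeric becomes an empty field.
sub clean_weight {
    my ($w) = @_;
    return '' if !defined $w || $w !~ /\d/ || $w == 0;
    return $w;
}

# The first line of the output is the pipe-delimited header row.
say join '|', @headers;
```

Blank fields then fall out naturally: `join '|'` over a row array containing empty strings produces adjacent delimiters with nothing between them.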
Below is a sample output of the first 5 columns using sample data:
The deliverable will be a Perl .pl file that must run on
Ubuntu Linux and must use Modern::Perl. The Perl .pl file
should be called '[login to view URL]' and the output file should be
called '[login to view URL]'
It will be scheduled in cron to run unattended every 15 minutes.
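For reference, the 15-minute schedule might look like the crontab entry below. The path and script name are placeholders (the real filenames are given above), and the `cd` keeps the output file in the script's own directory:

```
*/15 * * * * cd /path/to/script && /usr/bin/perl scraper.pl
```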
Please specify what language/OS/modules you plan to use.
Also, please include the word "raccoon" in your bid so I know that
you read this description.