I need a web scraper written for the following URL:
[url removed, login to view]
Login is required. Login credentials for Fox Lumber available loads:
username: steve@[url removed, login to view]
All information needed is available on the main page. The number of rows will vary.
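Since all of the data lives in a single table on the main page, the core of the scraper is just collecting that table into rows keyed by its header names. As a rough sketch of that step (stdlib only; the real script would first handle the login form and session cookies, which are omitted here):

```python
from html.parser import HTMLParser

class LoadTableParser(HTMLParser):
    """Collect the first HTML table on the page into a list of dicts
    keyed by the header row.  A minimal stdlib sketch; the actual page
    structure and column names are assumptions from this posting."""
    def __init__(self):
        super().__init__()
        self.headers = []   # header-row cell texts, e.g. "Loc City"
        self.rows = []      # one dict per data row
        self._row = []
        self._cell = None   # None means "not inside a td/th"

    def handle_starttag(self, tag, attrs):
        if tag == "tr":
            self._row = []
        elif tag in ("td", "th"):
            self._cell = ""

    def handle_data(self, data):
        if self._cell is not None:
            self._cell += data

    def handle_endtag(self, tag):
        if tag in ("td", "th") and self._cell is not None:
            self._row.append(self._cell.strip())
            self._cell = None
        elif tag == "tr" and self._row:
            # First completed row becomes the header; later rows become data.
            if not self.headers:
                self.headers = self._row
            else:
                self.rows.append(dict(zip(self.headers, self._row)))
```

Because the row count varies, the parser simply appends whatever rows it finds; no fixed count is assumed.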
The output should be a pipe (|) delimited file with the following column mappings:
origin_city --> data located in the "Loc City" column
origin_state --> data located in the "Loc State" column
ship_date --> data located in the "Ready" column changed to the YYYY-MM-DD format
destination_city --> data located in the "Dest City" column
destination_state --> data located in the "Dest State" column
receive_date --> leave blank
trailer_type --> put the text "Flatbed" for all
load_size --> put the text "Full"; if the "Load Comments" column contains the word "Partial", put the text "Partial" instead
weight --> leave blank
length --> leave blank
width --> leave blank
height --> leave blank
trip_miles --> leave blank
pay_rate --> leave blank
contact_phone --> put the text "406-363-5140"
contact_name --> data located in the "Dispatcher" column
tarp_required --> leave blank
comment --> data located in the "Load Comments" column. If the "Load Comments" column contains the word "van", append the text "van" to the trailer_type data; if it contains the word "maxi", append the text "maxi" to the trailer_type data
load_number --> leave blank
commodity --> leave blank
The first line of the output should contain all of the column headers.
Any field that contains no data should be left blank.
Please do not use words like "null" or "blank" in blank columns.
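The mapping rules above can be sketched as a pure function from one scraped row to the 20 output fields. This is an illustrative Python sketch, not the Perl deliverable itself; the source column keys and the MM/DD/YYYY input date format are assumptions based on this posting:

```python
from datetime import datetime

HEADERS = [
    "origin_city", "origin_state", "ship_date", "destination_city",
    "destination_state", "receive_date", "trailer_type", "load_size",
    "weight", "length", "width", "height", "trip_miles", "pay_rate",
    "contact_phone", "contact_name", "tarp_required", "comment",
    "load_number", "commodity",
]

def map_row(row):
    """Map one scraped row (a dict keyed by source column name) to the
    20 output fields, in HEADERS order."""
    comments = row.get("Load Comments", "")
    lower = comments.lower()

    # "Flatbed" for all, with "van"/"maxi" appended per the comment rules.
    trailer_type = "Flatbed"
    if "van" in lower:
        trailer_type += " van"
    if "maxi" in lower:
        trailer_type += " maxi"

    load_size = "Partial" if "partial" in lower else "Full"

    # "Ready" dates are assumed to arrive as MM/DD/YYYY; adjust if not.
    ship_date = datetime.strptime(row["Ready"], "%m/%d/%Y").strftime("%Y-%m-%d")

    return [
        row.get("Loc City", ""), row.get("Loc State", ""), ship_date,
        row.get("Dest City", ""), row.get("Dest State", ""),
        "",                      # receive_date
        trailer_type, load_size,
        "", "", "", "", "", "",  # weight .. pay_rate, all blank
        "406-363-5140", row.get("Dispatcher", ""),
        "",                      # tarp_required
        comments,
        "", "",                  # load_number, commodity
    ]

def to_pipe_file(rows):
    """Render the header line plus one pipe-delimited line per row."""
    lines = ["|".join(HEADERS)]
    lines += ["|".join(map_row(r)) for r in rows]
    return "\n".join(lines) + "\n"
```

Blank fields are emitted as genuinely empty strings between the pipes, never as the word "null" or "blank".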
Below is a sample output of the first 5 columns using sample data:
The deliverable will be a Perl .pl file that must run on
Ubuntu Linux and must use Modern::Perl. The Perl .pl file
should be called '[url removed, login to view]' and the output file should be
called '[url removed, login to view]'
It will be scheduled in cron to run unattended every 15 minutes.
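For the 15-minute schedule, a crontab entry along these lines would do (the paths and script name are placeholders, since the actual filenames were removed from this posting):

```
# Run the scraper every 15 minutes, unattended; adjust paths to the real .pl file.
*/15 * * * * /usr/bin/perl /home/user/scraper.pl >/dev/null 2>&1
```

Redirecting output keeps cron from mailing the user on every run; log to a file instead if run history is wanted.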
Please specify which language/OS/modules you plan to use.
Also, please include the word "raccoon" in your bid so I know that
you read this description.
raccoon Hello! I can make you a Python (not Perl) script to get the data from foxlumber and put it into a pipe-delimited file. The script will use the Python Selenium webdriver module.