Delete the unused fields from the database. Then fix the scraping scripts so that they work with the new database structure (e.g. looking up the correct unique keys) and verify that they still run correctly.
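As an illustration of what "looking up the correct unique keys" might involve after the schema change, a scraper could upsert each record on the new unique column instead of inserting blindly. This is a minimal sketch; the table name `buildings` and the columns used here are assumptions, not part of the actual schema:

```python
import sqlite3

def upsert_record(conn, record):
    """Insert a scraped record, or update it if the unique key already exists.

    Assumes a table like:
        CREATE TABLE buildings (building_id TEXT PRIMARY KEY,
                                address TEXT, status TEXT)
    """
    conn.execute(
        """INSERT INTO buildings (building_id, address, status)
           VALUES (:building_id, :address, :status)
           ON CONFLICT(building_id) DO UPDATE SET
               address = excluded.address,
               status  = excluded.status""",
        record,
    )
```

Re-running a scrape then refreshes existing rows instead of creating duplicates, which is typically the failure mode when scripts key on the wrong column after a schema change.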
Create a form for adding addresses, then create a cron job that checks whether each address is already in the database. If it is, no further action is required. If it is not, retrieve the information for that address. To retrieve it, take the code from the existing script that determines whether a property is type X or type Y, combine it with the script that downloads building information and a third script that downloads the other appropriate information, and adapt the combined script to the new database structure. The form should also have a “renew” button so the user can retrieve updated information on demand. Perform a simple calculation to determine the eligibility percentage for each search result item, and from those, the eligibility percentage for the entire search result. Next to the overall search result there should be a “+” box that expands to show the per-item results. Finally, email a PDF with the high-level search results and the eligibility percentages to my email address.
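The cron-driven duplicate check and the eligibility roll-up described above could be sketched as follows. This is only a sketch under assumptions: the `addresses` table, its columns, and the eligibility criteria are placeholders, since the brief does not specify the actual eligibility rule:

```python
import sqlite3

def address_needs_fetch(conn, address):
    """True if the address is not yet in the database (so the cron job must fetch it)."""
    row = conn.execute(
        "SELECT 1 FROM addresses WHERE address = ?", (address,)
    ).fetchone()
    return row is None

def item_eligibility(item):
    """Placeholder rule: percentage of (assumed) criteria a single item meets."""
    criteria = [item.get("has_permit"), item.get("in_zone"), item.get("size_ok")]
    return 100.0 * sum(bool(c) for c in criteria) / len(criteria)

def overall_eligibility(items):
    """Eligibility for the entire search result: the average of the per-item percentages."""
    if not items:
        return 0.0
    return sum(item_eligibility(i) for i in items) / len(items)
```

The per-item values would feed the expandable “+” view, while `overall_eligibility` supplies the headline number for the emailed PDF.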
Recreate a form from an external website so the user can enter the same information there. After the user enters this information and clicks submit, the search results from that site should be scraped and stored in the database. Simultaneously, data from two other sites and two APIs should be retrieved and a small calculation performed; those results should be used to generate a PDF. A cron job should then be created that runs another script which downloads further information. Once those results are calculated, a few more calculations are made; the final results should be accessible on a website, and another PDF should be generated.
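The simultaneous retrieval from the two sites and two APIs could be sketched with a thread pool. The source names, the mock fetcher, and the combining calculation are all placeholders, since the brief does not name the sites, the APIs, or the calculation:

```python
from concurrent.futures import ThreadPoolExecutor

def fetch_source(name):
    """Placeholder for scraping one site or calling one API; returns mock data."""
    return {"source": name, "value": len(name)}

def gather_sources(names):
    """Retrieve all sources in parallel, since the brief requires simultaneous retrieval."""
    with ThreadPoolExecutor(max_workers=len(names)) as pool:
        return list(pool.map(fetch_source, names))

def combine(results):
    """Stand-in for the 'small calculation': aggregate the values from every source."""
    return sum(r["value"] for r in results)
```

The combined result would then be handed to whatever PDF-generation step the project settles on, and the same gather/combine shape can be reused by the cron-triggered follow-up script.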