Your task is to demonstrate your software development skills by scraping jobs from a portal and providing them as an Excel sheet. You will also persist the results into a local database such as H2 or Derby (if you are not familiar with these databases, MySQL or PostgreSQL would also be fine).
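As a minimal sketch of the export step: plain CSV opens directly in Excel and needs no extra library (a true .xlsx file would require something like Apache POI, which per the library rules below would need prior clearance). All names here are illustrative assumptions, not part of the brief.

```java
import java.io.PrintWriter;
import java.io.StringWriter;
import java.util.List;

/** Hypothetical sketch: renders scraped job rows as CSV text, which Excel opens directly. */
public final class CsvExport {

    /** Escapes one field per RFC 4180: quote it if it contains a comma, quote, or newline. */
    static String escape(String field) {
        if (field.contains(",") || field.contains("\"") || field.contains("\n")) {
            return "\"" + field.replace("\"", "\"\"") + "\"";
        }
        return field;
    }

    /** Renders rows (first row = header) as CSV; each row becomes one line. */
    static String toCsv(List<List<String>> rows) {
        StringWriter sw = new StringWriter();
        PrintWriter out = new PrintWriter(sw);
        for (List<String> row : rows) {
            out.println(String.join(",", row.stream().map(CsvExport::escape).toList()));
        }
        return sw.toString();
    }
}
```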
You need to implement the following:
- support for Chrome desktop
- support for Firefox desktop
- run a job search
- navigate through the search result pages
- fetch the data into a model-class structure
You will get support from our team.
The input to your API functions will be a profile or a project search page.
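The model-class structure mentioned above could start as a simple POJO like this. The class and field names are illustrative assumptions, not requirements from the brief:

```java
import java.util.Objects;

/** Illustrative model for one scraped job posting; field names are assumptions. */
public final class JobPosting {
    private final String title;
    private final String company;
    private final String location;
    private final String url;

    public JobPosting(String title, String company, String location, String url) {
        this.title = Objects.requireNonNull(title, "title");
        this.company = Objects.requireNonNull(company, "company");
        this.location = location; // may be absent on some portals
        this.url = url;
    }

    public String getTitle() { return title; }
    public String getCompany() { return company; }
    public String getLocation() { return location; }
    public String getUrl() { return url; }

    @Override
    public String toString() {
        return title + " @ " + company + " (" + location + ")";
    }
}
```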
What is NOT needed:
- a UI (not required; implement a JUnit test to call your functions instead)
- a service architecture (such as Spring or JEE)
What are our requirements?
- the solution works in normal and in headless mode
- Liquibase for database schema management
- your code passes Checkstyle, PMD and FindBugs (we will share a Git repository with Eclipse settings with you)
- create a model class representing the input of your function
- create a service class implementing the logic
- create a unit test that tests the service class
- we do NOT need a UI; we only need the model + service method to access the logic via API or via JUnit
- if you need libraries besides Selenium or JGraphT, Apache Commons is fine; other libraries NEED prior clearance
- the runtime is a plain JRE (no Java EE or Spring container)
- delivery in our Git repository
- if you do a good job supporting our team, we are open to integrating you into regular work
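One way the service class and its unit test could fit together, as a hedged sketch: the browser is hidden behind a small interface so the pagination logic is testable with JUnit and no real browser. In production that interface would wrap a Selenium WebDriver (ChromeDriver or FirefoxDriver, optionally started headless); the interface, record, and method names below are assumptions for illustration.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Optional;

/** One page of search results plus the link to the next page, if any. */
record ResultPage(List<String> jobTitles, Optional<String> nextPageUrl) { }

/**
 * Abstraction over the browser. A real implementation would wrap a Selenium
 * WebDriver (Chrome or Firefox, optionally headless); a JUnit test can supply
 * a fake instead, so the service logic runs without any browser.
 */
interface PageFetcher {
    ResultPage fetch(String url);
}

/** Service sketch: follows pagination links and collects all job titles. */
final class JobSearchService {
    private final PageFetcher fetcher;

    JobSearchService(PageFetcher fetcher) {
        this.fetcher = fetcher;
    }

    /** Runs a job search from the given start page through all result pages. */
    List<String> search(String startUrl) {
        List<String> titles = new ArrayList<>();
        Optional<String> next = Optional.of(startUrl);
        while (next.isPresent()) {
            ResultPage page = fetcher.fetch(next.get());
            titles.addAll(page.jobTitles());
            next = page.nextPageUrl();
        }
        return titles;
    }
}
```

Keeping Selenium behind the interface means the required unit test stays fast and deterministic, while the production fetcher handles driver setup (including the headless option) in one place.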
What is our budget?
We do not disclose our budget or planned hourly rate. Offer us your best bid.
Do not wait for our availability here. Just answer, ask, or reply.