Set up Airflow on a Google Kubernetes Engine (GKE) cluster:
You need to have Airflow deployment experience and Google Kubernetes Engine experience.
- Airflow 1.10.2 or newer
- MySQL 5.7
- Executor: KubernetesExecutor
- 1 cluster
- 1 node (to be confirmed)
I want to use pure Airflow and open-source libraries. I want the option to move the whole project to another provider later, so I prefer not to use GKEPodOperator and similar Google-specific components unless there is no alternative.
Prepare the necessary configuration files (YAML, docker-compose, Helm, ..., Airflow config file) plus full instructions to deploy Airflow on GKE (assume 2 options: the cluster already exists, or a new cluster must be created, so the instructions should cover cluster creation and setup on GKE) with a MySQL DB (Dockerfile will be provided by me) and the KubernetesExecutor. The deployment needs persistence, logging, an Ingress controller/LoadBalancer, and port exposure both inside the cluster and to the outside. I should be able to access the Airflow webserver to manually run DAGs.
- 1 pod will contain the Airflow webserver, the Airflow scheduler, the MySQL database (with persistence, StatefulSet), and a MongoDB container (Dockerfile will be provided by me); or use 2 pods (one for Airflow and one for MySQL and MongoDB)
- 1 pod for every task instance. Worker pods are created dynamically and disappear when the task/DAG is finished
- DAG sync mode: PersistentVolume
- 3 PersistentVolumes: 1 for logs, 1 for DAGs and plugins, 1 for the databases (MySQL and MongoDB); or 1 PersistentVolume with 3 subfolders (one for each)
- Ability to trigger DAG runs with Airflow REST API
- Ability to send variables (in JSON/dict format) and data (as pandas DataFrames) when calling a DAG, and between tasks inside a DAG
- Ability to access the Airflow webserver through the internet (authentication required)
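One wrinkle in the DataFrame requirement above: XCom values and DAG-run conf must be JSON-serializable, so a pandas DataFrame cannot be handed over directly. A common approach (one option among several; the helper names `df_to_payload` / `payload_to_df` are illustrative, not part of any API) is to serialize it to a JSON string:

```python
import io

import pandas as pd


def df_to_payload(df: pd.DataFrame) -> str:
    """Serialize a DataFrame to a JSON string safe for XCom / dag_run conf.

    orient="split" keeps columns, index, and data separate, so column
    order survives the round trip. Returns the JSON string.
    """
    return df.to_json(orient="split")


def payload_to_df(payload: str) -> pd.DataFrame:
    """Rebuild a DataFrame from the JSON string produced by df_to_payload."""
    return pd.read_json(io.StringIO(payload), orient="split")


if __name__ == "__main__":
    df = pd.DataFrame({"user": ["a", "b"], "score": [1, 2]})
    restored = payload_to_df(df_to_payload(df))
    print(restored.equals(df))  # True when the round trip is lossless
```

Large DataFrames should not travel through XCom at all (the MySQL metadata DB backs it); for those, writing to the shared PersistentVolume and passing the file path is the usual alternative.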
Create a sample DAG with 3 tasks (1 PythonOperator, 1 BashOperator, and 1 KubernetesPodOperator) to demo the process (it should use XCom, and include sample code in a plugin to demo how to import external Python scripts into a DAG).
Create instructions and code to test the DAG (trigger it) using the Airflow REST API, and from Python code running in a container deployed on GKE (will be used in Part C).
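As one way to do this, a sketch of triggering a DAG through Airflow's stable REST API (available from Airflow 2.0; on 1.10 the experimental endpoint /api/experimental/dags/&lt;dag_id&gt;/dag_runs would be used instead). The base URL and credentials are placeholders, and the sketch assumes basic-auth is enabled via the webserver's [api] auth_backend setting:

```python
import requests


def build_dag_run_payload(conf: dict) -> dict:
    """Build the JSON body for POST /api/v1/dags/{dag_id}/dagRuns.

    conf: arbitrary JSON-serializable parameters for the DAG run.
    Returns the request body as a dict.
    """
    return {"conf": conf}


def trigger_dag(base_url: str, dag_id: str, conf: dict,
                user: str, password: str) -> dict:
    """Trigger a DAG run via the stable REST API.

    base_url: e.g. "http://<ingress-ip>" (placeholder).
    Returns the API's JSON response describing the new DAG run;
    raises requests.HTTPError on a non-2xx status.
    """
    url = f"{base_url}/api/v1/dags/{dag_id}/dagRuns"
    resp = requests.post(
        url,
        json=build_dag_run_payload(conf),
        auth=(user, password),
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()


# Example call, once a webserver is reachable:
# trigger_dag("http://localhost:8080", "sample_dag",
#             {"source": "api-demo"}, "admin", "admin")
```

The same function works unchanged from a container on GKE, pointed at the webserver's in-cluster Service name instead of the external address.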
Create a new DB inside MySQL ("myUsers") to hold user data and user management: need 3 types of users: basic, premium, and admin. Create one user of each type. SQL schema to be provided by me.
Create and deploy (StatefulSet, port exposure) a sample Flask server "[login to view URL]" in a Docker container with routes:
- "login" to verify user against DB myUsers
- "addUser" and "deleteUser" available only for admin
- "saveFile" to store a file in "toProcess/[username]" folder under the PersistentVolume (using FTP)
- "trigger_dag" (variable "dagName") that will trigger a specific dag either by using Airflow Rest API or using python/bash scripts (both should be supported, the one to use will be a setting/variable inside "[login to view URL]" )
No HTML files are to be returned from this Flask server, only data (in JSON format) and a return code.
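By way of illustration, a minimal sketch of the JSON-only server shape. The user store is stubbed with an in-memory dict instead of the real myUsers database, and trigger_dag is reduced to echoing the configured mode; route names follow the list above, everything else is an assumption:

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

# Placeholder user store; the real app would query the myUsers MySQL DB.
USERS = {"alice": {"password": "secret", "role": "admin"}}

# Setting that selects how trigger_dag fires the DAG:
# "rest_api" or "script" (names assumed).
TRIGGER_MODE = "rest_api"


@app.route("/login", methods=["POST"])
def login():
    """Check credentials from a JSON body; return the user's role on success."""
    body = request.get_json(force=True)
    user = USERS.get(body.get("username"))
    if user is None or user["password"] != body.get("password"):
        return jsonify({"error": "invalid credentials"}), 401
    return jsonify({"role": user["role"]}), 200


@app.route("/trigger_dag", methods=["POST"])
def trigger_dag():
    """Validate the dagName variable and acknowledge the trigger request."""
    body = request.get_json(force=True)
    dag_name = body.get("dagName")
    if not dag_name:
        return jsonify({"error": "dagName is required"}), 400
    # Here the real app would either POST to the Airflow REST API or run
    # a python/bash script, depending on TRIGGER_MODE.
    return jsonify({"dag": dag_name, "mode": TRIGGER_MODE}), 202
```

Every route returns a JSON body plus an HTTP status code and no HTML, matching the constraint above; the addUser/deleteUser/saveFile routes would follow the same pattern with an admin-role check on the session.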
Create simple Python code ([login to view URL]) that uses Python requests to test all functions of airflowFlaskApp. It should be runnable from anywhere (my PC, for example) and should test saveFile with a CSV or Excel file.
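As a sketch of the requests side of that test script (the /saveFile path, the form-field names, and the base URL are assumptions until the Flask app's contract is fixed), a CSV upload can be built as a multipart request:

```python
import requests


def build_save_file_request(base_url: str, username: str, filename: str,
                            content: bytes) -> requests.PreparedRequest:
    """Prepare (without sending) a multipart POST to the saveFile route.

    base_url:  server address, e.g. "http://localhost:5000" (placeholder).
    username:  maps to the "toProcess/[username]" folder on the server.
    filename:  name of the uploaded CSV/Excel file.
    content:   raw bytes of the file.
    Returns a PreparedRequest that can be inspected, or sent later with
    requests.Session().send(...).
    """
    req = requests.Request(
        "POST",
        f"{base_url}/saveFile",
        data={"username": username},
        files={"file": (filename, content, "text/csv")},
    )
    return req.prepare()


if __name__ == "__main__":
    prepared = build_save_file_request(
        "http://localhost:5000", "alice", "data.csv", b"a,b\n1,2\n")
    print(prepared.method, prepared.url)
    # To actually send it: requests.Session().send(prepared)
```

Separating request construction from sending keeps the script testable offline; the real test run would send the prepared request and assert on the JSON body and status code returned.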
Other details and info required will be discussed as needed
All code should be documented (functions should have comments explaining all variables and return values, as well as the main part of the code).
Python 3.6+ should be used
All Python code should have a [login to view URL] generated using pipreqs.
Instructions should include how to update code without stopping the server (on GKE)
All access to the internet (outside the cluster) should be secure: create the necessary secure connections and provide setup instructions (certificates needed).
Other skills required: Airflow, Flask, Docker, Kubernetes, Google Kubernetes Engine, MySQL