Big Data Job Transformations
₹400-750 INR / hour
The project involves writing Apache Spark code for given business logic. The candidate needs to understand the data, the business logic, and the mappings, transformations, and architecture involved.
The project requires experience working with cloud services and remote servers. Familiarity with AWS services such as Glue and Athena is a strong advantage.
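As a sketch of the kind of transformation work described, here is a minimal PySpark job. The column names, the status-mapping rule, and the function names are illustrative assumptions, not part of the actual project spec; the pure business-logic function is kept separate from the Spark wiring so it can be unit-tested without a cluster:

```python
def map_order_status(raw_status):
    """Pure business-logic mapping, testable without a Spark cluster.

    The mapping table here is a hypothetical example of the kind of
    code-to-label translation a transformation spec might require.
    """
    mapping = {"N": "NEW", "P": "PROCESSING", "S": "SHIPPED", "D": "DELIVERED"}
    return mapping.get(raw_status, "UNKNOWN")


def run_job():
    """Wire the mapping into a Spark DataFrame transformation.

    Requires pyspark; in practice this body would run inside an AWS Glue
    job script or be launched with spark-submit.
    """
    from pyspark.sql import SparkSession, functions as F
    from pyspark.sql.types import StringType

    spark = SparkSession.builder.appName("order-transform").getOrCreate()

    # Toy input standing in for data read from S3 / the Glue Data Catalog.
    df = spark.createDataFrame(
        [("1001", "N"), ("1002", "S"), ("1003", "X")],
        ["order_id", "raw_status"],
    )

    # Apply the business-logic mapping as a UDF to derive a new column.
    status_udf = F.udf(map_order_status, StringType())
    out = df.withColumn("status", status_udf(F.col("raw_status")))
    out.show()
    spark.stop()
```

In a real Glue job, `run_job()` would read from and write to cataloged tables instead of an inline DataFrame; keeping the mapping logic in a plain function makes the business rules reviewable and testable independently of the cluster.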
Project ID: #29033446
About the project
11 freelancers are bidding on average ₹638/hour for this job
Hi there, I have been using Spark and Amazon SageMaker at enterprise level for over 2 years. While working on a 'demand forecasting' problem for a Canadian supermarket chain at our organization, I used Spark extensiv…
Hi, I am a professional AWS/Python big data developer with 7 years of experience. I have solid hands-on knowledge of PySpark and AWS services such as Glue jobs, crawlers, Redshift, Athena, Lambda, etc. I woul…
Hi, I am an experienced data engineer with a solid background in Spark. I have done many big data projects with Spark. Let's have a call to discuss the project details. Regards
AWS Glue expert here, and an AWS Glue PySpark programmer. This can be done. I am also an AWS certified developer.
AWS Certified Data Analytics Specialty, AWS Certified Solutions Architect - Associate, and Google Cloud Certified Associate Cloud Engineer, with hands-on experience on a wide range of workloads from POCs to proje…