I need help with my Hadoop exercises
€8-30 EUR
Paid on delivery
Hi!
I have to complete three separate Hadoop (parallel distributed processing) exercises. It's not much work, so don't expect a high payment.
The exercises consist of:
- Hadoop/HDFS
- Pig
- Spark
Project ID: #31304703
About the project
9 freelancers are bidding on average €26 for this job
Hi, I'm an experienced big data developer. I have worked on the setup and management of distributed environments and have used them to solve big data problems. I can help you with this project. Kindly connect over chat.
Hello, Sir! I have plenty of experience with Hadoop MapReduce programming. Hadoop is already installed and configured on my local machine, and I have a completed MapReduce project that uses in-mapper combining.
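For context, "in-mapper combining" is a standard MapReduce optimization in which the mapper aggregates values in memory and emits one pair per distinct key instead of one pair per input record, reducing the data shuffled to reducers. A minimal word-count sketch in plain Python (the function name and sample input are illustrative, not taken from the bid):

```python
from collections import defaultdict

def mapper_with_inmapper_combining(lines):
    """Word-count mapper that aggregates counts locally before emitting.

    Instead of emitting (word, 1) for every token, the mapper buffers
    counts in a dictionary and emits one (word, count) pair per distinct
    word, which shrinks the intermediate data sent to the shuffle phase.
    """
    counts = defaultdict(int)
    for line in lines:
        for word in line.split():
            counts[word] += 1
    # Emit the aggregated pairs once, at the end of the map task
    return sorted(counts.items())

pairs = mapper_with_inmapper_combining(["to be or", "not to be"])
# pairs == [("be", 2), ("not", 1), ("or", 1), ("to", 2)]
```

The trade-off is extra memory in the mapper for the in-memory dictionary; real implementations flush it periodically if it grows too large.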
Hi, I am working as a big data Hadoop admin and will do your project work. I have 5.3 years of experience in this industry; we can discuss the project details further in chat.
Hi, I am working at an MNC as a Data Engineer, currently on big data projects using PySpark, Hadoop frameworks, and Python libraries. I have more than 4 years of production experience in the big data field and cer…
Hi, I have expertise in big data (Hadoop, Hive, Pig, Spark) with 8+ years of experience. Please provide more details on the requirements so I can assess them and, if possible, deliver today. Thanks.
Greetings, hope you are doing well! Kindly share more details on the project. I have 8+ years of experience working with big data technologies. Thanks.
I can get the job done. I have the same skill set and 9+ years of experience in this industry. Let's connect.