Hadoop helps solve problems involving huge volumes of data in many business applications. Thanks to Freelancer.com, Hadoop experts can now find plenty of related work online to earn some extra cash.

Hadoop is an open-source software framework distributed under the Apache License, and one of the most popular such frameworks today. It works by splitting very large data sets, up to petabytes in size, across clusters of machines so that programs can process them in parallel. Hadoop jobs tackle complicated big-data problems where the data may be structured, unstructured, or a combination of both. These jobs require strong analytics skills, particularly clustering and targeting, and they arise in many fields beyond computing.

If you are a Hadoop expert seeking work online, then Freelancer.com is right for you. It is a job-posting website that matches freelancers with jobs in their particular professions. The site also offers a wide range of Hadoop jobs, and as with other skills, these come with several benefits. Perhaps the greatest boon is the impressive rates the jobs pay. The fact that hundreds of Hadoop jobs are posted on Freelancer.com around the clock also makes the hiring process easy.

Hire Hadoop Consultants

    125 jobs found, pricing in USD
    setup hadoop cluster 5 days left
    VERIFIED

    I need a working Apache Hadoop cluster set up for my project.

    $89 (Avg Bid)
    9 bids

    I need screenshots of HiBench benchmark results for Hadoop and Spark: the WordCount workload, plus a Spark run that sums the integers in a file.

    $103 (Avg Bid)
    5 bids
    Tutor for Basics 4 days left

    I'm looking for a tutor or a Hadoop admin who can teach me the basics of Hadoop (HDFS, MapReduce, Hive, Hue, YARN, Spark, Kafka, Cassandra, Mongo, Linux, DBA, Java, networking, Active Directory, TLS, encryption). I don't need very deep insights; I just need an outline and someone who can patiently answer all my questions.

    $6 / hr (Avg Bid)
    13 bids

    I am a Hadoop consultant and from time to time would like to get troubleshooting help on an hourly basis. The skills I am looking for include Hadoop installation, AWS setup, Spark, Hive, and Spark SQL.

    $156 (Avg Bid)
    13 bids

    The project idea should be innovative and new.

    $88 (Avg Bid)
    4 bids

    Code execution on AWS Elastic MapReduce (EMR).

    $29 (Avg Bid)
    5 bids

    These are the steps to include: I have to read a JSON file from S3. Each record contains the attributes uid, ag, gn, yob, and scrtp. I have to use uid as the key and (ag, gn, yob, scrtp) as the value, stored in a HashMap collection. The Reducer will reduce by key; a hash value can also be calculated from the key, since I have 5 databases in MongoDB. I expect the results to be written to the 5 Mongo databases according to the hash value, and the writes should be bulk writes. How do I achieve the steps above? I need a person who can do this in Hadoop MapReduce in Java.

    $4 / hr (Avg Bid)
    6 bids
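The posting above describes a keyed shuffle followed by hash sharding. As a rough sketch of that logic in plain Python (the posting itself asks for Hadoop MapReduce in Java; the sample records, the toy hash function, and the batch shape here are all illustrative assumptions, and a real job would use MongoDB's bulk write API in the reducer):

```python
import json
from collections import defaultdict

NUM_DBS = 5  # the posting mentions 5 MongoDB databases

def map_record(line):
    """Map phase: key each JSON record by uid; the value is the other fields."""
    rec = json.loads(line)
    return rec["uid"], {k: rec[k] for k in ("ag", "gn", "yob", "scrtp")}

def shard_of(uid):
    """Pick a target database from the key (toy stable hash, an assumption)."""
    return sum(ord(c) for c in str(uid)) % NUM_DBS

def reduce_to_batches(lines):
    """Reduce phase: group values by uid, then bucket keys by shard so each
    shard's documents can go to its MongoDB database in one bulk write."""
    grouped = defaultdict(list)
    for line in lines:
        uid, vals = map_record(line)
        grouped[uid].append(vals)
    batches = defaultdict(list)  # shard index -> documents for one bulk write
    for uid, vals in grouped.items():
        batches[shard_of(uid)].append({"uid": uid, "values": vals})
    return batches

# Made-up sample records standing in for the S3 file.
lines = [
    '{"uid": "u1", "ag": 30, "gn": "m", "yob": 1994, "scrtp": "a"}',
    '{"uid": "u2", "ag": 25, "gn": "f", "yob": 1999, "scrtp": "b"}',
    '{"uid": "u1", "ag": 30, "gn": "m", "yob": 1994, "scrtp": "c"}',
]
batches = reduce_to_batches(lines)
```

Each entry of `batches` would then be flushed with one bulk write per database, which is where the posting's "bulk write" requirement fits in.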

    I need a real-time Hadoop developer; I will explain the project.

    $26 (Avg Bid)
    7 bids

    I need a real-time Hadoop developer; I will explain the project.

    $98 (Avg Bid)
    7 bids

    I want to apply the kNN algorithm to the attached input files using Hadoop MapReduce.

    $27 (Avg Bid)
    1 bid
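For context, kNN is commonly parallelized in MapReduce by having each mapper emit its split's local k nearest candidates, so the reducer only has to merge small lists into a global top k. A minimal Python simulation of that pattern, with made-up 2-D points and an arbitrary k (a real job would run as Hadoop Mapper/Reducer classes over HDFS splits):

```python
import heapq
import math

K = 3  # number of neighbours, chosen for illustration

def map_split(split, query):
    """Map phase: one mapper per input split, emitting the split's local
    K nearest (distance, label) candidates to the query point."""
    return heapq.nsmallest(K, ((math.dist(p, query), label) for p, label in split))

def reduce_candidates(candidate_lists):
    """Reduce phase: merge per-split candidates, keep the global top K,
    and majority-vote on the labels."""
    top = heapq.nsmallest(K, (c for lst in candidate_lists for c in lst))
    labels = [label for _, label in top]
    return max(set(labels), key=labels.count)

# Two "splits" standing in for HDFS blocks (illustrative data).
split1 = [((0.0, 0.0), "red"), ((1.0, 1.0), "red"), ((5.0, 5.0), "blue")]
split2 = [((0.5, 0.5), "red"), ((6.0, 6.0), "blue"), ((7.0, 7.0), "blue")]
query = (0.2, 0.2)
predicted = reduce_candidates([map_split(s, query) for s in (split1, split2)])
```

Emitting only local top-k candidates from each mapper keeps the shuffle small, which is the usual reason to structure kNN this way.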

    I want to apply the kNN algorithm to the attached input files using Hadoop MapReduce.

    $30 (Avg Bid)
    4 bids

    Video course on Big Data Hadoop. It would be a screen recording with voice-over, and should cover the course using practical examples. The recording will be approximately 10 hours.

    $533 (Avg Bid)
    11 bids

    Hi, I need a small project written in Java to clean data and make a prediction on one attribute from the given attributes using algorithms.

    $83 (Avg Bid)
    5 bids

    Hi, I need a Hadoop expert to help create a search engine program for Wikipedia data. You must take the Wikipedia XML data and format it, then create a MapReduce job and a ranking algorithm. After the job is run, the user must be presented with the top 10 results. Please bid if you have good Hadoop experience. Thanks.

    $1010 (Avg Bid)
    10 bids
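A MapReduce search engine of this kind typically builds an inverted index in the map/reduce phases and ranks documents at query time. A small Python sketch of the idea, using summed term frequency as a stand-in for a real ranking scheme such as TF-IDF (the document texts here are made up, not actual Wikipedia data):

```python
from collections import Counter, defaultdict

def map_doc(doc_id, text):
    """Map phase: emit (term, (doc_id, term_frequency)) pairs for one document."""
    counts = Counter(text.lower().split())
    return [(term, (doc_id, tf)) for term, tf in counts.items()]

def reduce_index(mapped_pairs):
    """Reduce phase: group postings by term to build the inverted index."""
    index = defaultdict(list)
    for term, posting in mapped_pairs:
        index[term].append(posting)
    return index

def top_results(index, query, n=10):
    """Score documents by summed term frequency of the query terms and
    return up to the top n, mirroring the posting's top-10 requirement."""
    scores = Counter()
    for term in query.lower().split():
        for doc_id, tf in index.get(term, []):
            scores[doc_id] += tf
    return [doc for doc, _ in scores.most_common(n)]

# Toy corpus standing in for parsed Wikipedia articles.
docs = {
    "Anarchism": "political philosophy movement",
    "Apache Hadoop": "hadoop distributed framework hadoop cluster",
    "MapReduce": "hadoop programming model",
}
pairs = [p for doc_id, text in docs.items() for p in map_doc(doc_id, text)]
index = reduce_index(pairs)
results = top_results(index, "hadoop cluster")
```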

    Using Java, take [url removed, login to view] as your starting template to process music data. Implement the Mappers to parse data from different data sets and generate intermediate outputs, which will then be further analyzed and aggregated by the Reducer to produce the desired results.

    $141 (Avg Bid)
    18 bids

    This project is to access Hadoop services (HDFS, Hive, HBase, YARN, and Impala) from an external Java program (one that runs outside the Hadoop cluster) and automate tasks, then integrate this project with other applications.

    $18 / hr (Avg Bid)
    11 bids

    Improve query performance through bucketing, using MapReduce or any other algorithm with Hive, and show the implementation in Hadoop on any data set. For example, I want to create a single bucket out of several different buckets to show improved performance compared with the existing system.

    $105 (Avg Bid)
    10 bids
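Bucketing improves such queries because a point lookup only needs to scan the one bucket its key hashes to, instead of the whole table. A toy Python illustration of that pruning effect, using an assumed stable hash in place of Hive's own bucketing hash:

```python
NUM_BUCKETS = 4  # illustrative; Hive sets this with CLUSTERED BY ... INTO n BUCKETS

def bucket_of(key, num_buckets=NUM_BUCKETS):
    """A row goes to hash(clustering column) mod bucket count (toy hash here)."""
    return sum(ord(c) for c in str(key)) % num_buckets

def build_buckets(rows):
    """Split (key, value) rows into fixed buckets, like a bucketed Hive table."""
    buckets = [[] for _ in range(NUM_BUCKETS)]
    for key, value in rows:
        buckets[bucket_of(key)].append((key, value))
    return buckets

def lookup(buckets, key):
    """A point query scans only the key's own bucket, not the full table."""
    return [v for k, v in buckets[bucket_of(key)] if k == key]

# Made-up rows standing in for a table clustered on its first column.
rows = [("alice", 1), ("bob", 2), ("carol", 3), ("alice", 4)]
buckets = build_buckets(rows)
rows_scanned = len(buckets[bucket_of("alice")])  # rows touched by one lookup
```

The same idea is why comparing bucketed against non-bucketed layouts on one data set, as the posting asks, shows a measurable difference: the scanned row count shrinks roughly by the bucket count.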

    Project Description: I am looking for a male expert. Skillset: Grid, Hadoop Hive Service, Java EE. As a Technology Architect, you will significantly contribute to identifying best-fit architectural solutions for one or more projects; develop application designs; and provide regular support/guidance to project teams on complex coding, issue resolution, and execution. You will collaborate with some of the best talent in the industry to create and implement innovative, high-quality solutions, and participate in Sales and various pursuits focused on our clients' business needs. You will be part of a learning culture, where teamwork and collaboration are encouraged, excellence is rewarded, and diversity is respected and valued.
    • In-depth experience in providing architecture solutions for Big Data and data-management projects.
    • Experience in solution architecture for delivering end-to-end solutions on the Hadoop platform.
    • Hands-on experience with distributed application architecture and implementation using MapR.
    • Hands-on experience with the Hadoop ecosystem, particularly Hive, HDFS, and Spark.
    • Experience in articulating and designing the security aspects of a MapR cluster.
    • Experience in setting specifications and reviewing Disaster Recovery and High Availability setups for a Hadoop cluster.
    • Experience in designing real-time and batch ingestion frameworks.
    • Experience in setting up enterprise-level data lake implementations as part of the ingestion framework.
    • Experience in designing consumption, compression, and storage patterns in MapR.
    • Knowledge of HBase, MapR-DB, and MapR FS.
    • Experience in performance tuning and cluster size estimation.
    • Java/Python experience and shell scripting experience.
    • Experience in big data job management through Oozie.
    • Experience in supporting pre-sales activities.
    • Excellent communication and analytical skills.
    • Experience with, and desire to work in, a global delivery environment.
    I will send the job description well in advance. There may be 2 phone interview rounds, and the funds will be transferred only if the interview is successful and the candidate is selected.

    $1228 (Avg Bid)
    14 bids

    Projects on Big Data/ML on AWS using technologies such as Spark, Kafka, Hadoop, Hive, GraphQL, NoSQL DBs, and migrating DBs to the cloud.

    $31 / hr (Avg Bid)
    26 bids

    This position is responsible for reviewing test cases and other testing artifacts to ensure the accuracy and completeness of all test artifacts. It coordinates testing, reports and tracks testing problems by providing reproducible test cases, works with other product team members to diagnose and recreate problems as well as prioritize and implement solutions, and closes problem reports. It is responsible for test-scripting data collection and analysis according to the project plan schedule, and for ensuring adherence to standard practices and procedures.
    Required skills:
    • Experience and proficiency in using HP test tools, including Test Director/Quality Center 9.0 or higher and Quick Test Professional (QTP) 9.5 or higher (Required, 5 years)
    • Experience testing web, GUI, client/server, and database applications (Required, 5 years)
    • Experience with requirements analysis and automated test case/script development (Required, 5 years)
    • Experience with manual or automated testing, testing tools, writing test plans, and reviewing test cases (Required, 5 years)
    • Strong analytical and problem-solving skills (Required)
    • Good oral and written communication skills, in order to interact on a daily basis with system developers, business analysts, and others (Required)
    • Ability to prioritize tasks within the project and work with minimal supervision or guidance (Required)
    • Ability to work in a rapidly changing environment (Required)
    • Stable work history (Required)
    • Experience with testing Curam products (Highly desired, 1 year)
    • Experience testing for a health and human services project (Highly desired, 1 year)
    • Experience with executing SQL to validate or test (Desired, 6 months)
    • Experience navigating on a UNIX platform (Desired, 6 months)

    $24 / hr (Avg Bid)
    5 bids
