I am looking for professionals who are very familiar with Python and big data processing
$250-750 USD
Paid on delivery
Looking for an Expert in:
· Building custom data pipelines in Python that clean, transform and aggregate data from many different sources
· Big Data Technologies such as: MapReduce, Hadoop, Spark, HBase, Hive, Elastic Search
· Data Structures and Data Processing Algorithms and Frameworks
· Data Migration, high throughput Data Pipelines
· Massively Parallel Processing using Python Tools
· Ability to analyze performance issues in Big Data environments
· Data Modelling, Data Transfer and Storage, Partitioning, Indexing and caching Techniques
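To illustrate the kind of pipeline work described above, here is a minimal sketch of a clean/transform/aggregate pipeline over two sources. All names and sample data are hypothetical, and pandas stands in for whichever DataFrame tooling the project actually uses.

```python
import io
import pandas as pd

# Hypothetical inputs: a CSV source and JSON-like records, standing in for
# the "many different sources" mentioned in the posting.
csv_src = io.StringIO("user,amount\nalice,10\nbob,5\nalice,7\n")
records = [{"user": "bob", "amount": 3}, {"user": "carol", "amount": 8}]

def build_pipeline(csv_file, json_records):
    """Clean, transform and aggregate rows from both sources."""
    a = pd.read_csv(csv_file)
    b = pd.DataFrame(json_records)
    df = pd.concat([a, b], ignore_index=True)   # combine sources
    df["user"] = df["user"].str.strip()         # clean: normalize keys
    df["amount"] = df["amount"].astype(float)   # transform: unify types
    # aggregate: total amount per user
    return df.groupby("user", as_index=False)["amount"].sum()

totals = build_pipeline(csv_src, records)
print(totals)
```

The same shape (extract from several sources, normalize, group-and-reduce) scales up to Spark or Hive jobs; only the execution engine changes.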
Well experienced with:
· Large-scale data modelling from a Big Data perspective
· Big Data Structures in Python
· PyData, Anaconda, NumPy, PyTables, DataFrames, Jupyter Notebook
· PyHive, PySpark
· JSON/Parquet Data formats
· Real-time streaming with either Spark Streaming or Kafka
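The streaming requirement boils down to windowed aggregation over an event stream. Below is a minimal pure-Python sketch of tumbling-window counts, standing in for the micro-batch logic Spark Streaming or a Kafka consumer would run; the event tuples and window size are illustrative assumptions.

```python
from collections import defaultdict

# Hypothetical event stream: (timestamp_seconds, key, value) tuples,
# standing in for messages consumed from a Kafka topic.
events = [(0, "clicks", 1), (2, "views", 4), (7, "clicks", 2), (11, "clicks", 5)]

def windowed_counts(stream, window=5):
    """Aggregate values per key over fixed (tumbling) time windows."""
    out = defaultdict(dict)
    for ts, key, val in stream:
        bucket = ts // window  # which tumbling window the event lands in
        out[bucket][key] = out[bucket].get(key, 0) + val
    return dict(out)

print(windowed_counts(events))
# -> {0: {'clicks': 1, 'views': 4}, 1: {'clicks': 2}, 2: {'clicks': 5}}
```

In Spark this maps to `groupBy(window(...), key).sum()`; the point here is the windowing and keyed reduction, not the engine.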
Good to have:
· Familiarity with PyPI
· Workflow Management Tools such as Luigi, Apache Airflow, Snowflow or similar
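What Luigi and Airflow provide, at their core, is dependency-ordered task execution over a DAG. A minimal sketch of that idea using only the standard library's `graphlib` (the task names are hypothetical):

```python
from graphlib import TopologicalSorter

# Hypothetical DAG: task -> set of upstream tasks it depends on,
# mirroring how Luigi tasks or an Airflow DAG declare dependencies.
dag = {
    "load": {"extract"},
    "transform": {"extract"},
    "report": {"load", "transform"},
}

# A valid execution order: every task runs after all of its dependencies.
order = list(TopologicalSorter(dag).static_order())
print(order)  # "extract" comes first, "report" last
```

Real workflow managers add scheduling, retries, and state on top, but the dependency resolution is the same topological ordering shown here.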
Project ID: #14511861
About the project
7 freelancers are bidding on average $546 for this job