We are looking for a Big Data Engineer who can immediately join one of our clients in Milan.
The client is a company listed on the Italian Stock Exchange, present on the market with innovative solutions dedicated to digital transformation: Cloud, Big Data, Cybersecurity, Machine Learning, Artificial Intelligence, Industry 4.0, Internet of Things, and Augmented and Virtual Reality services.
Activity description
The Big Data Engineer is responsible for collecting, storing, processing and analysing large amounts of data.
The main objective will be to select and adopt the best solutions for these purposes, and then to implement, maintain and monitor them.
The engineer will also be responsible for integrating those solutions with the architecture used throughout the company.
Skills required
- Degree in Computer Science, Information Technology, or equivalent technical experience.
- At least 3 years of professional experience.
- Deep knowledge of and experience in statistics.
- Previous programming experience, preferably in Python or Java (experience with Kafka is also valued), and willingness to learn new languages.
- Skills in Hadoop v2, MapReduce and HDFS.
- Good knowledge of Big Data querying tools.
- Experience with Spark.
- Experience processing large amounts of data, both structured and unstructured, including the integration of data from different sources.
- Experience with NoSQL databases, such as Cassandra or MongoDB.
- Experience with various messaging systems, such as Kafka or RabbitMQ.
Duration of activity: 12-month contract
Commitment: full time
Working mode: remote
Start date: ASAP
Languages: Italian; English is definitely a plus
Hello! Big Data expert here.
Don't waste time: come to me.
I read your description carefully.
I have 8+ years of experience in software development, including desktop apps and web development.
I have used C++, Java, Python, React, Vue, Express JS and PHP, among others.
I am perfectly suited for this job.
Please contact me for further discussion.
Thanks.
Hi there,
My name is Jax, and I'm a Python engineer from China.
I was a crawler engineer for the first 5 years of my career, solving all kinds of tricky website scrapes, and I have extensive experience with Selenium and headless Chrome driver on Linux. I now work as a full-time data engineer at top-sports.
Below are the technical keywords I have used (self-rated 1-10):
Python (7)
MySQL (8)
PostgreSQL (6)
MSSQL (6)
Redis (5)
RabbitMQ (6)
Kafka (2)
Elastic (3)
Java (3)
JavaScript (5)
Hadoop ecosystem (3)
I also have some experience with cloud environments like AWS, GCP and Aliyun.
I'm applying for a part-time contract (under 20 hours/week), possibly switching to full-time later if both sides want closer cooperation.
Maybe we can start with small trial jobs to test my skills.
Best,
Jax
Hi, it is easy; I can do it on time. I work online, so you can track the progress of your project. I have 6 years of experience in development (websites, web applications, mobile apps, desktop applications, UI/UX) using PHP, WordPress, Java, Python, JavaScript, ReactJS and Bootstrap. I can start now; please come to chat to discuss more.
Hey Team,
Hope everything's great!
I have read the requirement and the 12-month project commitment. I have 4 years of overall experience in Python, Big Data (Hive, HBase, Sqoop, PySpark), data analysis and ETL migration.
I have worked with raw files and different databases (Hive, HBase, SQL Server, MongoDB), implemented ETL pipelines that fetch data from a source, transform it, run a few calculations and store the result in Hive, and later designed Tableau dashboards for data analysis.
Currently I'm working on finance data, building a framework to load raw data into SQL Server and MongoDB, and designing an application in the Python Django framework.
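For a concrete idea of the extract-transform-load pattern I describe above, here is a minimal pure-Python sketch (the column names and the 10% tax figure are made up for illustration; the real projects used PySpark and Hive):

```python
import csv
import io

def extract(raw_csv: str):
    """Extract: parse raw CSV text into a list of row dicts."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(rows):
    """Transform: cast amounts to float and derive a 10% tax column."""
    out = []
    for r in rows:
        amount = float(r["amount"])
        out.append({
            "id": r["id"],
            "amount": amount,
            "amount_with_tax": round(amount * 1.10, 2),
        })
    return out

def load(rows, store):
    """Load: write rows keyed by id (a stand-in for a Hive/SQL table)."""
    for r in rows:
        store[r["id"]] = r
    return store

# Illustrative raw input standing in for a source file
raw = "id,amount\n1,100\n2,250\n"
warehouse = load(transform(extract(raw)), {})
```

The same fetch/transform/store split maps directly onto the Spark jobs I built, with the dict replaced by a Hive table.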
Let's discuss further over chat.
Thanks & Regards,
Rahul Toggi
Hi, I am a Full Stack Developer and Software Architect with 14 years of experience in web application and Big Data application development. Let's get your work done with quality. I am proficient in Apache Kafka, Elasticsearch, MySQL and Oracle.