GCP Data Engineer (6039) Milano/Torino/Roma
Posted 1010 days ago
- Nord-Ovest, Lombardia, Milano
- full-time
- Salary not specified
Job Description:
NTT DATA, Trusted Global Innovator, is one of the world's leading players in IT services. With more than 139,000 professionals in over 50 countries around the world, we are protagonists and accelerators of digital transformation, offering our clients innovative, tailor-made technological solutions.
People are the engine of NTT DATA, each with their own uniqueness, talent, and attitude. We have built a Smile Working Company in which caring for and listening to people, their well-being, and the development of their skills are our priorities. We have created workspaces that foster a sense of community and the constructive exchange of experiences. We look to our tomorrow with the same passion as yesterday, and we need your talent too!
Responsibilities
A Data Engineer is an IT expert who enables data-driven decision making by collecting, transforming, and publishing data. At NTT DATA, a Data Engineer should be able to design, build, operationalize, secure, and monitor data processing systems, with particular emphasis on security and compliance, scalability and efficiency, reliability and fidelity, and flexibility and portability. The main mission of a Data Engineer is to turn raw data into information, creating insight and business value.
Responsibilities
Build large-scale batch and real-time data pipelines with data processing frameworks on the GCP cloud platform (see the illustrative sketch after this list)
Use an analytical, data-driven approach to develop a deep understanding of a fast-changing business.
Work with the team to evaluate business needs and priorities, liaise with key business partners and address team needs related to data systems and management.
Participate in project planning: identify milestones, deliverables, and resource requirements; track activities and task execution
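For illustration only, here is a minimal sketch (not part of the posting) of the kind of batch pipeline this role involves: an Apache Beam job that can run on Dataflow, reading CSV files from Cloud Storage and loading them into BigQuery. The project, bucket, dataset, and schema names are placeholder assumptions.

# Hypothetical example; project, bucket, dataset, and schema names are placeholders.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def parse_line(line):
    # Turn a CSV line "user_id,event,timestamp" into a dict matching the BigQuery schema below.
    user_id, event, ts = line.split(",")
    return {"user_id": user_id, "event": event, "event_ts": ts}

def run():
    options = PipelineOptions(
        runner="DataflowRunner",        # use "DirectRunner" to test locally
        project="my-gcp-project",       # placeholder project id
        region="europe-west1",
        temp_location="gs://my-bucket/tmp",
    )
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadCSV" >> beam.io.ReadFromText("gs://my-bucket/raw/events-*.csv")
            | "Parse" >> beam.Map(parse_line)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "my-gcp-project:analytics.events",
                schema="user_id:STRING,event:STRING,event_ts:TIMESTAMP",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            )
        )

if __name__ == "__main__":
    run()

A streaming variant of the same pipeline would typically read from Pub/Sub and add windowing, while an ELT-style alternative would land the raw files in BigQuery first and transform them there with SQL.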
Required Skills
Bachelor's degree in Computer Science, Computer Engineering, or a related field
At least 2 years’ experience in a data engineering role
Expertise as a software engineer using Scala/Java/Python
Advanced SQL skills, preferably with BigQuery
Good knowledge of Google managed services such as Cloud Storage, BigQuery, Dataflow, Dataproc, and Data Fusion
Experience using workflow management tools
Good understanding of GCP architecture for batch and streaming
Strong knowledge of data technologies and data modeling
Expertise in building modern, cloud-native data pipelines and operations, with an ELT philosophy
Experience with data migration / data warehousing
An intuitive sense of how to organize, normalize, and store complex data, serving both ETL processes and end users
Passion for mapping and designing ingestion and transformation of data from multiple sources, creating a cohesive data asset
Good understanding of developer tools, CI/CD, etc.
Excellent communication skills and empathy with end users and internal customers.
Nice-to-have:
Google Cloud Data Engineer Certification
Experience with the Big Data ecosystem: Hadoop, Hive, HDFS, HBase
Experience with Agile methodologies and DevOps principles
Job number 151168