You will be involved in one of the biggest data transformation journeys within our organization. As a data engineer, you will build data products within the Data Mesh concept, based on a defined target vision and requirements.
We welcome a variety of technical backgrounds, and we believe you will enjoy working here if you are passionate about data. In this role, you will implement data-intensive solutions for a data-driven organization.
You will join the Data Engineering Competence area within the AI (Artificial Intelligence), Analytics & Data Domain, and be an individual contributor in one of the data product teams. The area supports all our brands globally in creating, structuring, and guarding data, ensuring it is available, understandable, and of high quality.
Requirements
Experience with data query languages (SQL or similar), BigQuery, and common data formats (Parquet, Avro).
Take end-to-end responsibility for designing, developing, and maintaining the large-scale data infrastructure required for machine learning projects.
Apply a DevOps mindset and principles to manage CI/CD pipelines, Terraform, and cloud infrastructure; in our context, this is GCP (Google Cloud Platform).
Leverage your understanding of software architecture and software design patterns to write scalable, maintainable, well-designed, and future-proof code.
Work in a cross-functional agile team of highly skilled engineers, data scientists, and business stakeholders to build the AI ecosystem within our organization.