- Responsible for providing fast and reliable data solutions for warehousing, reporting, and Customer and Business Intelligence.
- Load data from various systems of record into our platform and make it available for further use.
- Automate deployment and test processes to deliver fast, incremental improvements to our application and platform.
- Implement data governance and protection to adhere to regulatory requirements and policies.
- Transform and combine data into a data model that supports our data analysts or can easily be consumed by operational databases.
- Keep hygiene, risk and control, and stability at the core of every delivery.
- Be a role model for the team.
- Work in an agile setup, providing feedback to improve our way of working.
Commercial Banking Tribe
- You'll be joining the Commercial Banking Tribe, which focuses on the special needs of small and medium enterprise clients in Germany, a designated area for further growth and investment within the Corporate Bank.
- We are responsible for the digital transformation of ~800,000 clients across 3 brands, i.e. the establishment of the BizBanking platform, including the development of digital sales and service processes as well as the automation of processes for this client segment.
- Our tribe is on a journey of extensive digitalisation of business processes and migration of our applications to the cloud. We work jointly with our business colleagues in an agile setup and collaborate closely with stakeholders and engineers from other areas, striving to achieve a highly automated and adaptable process and application landscape.
Your key responsibilities
- Design, develop, and deploy data processing pipelines and data-driven applications on GCP (see the pipeline sketch after this list).
- Write and maintain SQL queries and use data modeling tools like Dataform or dbt for data management.
- Write clean, maintainable code in Java and/or Python, adhering to clean code principles.
- Apply concepts of deployments and configurations in GKE/OpenShift, and implement infrastructure as code using Terraform.
- Set up and maintain CI/CD pipelines using GitHub Actions, write and maintain unit and integration tests.
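For illustration, here is a minimal sketch of the kind of Beam pipeline on GCP referred to in the list above, written in Python. The project id, region, bucket, dataset, and table names are assumptions chosen for the example, not details of the role.

```python
# Minimal Apache Beam sketch: read raw rows from BigQuery, drop incomplete
# records, and append the result to a curated BigQuery table.
# All identifiers (project, bucket, datasets, tables) are hypothetical placeholders.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    options = PipelineOptions(
        runner="DataflowRunner",                  # GCP-managed runner; use "DirectRunner" locally
        project="example-project",                # hypothetical GCP project id
        region="europe-west3",
        temp_location="gs://example-bucket/tmp",  # hypothetical staging bucket
    )
    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "ReadRaw" >> beam.io.ReadFromBigQuery(table="example-project:raw.events")
            | "DropIncomplete" >> beam.Filter(lambda row: row.get("client_id") is not None)
            | "WriteCurated" >> beam.io.WriteToBigQuery(
                "example-project:curated.events",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )


if __name__ == "__main__":
    run()
```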
Your skills and experience
- Bachelor's degree in Computer Science, Data Science, or related field, or equivalent work experience.
- Proven experience as a Data Engineer or Backend Engineer or similar role.
- Strong experience with cloud platforms such as GCP, Terraform, and GitHub Actions.
- Proficiency in SQL and Java and/or Python, and experience with tools and frameworks like Apache Beam, Spring Boot, and Apache Airflow (a minimal Airflow sketch follows this list).
- Familiarity with data modeling tools like Dataform or dbt, and experience writing unit and integration tests.
- Understanding of clean code principles and commitment to writing maintainable code.
- Excellent problem-solving skills, attention to detail, and strong communication skills.
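As a concrete illustration of the Airflow and dbt points above, a minimal DAG sketch in Python follows; the DAG id, task ids, commands, and the dbt project path are hypothetical and stand in for whatever the actual pipelines would use.

```python
# Minimal Airflow sketch: run a daily ingestion step, then a dbt build.
# DAG id, task ids, and paths are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="example_daily_load",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    ingest = BashOperator(
        task_id="ingest_source",
        bash_command="python /opt/pipelines/ingest.py",   # placeholder loading step
    )
    transform = BashOperator(
        task_id="dbt_build",
        bash_command="dbt build --project-dir /opt/dbt",  # assumed dbt project location
    )

    ingest >> transform  # run dbt transformations only after ingestion succeeds
```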
Employment Type: Full Time, Permanent