As part of the team, you will be responsible for building and running the data pipelines and services required to support business functions, reports, and dashboards.
We rely heavily on BigQuery/Snowflake, Airflow, Stitch/Fivetran, dbt, and Tableau/Looker for our business intelligence, and we run primarily on AWS with some GCP.
As a Data Engineer you'll be:
Developing end-to-end ETL/ELT pipelines in collaboration with the Data Analysts of each business function
Designing, developing, and implementing scalable, automated processes for data extraction, processing, and analysis in a Data Mesh architecture
Mentoring other junior engineers in the team
Being a go-to expert for data technologies and solutions
Providing on-the-ground troubleshooting and diagnosis of architecture and design challenges
Troubleshooting and resolving technical issues as they arise
Looking for ways to improve both what data pipelines the department delivers and how they are delivered
Translating business requirements into technical requirements, such as entities that need to be modelled, dbt models that need to be built, timings, tests, and reports
Owning the delivery of data models and reports end to end
Performing exploratory data analysis to identify data quality issues early in the process, and implementing tests to prevent them in the future
Working with Data Analysts to ensure that all data feeds are optimised and available at the required times; this can include Change Data Capture (CDC) and other delta-loading approaches (a sketch follows this list)
Discovering, transforming, testing, deploying and documenting data sources
Applying, helping to define, and championing data warehouse governance: data quality, testing, coding best practices, and peer review
Building Looker dashboards for use cases where required
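As a concrete illustration of the dbt work above, here is a minimal sketch of an incremental model implementing the delta-loading approach just mentioned. The `shop.orders` source, its columns, and the `updated_at` watermark are all hypothetical, not our actual schema.

```sql
-- models/fct_orders.sql -- hypothetical model; all names are illustrative
{{ config(
    materialized='incremental',  -- after the first run, only process new/changed rows
    unique_key='order_id'        -- lets dbt merge updated rows instead of duplicating them
) }}

select
    order_id,
    customer_id,
    order_total,
    updated_at
from {{ source('shop', 'orders') }}  -- assumed source, declared in a sources .yml file

{% if is_incremental() %}
  -- delta load: only pull rows changed since the last successful run
  where updated_at > (select max(updated_at) from {{ this }})
{% endif %}
```

Paired with generic schema tests such as `unique` and `not_null` on `order_id`, a model like this catches data quality issues early rather than in downstream reports.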
WHAT WE ARE LOOKING FOR:
- You have 4+ years of extensive development experience using Snowflake or a similar data warehouse technology
- You have working experience with dbt and other modern data stack technologies, such as Snowflake, Apache Airflow, Fivetran, AWS, Git, and Looker
- You have experience with agile processes such as Scrum
- You have extensive experience writing advanced SQL statements and performance-tuning them (see the example after this list)
- You have experience with data ingestion techniques using custom tooling or SaaS tools like Fivetran
- You have experience in data modelling and can optimise existing and new data models
- You have experience in data mining, data warehouse solutions, and ETL, and in using databases in a business environment with large-scale, complex datasets
- Experience architecting analytical databases (in a Data Mesh architecture) is an added advantage
- You have experience working in an agile, cross-functional delivery team
- You have high development standards, especially for code quality, code reviews, unit testing, continuous integration and deployment
- You have strong technical documentation skills and the ability to be clear and precise with business users
- You have business-level English and good communication skills
- You have a basic understanding of various systems across the AWS platform (good to have)
- Preferably, you have worked in a digitally native company, ideally fintech
- Experience with Python, a governance tool (e.g. Atlan, Alation, Collibra), or a data quality tool (e.g. Great Expectations, Monte Carlo, Soda) is an added advantage
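To illustrate the kind of SQL performance tuning we mean, the sketch below shows a common Snowflake pattern: deduplicating a raw table with QUALIFY so it is scanned once, instead of the self-join or wrapping subquery a classic "latest row" query would need. The `raw.events` table and its columns are hypothetical.

```sql
-- Keep only the most recently loaded row per event_id.
-- QUALIFY filters on the window function directly, so no self-join
-- or extra subquery pass over the data is required.
select
    event_id,
    user_id,
    event_payload,
    loaded_at
from raw.events  -- hypothetical raw landing table
qualify row_number() over (
    partition by event_id
    order by loaded_at desc
) = 1;
```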
Our Tech Stack:
- dbt
- Snowflake
- Airflow
- Fivetran
- SQL
- Looker