Data Engineer - ETL/SQL (8-14 yrs)
Talentech Solutions
posted 7d ago
Job Description:
As part of the team, you will be responsible for building and running the data pipelines and services required to support business functions, reports, and dashboards. We rely heavily on BigQuery/Snowflake, Airflow, Stitch/Fivetran, dbt, and Tableau/Looker for our business intelligence, and we embrace AWS with some GCP.
As a Data Engineer, you'll be:
- Developing end-to-end ETL/ELT pipelines in collaboration with Data Analysts from the business functions
- Designing, developing, and implementing scalable, automated processes for data extraction, processing, and analysis in a Data Mesh architecture
- Mentoring other junior engineers in the team
- Being the "go-to" expert for data technologies and solutions
- Providing on-the-ground troubleshooting and diagnosis of architecture and design challenges
- Troubleshooting and resolving technical issues as they arise
- Looking for ways to improve both what the department delivers and how data pipelines are delivered
- Translating business requirements into technical requirements, such as entities that need to be modelled, dbt models that need to be built, timings, tests, and reports
- Owning the delivery of data models and reports end to end
- Performing exploratory data analysis to identify data quality issues early in the process and implementing tests to prevent them in the future
- Working with Data Analysts to ensure that all data feeds are optimised and available at the required times.
- This can include Change Data Capture (CDC) and other "delta loading" approaches
- Discovering, transforming, testing, deploying and documenting data sources
- Applying, helping to define, and championing data warehouse governance: data quality, testing, coding best practices, and peer review
What makes you a great fit:
- 3+ years of extensive development experience using Snowflake or a similar data warehouse technology
- Working experience with dbt and other modern data stack technologies, such as Snowflake, Apache Airflow, Fivetran, AWS, Git, and Looker
- Experience in agile processes, such as SCRUM
- Extensive experience in writing advanced SQL statements and performance tuning them
- Experience in data ingestion techniques using custom or SaaS tools like Fivetran
- Experience in data modelling and the ability to optimise existing and new data models
- Experience in data mining, data warehouse solutions, and ETL, and using databases in a business environment with large-scale, complex datasets
Functional Areas: Software/Testing/Networking