DBT Engineer - ETL/Talend (5-8 yrs)
Coders Brain
Flexible timing
Position : DBT Engineer (Contract-to-Hire)
Location : Remote
Experience Required : 5+ years
Employment Type : Contract-to-Hire (C2H)
Job Responsibilities :
Snowflake-Based Analytics Solution Design :
- Assist in the design and implementation of a Snowflake-based analytics solution for data lakes and data warehouses on AWS.
- Define requirements, perform source data analysis and profiling, and work on logical and physical design of data lakes and data warehouses.
- Design data integration and publication pipelines.
Snowflake Deployment and Best Practices :
- Develop best practices for Snowflake deployment and usage.
- Educate the team on the capabilities and limitations of Snowflake.
Data Pipeline Development & Management :
- Build, maintain, and optimize data pipelines for data ingestion, transformation, and storage using DBT and AWS services like S3.
- Create and maintain ETL pipelines from various data sources using tools like Talend, DBT, S3, and Snowflake.
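The ingestion-transformation-load flow described above can be sketched in miniature. This is a conceptual sketch only, not the actual pipeline: sqlite3 stands in for Snowflake, and the `orders` table, column names, and sample data are all hypothetical; a real pipeline would use Talend or DBT with the Snowflake connector and S3 staging.

```python
import csv
import io
import sqlite3

# Hypothetical raw extract, as it might arrive from a source system via S3.
raw_csv = """order_id,amount,currency
1001, 25.50 ,usd
1002, 10.00 ,USD
"""

def extract(text):
    """Extract: parse the raw CSV into dictionaries."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: trim whitespace, cast types, normalize currency codes."""
    return [
        (int(r["order_id"]), float(r["amount"].strip()), r["currency"].strip().upper())
        for r in rows
    ]

def load(conn, rows):
    """Load: upsert the cleaned rows into the warehouse table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders "
        "(order_id INTEGER PRIMARY KEY, amount REAL, currency TEXT)"
    )
    conn.executemany("INSERT OR REPLACE INTO orders VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")  # stand-in for the Snowflake target
load(conn, transform(extract(raw_csv)))
```

Each stage is a separate function so it can be tested and re-run independently, which is the same separation of concerns a Talend job or a DBT model graph enforces at scale.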
Collaboration with Cross-functional Teams :
- Work closely with internal and external stakeholders, including data architects, data scientists, and data analysts, to resolve technical issues.
- Act as a technical leader within the team.
Agile Methodology :
- Work in an Agile/Lean model to ensure timely delivery of high-quality solutions.
- Collaborate with teams across different time zones to ensure smooth project execution.
Requirements Translation :
- Translate complex business requirements into technical solutions to improve the efficiency of the data processing system.
Data Modeling & Management Systems :
- Build, test, and maintain data management systems ensuring they are scalable and optimized.
- Ensure data governance and quality standards are met.
Essential Skills, Education, and Experience :
Education :
- B.E., B.Tech, MCA, or equivalent degree.
Experience :
- 5+ years of experience in Data Engineering, with a strong focus on DBT and AWS environments.
Technical Skills :
- DBT Concepts : Experience with model building, configurations, incremental load strategies, macros, and DBT tests.
- Advanced SQL : Proficient in writing complex SQL queries for data processing and management.
- AWS Experience : Strong hands-on experience in managing and deploying data pipelines on AWS (including S3, Lambda, Redshift, etc.).
- Snowflake Experience : Solid experience with Snowflake, including design, development, and best practices.
- Data Pipeline Architecture : Experience in creating and maintaining efficient data pipelines for large-scale data ingestion and processing.
- ETL Tools : Familiarity with tools like Talend and DBT, and with AWS S3 for data storage.
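The incremental load strategy mentioned under DBT Concepts can be illustrated with a minimal sketch. This is the high-water-mark pattern that DBT incremental models automate with the `is_incremental()` filter; sqlite3, the table names, and the `loaded_at` column are illustrative assumptions, not part of the role's actual stack.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- Hypothetical source and target tables with a load timestamp.
    CREATE TABLE src_events (id INTEGER, loaded_at TEXT);
    CREATE TABLE tgt_events (id INTEGER, loaded_at TEXT);
    INSERT INTO src_events VALUES (1, '2024-01-01'), (2, '2024-01-02'), (3, '2024-01-03');
    INSERT INTO tgt_events VALUES (1, '2024-01-01');  -- already loaded
""")

def incremental_load(conn):
    # Only append rows newer than the target's high-water mark --
    # the same filter a DBT incremental model expresses as
    # WHERE loaded_at > (SELECT MAX(loaded_at) FROM {{ this }}).
    conn.execute("""
        INSERT INTO tgt_events
        SELECT id, loaded_at FROM src_events
        WHERE loaded_at > (SELECT COALESCE(MAX(loaded_at), '') FROM tgt_events)
    """)
    conn.commit()

incremental_load(conn)
```

Running `incremental_load` a second time adds nothing, because the high-water mark has advanced; that idempotence is what makes incremental models safe to re-run on a schedule.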
Soft Skills :
- Strong communication skills, with the ability to collaborate with teams across different time zones.
- Problem-solving mindset and a desire to learn new technologies.
- Ability to mentor and guide team members on DBT best practices.
Nice to Have Skills :
- AWS Data Services : Development experience with AWS data services such as Redshift and Lambda.
- Big Data Technologies : Knowledge of Big Data tools and technologies.
- Data Quality & Governance : Exposure to working with data quality and data governance teams.
- Reporting Tools : Familiarity with reporting tools like Tableau.
- Airflow & Kafka : Experience with Apache Airflow for workflow orchestration and Apache Kafka for stream processing.
- Domain Knowledge : Understanding of Payments, CRM, Accounting, or Regulatory Reporting domains.
Key Result Areas (KRAs) :
Data Pipeline Delivery :
- Design, build, and maintain efficient data pipelines adhering to best practices for scalability and reliability.
Technical Leadership :
- Provide technical leadership within the team, ensuring that solutions are designed and implemented effectively.
Timely Delivery :
- Deliver high-quality solutions on time while maintaining strong collaboration with cross-functional teams.
Data Quality & Integrity :
- Ensure that data quality and integrity standards are met, and that pipelines are efficient, robust, and compliant.