Preclinical data knowledge, SQL, Redshift, ETL development.
Overview.
We are seeking a skilled ETL Developer / Data Engineer to join our team and contribute to critical data engineering projects.
This role focuses on building and managing data pipelines, ensuring high-quality data flow from various sources, and supporting automation across environments.
You will work closely with our data science and analytics teams to deliver end-to-end solutions.
Key Responsibilities.
Develop and maintain ETL processes to integrate and transform data from various sources.
Design, implement, and optimize data pipelines using tools like Apache Airflow and Jenkins to automate workflows and ensure efficient scheduling.
Utilize GitHub Actions for CI/CD pipelines, enabling streamlined code deployment across environments.
Write and optimize complex SQL queries for data extraction, transformation, and loading.
Build robust data transformation scripts and ETL workflows in Python.
Troubleshoot and debug ETL workflows, ensuring high availability, data accuracy, and performance.
Collaborate with stakeholders to gather requirements and translate them into technical solutions.
Maintain best practices for data governance and quality.
Skills.
ETL Development: Strong experience with ETL tools and techniques for building scalable, efficient data pipelines.
Jenkins: Proficiency in configuring and managing Jenkins for pipeline automation.
Apache Airflow: Experience building and managing DAGs to orchestrate complex workflows.
GitHub Actions: Hands-on experience setting up CI/CD workflows for ETL pipelines.
SQL: Advanced knowledge of SQL for data manipulation and ETL operations.
Python: Strong Python programming skills, particularly in data manipulation and workflow automation.