We are seeking a Data Pipeline Engineer to join our cloud and data engineering team, a key part of the Data Pillar supporting the Contact Center. Our team's primary focus is building and managing public cloud infrastructure on Azure and developing robust data pipelines. These pipelines move data from various source systems into Azure Data Lake Storage (ADLS) and Snowflake on Azure, ensuring seamless data integration and availability for advanced analytics and reporting.
Responsibilities:
- Designing and building data pipelines end to end, from prototyping and functional design through coding and testing.
- Applying data engineering principles to analyze, problem-solve, and design solutions.
- Designing and developing test cases.
- Adopting Agile methodologies to develop data pipelines.
- Analyzing existing systems to provide recommendations for improvement.
- Ensuring proper documentation.
- Monitoring system performance and performing predictive maintenance.
- Performing system risk and reliability analysis.
- Creating data pipelines to ingest data from source systems (e.g., Kafka, Cloud Storage, Snowflake) into Azure storage accounts and Snowflake.
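To illustrate the kind of pipeline described above, here is a minimal, hedged sketch of the extract/stage/load pattern. It uses only the standard library: a local directory stands in for an ADLS staging container and an in-memory SQLite table stands in for a Snowflake target, since the real Kafka, ADLS, and Snowflake clients are environment-specific. All function names (`extract_records`, `stage_raw`, `load_to_warehouse`) are illustrative, not part of any real SDK.

```python
import json
import sqlite3
import tempfile
from pathlib import Path

def extract_records():
    # Placeholder source: a real pipeline would consume from Kafka,
    # cloud storage, or another upstream system.
    return [{"id": 1, "event": "call_started"}, {"id": 2, "event": "call_ended"}]

def stage_raw(records, staging_dir: Path) -> Path:
    # Land the raw payload as newline-delimited JSON, one file per batch
    # (analogous to writing a blob into an ADLS container).
    path = staging_dir / "batch_0001.jsonl"
    path.write_text("\n".join(json.dumps(r) for r in records))
    return path

def load_to_warehouse(staged_file: Path, conn: sqlite3.Connection) -> int:
    # Load the staged file into a warehouse table (SQLite stands in
    # for Snowflake here) and return the row count for monitoring.
    conn.execute("CREATE TABLE IF NOT EXISTS events (id INTEGER, event TEXT)")
    rows = [(r["id"], r["event"])
            for r in map(json.loads, staged_file.read_text().splitlines())]
    conn.executemany("INSERT INTO events VALUES (?, ?)", rows)
    conn.commit()
    return len(rows)

staging = Path(tempfile.mkdtemp())
conn = sqlite3.connect(":memory:")
staged = stage_raw(extract_records(), staging)
loaded = load_to_warehouse(staged, conn)
print(loaded)  # 2
```

In production, the staging step would typically write via the Azure SDK and the load step would use Snowflake's bulk-load path rather than row-by-row inserts, but the extract/stage/load separation stays the same.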
Requirements:
- 3+ years of Python development experience.
- 3-4 years of data engineering experience, with strong expertise in Azure Data Factory and Databricks.
- 2+ years of experience ingesting data into Snowflake.
- Proficiency w
Employment Type: Full Time, Permanent
Functional Areas: Software/Testing/Networking