Data Engineer - SQL/ETL (4-7 yrs)
Cenit Hub
posted 7d ago
Flexible timing
About the job:
Role: Data Engineer
Essential Skills:
- Working knowledge of DevOps practices and experience automating manual processes.
- Extensive experience with SQL and with establishing source control and release structures for databases.
- Experience working with large volumes of data.
- Experience supporting multi-tier, consumer-facing web applications at more than just the UI level.
- Experience building batch/streaming ETL processes and proactively monitoring for and capturing potential issues.
- Broad experience designing and maintaining automated tests for white-box and black-box testing.
- Experience integrating tests and new technologies into existing CI frameworks.
- Knowledge of various Agile ways of working (Scrum/Kanban) and their benefits.
- Working knowledge of DevOps tooling and Git repositories.
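The automated-testing and CI skills above can be illustrated with a minimal sketch: a white-box unit test that exercises a SQL transformation against an in-memory SQLite database, so the test can run inside any CI framework without external infrastructure. The table name, columns, and dedupe logic here are hypothetical examples, not from the posting.

```python
import sqlite3

def dedupe_customers(conn):
    # Hypothetical ETL transform: keep only the latest row per customer_id.
    conn.execute("""
        DELETE FROM customers
        WHERE rowid NOT IN (
            SELECT MAX(rowid) FROM customers GROUP BY customer_id
        )
    """)
    conn.commit()

def test_dedupe_customers():
    # White-box test: run the real SQL against an in-memory database.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE customers (customer_id INTEGER, name TEXT)")
    conn.executemany(
        "INSERT INTO customers VALUES (?, ?)",
        [(1, "old"), (1, "new"), (2, "only")],
    )
    dedupe_customers(conn)
    rows = sorted(conn.execute("SELECT customer_id, name FROM customers"))
    # Only the most recent row per customer survives.
    assert rows == [(1, "new"), (2, "only")]

test_dedupe_customers()
print("ok")
```

Because the test needs nothing but the standard library, it drops into an existing CI pipeline (e.g. as a pytest case) with no setup.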
In This Role, You Will:
- Develop and maintain scalable data pipelines and build out new API integrations to support continuing increases in data volume and complexity.
- Collaborate with analytics and business teams to improve the data models that feed business intelligence tools, increasing data accessibility and fostering data-driven decision making across the organization.
- Implement processes and systems to monitor data quality, ensuring production data is always accurate and available for the key stakeholders and business processes that depend on it.
- Write unit/integration tests, contribute to the engineering wiki, and document work.
- Perform the data analysis required to troubleshoot data-related issues and assist in their resolution.
- Work closely with a team of frontend and backend engineers, product managers, and analysts.
- Define company data assets (data models) and the Spark, Spark SQL, and Hive SQL jobs that populate them.
- Design data integrations and a data quality framework.
- Work closely with all business units and engineering teams to develop a strategy for long-term data platform architecture.
- Develop and maintain data pipelines for efficient data extraction, transformation, and loading (ETL) processes.
- Work on data transformations and processing using cloud data integration tools.
- Develop and maintain data models within cloud databases such as Snowflake and related tools to support reporting, analytics, and business intelligence needs.
- Collaborate with cross-functional teams to understand data requirements and design appropriate data integration solutions.
- Integrate data from various sources, both internal and external, ensuring data quality and consistency.
- Ensure data models are designed for scalability, reusability, and flexibility.
- Implement data quality checks, validations, and monitoring processes to ensure data accuracy and integrity across cloud environments.
- Adhere to data governance standards and best practices to maintain data security and compliance.
- Handle performance optimization in cloud environments.
- Maintain comprehensive documentation for data pipelines, processes, and architecture within both Azure and Snowflake environments, including best practices, standards, and procedures.
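As a minimal sketch of the extract/transform/load and data-quality responsibilities listed above, the following standalone example parses a small CSV source, rejects malformed rows, runs validation checks, and loads the result into SQLite. All file, table, and column names are made up for illustration; a production pipeline would target cloud platforms such as Snowflake and log rejected rows for monitoring.

```python
import csv
import io
import sqlite3

# Hypothetical source data standing in for an extracted CSV file.
RAW_CSV = """order_id,amount
1,19.99
2,5.00
3,
"""

def extract(text):
    # Extract: parse CSV rows into dicts.
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    # Transform: cast types, dropping rows that fail conversion.
    clean = []
    for row in rows:
        try:
            clean.append((int(row["order_id"]), float(row["amount"])))
        except (ValueError, TypeError):
            continue  # a real pipeline would quarantine and log these
    return clean

def validate(raw_rows, clean_rows):
    # Data-quality checks: no total loss, no null amounts.
    assert clean_rows, "all rows rejected -- investigate the source"
    assert all(amount is not None for _, amount in clean_rows)
    return len(raw_rows) - len(clean_rows)  # rows dropped, for monitoring

def load(conn, rows):
    # Load: write validated rows to the warehouse table.
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders (order_id INTEGER, amount REAL)"
    )
    conn.executemany("INSERT INTO orders VALUES (?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
raw = extract(RAW_CSV)
clean = transform(raw)
dropped = validate(raw, clean)
load(conn, clean)
print(f"loaded {len(clean)} rows, dropped {dropped}")
```

The dropped-row count returned by `validate` is the kind of signal a monitoring system would alert on, in line with the quality-check and monitoring duties described above.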
Functional Areas: Software/Testing/Networking