Encora Innovation Labs - ETL Developer - Data Mesh (7-9 yrs)
Role : ETL Developer (7+ Years)
Location : Bangalore (WFO Hybrid)
Notice Period : Immediate to 15 days for Lead; up to 30 days for Developer
Job Title : Senior ETL Developer - Data Mesh & Data Product Specialist
We are hiring a Senior ETL Developer to join our high-performance data engineering team. This is a mission-critical role where you will design, build, and maintain scalable ETL pipelines and frameworks in a Data Mesh architecture. You will work with modern tools like Python, dbt, BigQuery (GCP), and SQL to deliver high-quality data products that power decision-making across the organization.
We are looking for a highly skilled professional who thrives in demanding environments, takes ownership of their work, and delivers results with precision and reliability.
Key Responsibilities :
- Design, Build, and Maintain ETL Pipelines : Develop robust, scalable, and efficient ETL workflows to ingest, transform, and load data into distributed data products within the Data Mesh architecture.
- Data Transformation with dbt : Use dbt to build modular, reusable transformation workflows that align with the principles of Data Products.
- Cloud Expertise : Leverage Google Cloud Platform (GCP) services such as BigQuery, Cloud Storage, Pub/Sub, and Dataflow to implement highly scalable data solutions.
- Data Quality & Governance : Enforce strict data quality standards by implementing validation checks, anomaly detection mechanisms, and monitoring frameworks (an illustrative sketch follows this list).
- Performance Optimization : Continuously optimize ETL pipelines for speed, scalability, and cost efficiency.
- Collaboration & Ownership : Work closely with data product owners, BI developers, and stakeholders to understand requirements and deliver on expectations. Take full ownership of your deliverables.
- Documentation & Standards : Maintain detailed documentation of ETL workflows, enforce coding standards, and adhere to best practices in data engineering.
- Troubleshooting & Issue Resolution : Proactively identify bottlenecks or issues in pipelines and resolve them quickly with minimal disruption.
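For illustration only (not part of the original posting) : a minimal Python sketch of the kind of validation check mentioned in the data quality responsibility above, assuming an extracted batch arrives as a pandas DataFrame; the column names and thresholds are hypothetical, and a production pipeline would wire such checks into its monitoring framework.

    # Illustrative only: hypothetical column names and thresholds, not from this posting.
    import pandas as pd

    def validate_batch(df: pd.DataFrame) -> list[str]:
        """Return a list of data-quality issues found in an extracted batch."""
        issues = []
        # Required columns must be present before any transformation runs.
        for col in ("order_id", "order_date", "amount"):
            if col not in df.columns:
                issues.append(f"missing column: {col}")
        # Primary-key style check: order_id should be unique within a batch.
        if "order_id" in df.columns and df["order_id"].duplicated().any():
            issues.append("duplicate order_id values")
        # Tolerate at most 1% null amounts before flagging the batch.
        if "amount" in df.columns and df["amount"].isna().mean() > 0.01:
            issues.append("more than 1% null amounts")
        return issues

    # Example usage: surface problems before loading, so a bad batch fails fast.
    batch = pd.DataFrame({
        "order_id": [1, 2, 2],
        "order_date": ["2024-01-01", "2024-01-01", "2024-01-02"],
        "amount": [10.0, None, 5.0],
    })
    problems = validate_batch(batch)
    if problems:
        print("Data quality checks failed:", problems)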
Required Skills & Experience :
- Hands-on experience designing and implementing ETL workflows in large-scale environments : 10+ years for Lead, 7+ years for Developer.
- Advanced proficiency in Python for scripting, automation, and data processing.
- Expert-level knowledge of SQL for querying large datasets with performance optimization techniques.
- Deep experience working with modern transformation tools like dbt in production environments.
- Strong expertise in cloud platforms like Google Cloud Platform (GCP) with hands-on experience using BigQuery (see the sketch after this list).
- Familiarity with Data Mesh principles and distributed data architectures is mandatory.
- Proven ability to handle complex projects under tight deadlines while maintaining high-quality standards.
- Exceptional problem-solving skills with a strong focus on delivering results.
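For illustration only : a short Python sketch of a parameterized BigQuery query of the kind this role involves, using the google-cloud-bigquery client; the project, dataset, table, and column names are hypothetical, and filtering on an assumed partition column (order_date) is one common way to limit bytes scanned and keep query costs down.

    # Illustrative only: hypothetical project/dataset/table names, not from this posting.
    import datetime
    from google.cloud import bigquery

    client = bigquery.Client(project="example-project")  # assumes default credentials

    # Filtering on the (assumed) partition column order_date limits the bytes scanned.
    query = """
        SELECT customer_id, SUM(amount) AS total_amount
        FROM `example-project.sales.orders`
        WHERE order_date >= @start_date
        GROUP BY customer_id
    """
    job_config = bigquery.QueryJobConfig(
        query_parameters=[
            bigquery.ScalarQueryParameter("start_date", "DATE", datetime.date(2024, 1, 1)),
        ]
    )
    for row in client.query(query, job_config=job_config).result():
        print(row.customer_id, row.total_amount)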
What We Expect :
This is a demanding role that requires :
1. A proactive mindset : you take initiative without waiting for instructions.
2. A commitment to excellence : no shortcuts or compromises on quality.
3. Accountability : you own your work end-to-end and deliver on time.
4. Attention to detail : precision matters; mistakes are not acceptable.
Note : If shortlisted, you will be invited for the initial rounds on 1st March 2025 (Saturday) in Bengaluru.
Functional Areas: Software/Testing/Networking