We are seeking a Data Engineer in Hyderabad (WFO) with expertise in data engineering, ETL, and Snowflake development. The role involves SQL scripting, performance tuning, Matillion ETL, and working with cloud platforms (AWS, Azure, or GCP). Proficiency in Python or other scripting languages, experience with API integrations, and knowledge of data governance are required. Snowflake certifications (SnowPro Core/Advanced) are preferred.
Requirements
Minimum of 5 years of experience in data engineering, ETL, and Snowflake development.
Proven expertise in Snowflake, including SQL scripting, performance tuning, and data warehousing concepts.
Hands-on experience with Matillion ETL for building and maintaining ETL jobs.
Strong understanding of cloud platforms (AWS, Azure, or GCP) and cloud-based data architectures.
Proficiency in SQL, Python, or other scripting languages for automation and data transformation.
Experience with API integrations and data ingestion frameworks.
Knowledge of data governance, security policies, and access control within Snowflake environments.
Excellent communication skills, with the ability to engage both business and technical stakeholders.
Self-motivated professional capable of working independently and delivering projects on time.
Skills Required:
Data Engineering & Snowflake: Expertise in data engineering, ETL processes, Snowflake development, SQL scripting, and performance tuning.
ETL & Cloud Platforms: Hands-on experience with Matillion ETL, cloud platforms (AWS, Azure, or GCP), and API integrations.
Programming & Data Governance: Proficiency in Python or other scripting languages, with knowledge of data governance, security policies, and access control.