About the Role: Keka is seeking a highly skilled Data Engineer with expertise in Snowflake to join our dynamic data engineering team. The ideal candidate will design, build, and maintain scalable, efficient data pipelines that enable robust data analytics and business intelligence solutions. This role requires strong experience in cloud-based data warehousing, ETL processes, and performance optimization.
Key Responsibilities:
Design, develop, and maintain scalable data pipelines and integrations using Snowflake.
Build and optimize data warehouse architecture in Snowflake.
Develop ETL processes for data ingestion, transformation, and integration from multiple sources.
Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions.
Monitor and optimize Snowflake performance, ensuring data security and compliance.
Create and maintain documentation for data architecture, pipelines, and ETL processes.
Implement data quality checks and ensure data accuracy and reliability.
Troubleshoot and resolve data-related issues promptly.
Required Qualifications:
Bachelor's degree in Computer Science, Data Engineering, or a related field.
Proven experience as a Data Engineer with a focus on Snowflake.
Proficiency in SQL and programming languages such as Python, Java, or Scala for data manipulation.
Hands-on experience with ETL tools and data integration frameworks.
Familiarity with cloud platforms (e.g., AWS, Azure, or GCP).
2-6 years of experience in data engineering or related roles.
Experience with big data technologies such as Hadoop, Spark, or Kafka.
Experience with version control tools (e.g., Git).
Excellent problem-solving skills and attention to detail.
Preferred Qualifications:
Snowflake certification (e.g., SnowPro Core Certification).
Experience with data visualization tools (e.g., Tableau, Power BI).
Knowledge of DevOps practices and CI/CD pipelines.