The Snowflake Solution Architect collaborates with data architects, analysts, and stakeholders to design scalable, high-performing data solutions on the Snowflake platform. This position enhances team effectiveness through high-quality, timely contributions while adhering to standardized procedures and practices to achieve objectives and meet deadlines, exercising discretion in problem-solving. The role is based in Bangalore, India, and reports to the Head of SAC Snowflake Engineering.
Design, develop, and maintain sophisticated data pipelines and ETL processes within Snowflake
Craft efficient and optimized SQL queries for seamless data extraction, transformation, and loading
Leverage Python for advanced data processing, automation tasks, and integration with various systems
Implement and manage data modelling techniques, including OLTP, OLAP, and Data Vault 2.0 methodologies
Oversee and optimize CI/CD pipelines using Azure DevOps to ensure smooth deployment of data solutions
Uphold data quality, integrity, and compliance throughout the data lifecycle
Troubleshoot, optimize, and enhance existing data processes and queries to boost performance
Document data models, processes, and workflows clearly for future reference and knowledge sharing
Employ advanced performance tuning techniques in Snowflake to optimize query performance and minimize data processing time
Develop and maintain DBT models, macros, and tests for efficient data transformation management in Snowflake
Manage version control using Git repositories, facilitating seamless code management and collaboration
Design, implement, and maintain automated CI/CD pipelines using Azure DevOps for Snowflake and DBT deployment processes
Who You Are:
Hold a Bachelor's or Master's degree in Computer Science, Information Technology, or a related field
5-7 years of proven experience as a Snowflake developer/architect or in a similar data engineering role
Extensive hands-on experience with SQL and Python, showcasing proficiency in data manipulation and analysis
Significant industry experience working with DBT (Data Build Tool) for data transformation
Strong familiarity with CI/CD pipelines, preferably in Azure DevOps
Deep understanding of data modelling techniques (OLTP, OLAP, DBT, Data Vault 2.0) and best practices
Experience with large datasets and performance tuning in Snowflake
Knowledge of data governance, data security best practices, and compliance standards
Familiarity with additional data technologies (e.g., AWS, Azure, GCP, Fivetran) is a plus
Experience in leading projects or mentoring junior developers is advantageous