Location: PAN India (Noida, Pune, Hyderabad, Bangalore, Chennai)

Mandatory skills: Snowflake, SQL, dimensional data modeling, stored procedures.

Qualifications:
- Strong proficiency in SQL and Snowflake SQL.
- In-depth understanding of ETL concepts and data warehousing principles.
- Experience with dimensional data modeling and data warehousing best practices.
- Proven ability to write efficient and maintainable stored procedures.
- Experience with Azure Data Factory (ADF) or other data integration tools.
- Strong problem-solving and analytical skills.
- Excellent communication and collaboration skills.
- Ability to work independently and as part of a team.
- A passion for data and a desire to learn new technologies.

Responsibilities:
- Data Modeling and Engineering: Design, develop, and maintain robust dimensional data models (star and snowflake schemas) that transform raw data into a consumable data warehouse layer.
- Stored Procedure Development: Create efficient Snowflake stored procedures to automate data ingestion, transformation, and movement across the layers of a medallion architecture (see the sketch after this list).
- ADF Integration: Integrate Snowflake with Azure Data Factory (ADF) to orchestrate and automate data pipelines, ensuring reliable data flow and timely delivery.
- SQL Proficiency: Write complex SQL queries to analyze, manipulate, and extract insights from large datasets, using advanced SQL constructs such as joins, aggregations, and window functions (see the example after this list).
- Snowflake Expertise: Apply advanced Snowflake features to balance performance and cost-effectiveness.
- Quality Assurance: Implement data quality checks and monitoring to ensure data accuracy and consistency.
- Performance Optimization: Identify and resolve performance bottlenecks, tune queries, and apply clustering strategies (Snowflake's counterpart to traditional indexing) to improve query execution times.
- Collaboration: Work closely with data analysts, business analysts, and other stakeholders to understand their data needs and translate them into technical solutions.
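
For candidates wondering what the stored-procedure work looks like in practice, here is a minimal sketch of a Snowflake Scripting procedure that promotes rows from a bronze (raw) layer to a silver (cleansed) layer. All database, table, and column names (BRONZE.ORDERS_RAW, SILVER.ORDERS, etc.) are illustrative assumptions, not a real schema.

    -- Minimal sketch: promote new rows from an assumed bronze table
    -- to an assumed silver table in a medallion architecture.
    CREATE OR REPLACE PROCEDURE SILVER.LOAD_ORDERS()
    RETURNS VARCHAR
    LANGUAGE SQL
    AS
    $$
    DECLARE
      rows_loaded INTEGER DEFAULT 0;
    BEGIN
      -- Upsert cleansed rows; MERGE keeps the load idempotent on reruns.
      MERGE INTO SILVER.ORDERS AS tgt
      USING (
          SELECT order_id,
                 TRY_TO_DATE(order_date_raw) AS order_date,  -- tolerate bad dates
                 TRIM(customer_id)           AS customer_id,
                 amount
          FROM BRONZE.ORDERS_RAW
          WHERE order_id IS NOT NULL                         -- basic quality gate
      ) AS src
      ON tgt.order_id = src.order_id
      WHEN MATCHED THEN UPDATE SET
          tgt.order_date  = src.order_date,
          tgt.customer_id = src.customer_id,
          tgt.amount      = src.amount
      WHEN NOT MATCHED THEN INSERT (order_id, order_date, customer_id, amount)
          VALUES (src.order_id, src.order_date, src.customer_id, src.amount);

      rows_loaded := SQLROWCOUNT;  -- rows affected by the MERGE
      RETURN 'Loaded ' || rows_loaded || ' rows into SILVER.ORDERS';
    END;
    $$;

A procedure like this would typically be invoked on a schedule, for example with CALL SILVER.LOAD_ORDERS(); from an ADF pipeline or a Snowflake task, which is the kind of orchestration the ADF Integration responsibility refers to.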
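The advanced-SQL expectation amounts to comfort with queries like the following sketch, which ranks customers by revenue within each region over an assumed star-schema fact/dimension pair; again, all table and column names are hypothetical.

    -- Illustrative only: top 3 customers by revenue per region,
    -- joining an assumed fact table to an assumed customer dimension.
    SELECT *
    FROM (
        SELECT d.region,
               d.customer_name,
               SUM(f.amount) AS revenue,
               RANK() OVER (PARTITION BY d.region
                            ORDER BY SUM(f.amount) DESC) AS revenue_rank
        FROM FACT_ORDERS f
        JOIN DIM_CUSTOMER d ON f.customer_key = d.customer_key
        GROUP BY d.region, d.customer_name
    )
    WHERE revenue_rank <= 3;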