Design, build, and maintain the data infrastructure and systems that support SKF VA's data needs. By applying skills in data modeling, data integration, data processing, data storage, data retrieval, and performance optimization, this role helps VA manage and use its data more effectively.
Key Responsibilities
Build a scalable, secure, and compliant VA data warehouse using Snowflake technologies. This includes designing and developing Snowflake data models.
Work with central data warehouses such as SDW, MDW, and OIDW to extract data and enrich it with VA-specific customer groupings, program details, etc.
Data integration: Responsible for integrating data from ERPs, BPC, and other systems into Snowflake and SKF's standard data warehouses, ensuring that data is accurate, complete, and consistent.
Performance optimization: Responsible for optimizing the performance of Snowflake queries and data-loading processes. This involves tuning SQL queries, creating indexes, and optimizing data-loading pipelines.
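As a rough illustration of the tuning work described above, the sketch below builds two common Snowflake patterns as SQL text: a clustering-key DDL for a large table and a pruning-friendly query that selects only needed columns and filters on the clustering key. The table and column names (SALES_FACT, ORDER_DATE, etc.) are hypothetical, not from this posting.

```python
# Hedged sketch of two Snowflake tuning patterns, expressed as SQL strings.
# All table/column names here are illustrative assumptions.

def clustering_ddl(table: str, keys: list) -> str:
    """Build an ALTER TABLE ... CLUSTER BY statement for a large table
    that is frequently filtered on the given columns."""
    return f"ALTER TABLE {table} CLUSTER BY ({', '.join(keys)})"

def pruned_query(table: str, columns: list, date_col: str,
                 start: str, end: str) -> str:
    """Select only the needed columns and filter on the clustering key,
    so Snowflake can prune micro-partitions instead of full-scanning."""
    cols = ", ".join(columns)
    return (f"SELECT {cols} FROM {table} "
            f"WHERE {date_col} BETWEEN '{start}' AND '{end}'")

print(clustering_ddl("SALES_FACT", ["ORDER_DATE", "CUSTOMER_GROUP"]))
print(pruned_query("SALES_FACT", ["ORDER_ID", "AMOUNT"],
                   "ORDER_DATE", "2024-01-01", "2024-03-31"))
```

The idea is simply that filtering on a clustered column and avoiding SELECT * are first-line optimizations before deeper warehouse-sizing work.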
Security and access management: Responsible for managing the security and access controls of the Snowflake environment. This includes configuring user roles and permissions, managing encryption keys, and monitoring access logs.
Technical metrics: data quality across the entire VA BU, data processing time, data storage capacity, and system availability.
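A minimal sketch of the data-quality side of these metrics: computing a completeness (null-rate) figure over a batch of loaded records. The field names and sample data are invented for illustration.

```python
# Hypothetical data-quality metric: fraction of records missing a field.

def null_rate(records, field):
    """Return the fraction of records where `field` is absent or None."""
    if not records:
        return 0.0
    missing = sum(1 for r in records if r.get(field) is None)
    return missing / len(records)

# Made-up sample batch, standing in for rows loaded into the warehouse.
batch = [
    {"customer_group": "OEM", "program": "A"},
    {"customer_group": None, "program": "B"},
    {"customer_group": "Distributor", "program": None},
    {"customer_group": "OEM", "program": "A"},
]

print(f"rows loaded: {len(batch)}")
print(f"customer_group null rate: {null_rate(batch, 'customer_group'):.2f}")  # 0.25
```

In practice such checks would run against Snowflake tables after each load and feed dashboards or alerts.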
Business metrics: data-driven decision making, data security and compliance, and cross-functional collaboration.
Competencies
Should have a good understanding of data modeling concepts and be familiar with Snowflake's data modeling tools and techniques.
Specialized in native data application development, with a proven track record in Snowflake, Snowpark packages, Streamlit, and more.
Experienced in creating streamlined and efficient data applications, leveraging modern open-source packages for user-interface development and Snowflake for seamless data integration.
SQL: Expert-level SQL skills; able to write complex SQL queries and optimize SQL performance in Snowflake.
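For a flavor of the aggregate SQL this competency covers, here is a small self-contained example. It uses Python's built-in sqlite3 rather than Snowflake (whose dialect differs in places), and the sales data is made up.

```python
# Illustrative aggregate query run against an in-memory SQLite database.
# Table, columns, and values are hypothetical; Snowflake syntax may differ.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (customer_group TEXT, amount REAL);
    INSERT INTO sales VALUES
        ('OEM', 100.0), ('OEM', 250.0),
        ('Distributor', 80.0), ('Distributor', 40.0);
""")

# Total sales per customer group, keeping only groups above a threshold.
rows = conn.execute("""
    SELECT customer_group, SUM(amount) AS total
    FROM sales
    GROUP BY customer_group
    HAVING total > 150
    ORDER BY total DESC
""").fetchall()

print(rows)  # [('OEM', 350.0)]
```

The same GROUP BY / HAVING pattern carries over directly to Snowflake, where tuning would additionally consider warehouse sizing and partition pruning.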
ETL/ELT: Should be familiar with Snowflake's ETL/ELT tools and techniques.
Should have a good understanding of cloud computing concepts and be familiar with the cloud infrastructure on which Snowflake operates.
Good understanding of data warehousing concepts and familiarity with Snowflake's data warehousing tools and techniques.
Familiar with data governance and security concepts
Able to identify and troubleshoot issues with Snowflake and SKF's data infrastructure.
Experience with Agile solution development
Good to have: knowledge of SKF ERP systems (XA, SAP, PIM, etc.) and of sales, supply chain, and manufacturing data.
Candidate Profile
Bachelor's degree in Computer Science, Information Technology, or a related field
5-8 years of overall experience, with a minimum of two years of experience in Snowflake