Responsible for service operation and end-to-end delivery.
A strong functional approach is required, as this role involves regular interaction with business users and key users.
Experience in DevOps and familiarity with Docker and GitLab for CI/CD
Always approach new tools from a self-help, self-administered perspective.
Build up deep knowledge of the Snowflake model within DAsh.
Accommodate changing requirements and new mappings
Set up agile data replication and robust data service models.
Establish and promote a Data-as-a-Service (DaaS) culture with an easy user experience.
Proactively get involved in broader organizational management discussions and initiatives.
Excellent management and mentoring skills.
Organizational/planning skills with the ability to coordinate and manage multiple complex projects and organize workloads in a structured way within tight deadlines.
Cross-functional collaboration.
Flexibility to work with people in different time zones, e.g., Europe.
Critical thinking skills and ability to work under stress.
An organized approach to problem solving, with good decision-making skills.
Multi-tasking skills and ability to meet deadlines.
Agile Practitioner.
Identify the technology stack and tools that best meet the customer's requirements
Design end-to-end solutions and the supporting data strategy, including standards, principles, data sources, storage, pipelines, data flow, and data security policies (a minimal Snowflake sketch of such a flow follows this list)
Define the right security and access policies
Collaborate with data engineers, data scientists, and other stakeholders to complete the data strategy
Perform data analysis to understand the relevant data sets
Implement Snowflake standard methodologies
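For illustration of the kind of end-to-end design described above, the following sketch wires a stage, a Snowpipe, a stream, and a scheduled task into a minimal continuous-ingestion flow. It is a sketch only, not a prescribed implementation: the bucket URL, storage integration, warehouse, and all table and object names are hypothetical placeholders.

-- Minimal continuous-ingestion sketch; all names are placeholders.
CREATE OR REPLACE STAGE raw_stage
  URL = 's3://example-bucket/sales/'        -- hypothetical bucket
  STORAGE_INTEGRATION = s3_int              -- assumes an existing storage integration
  FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);

CREATE OR REPLACE TABLE raw_sales (id NUMBER, amount NUMBER, loaded_at TIMESTAMP_NTZ);
CREATE OR REPLACE TABLE curated_sales LIKE raw_sales;

-- Snowpipe auto-ingests new files as they land in the stage.
CREATE OR REPLACE PIPE sales_pipe AUTO_INGEST = TRUE AS
  COPY INTO raw_sales (id, amount, loaded_at)
  FROM (SELECT $1, $2, CURRENT_TIMESTAMP() FROM @raw_stage);

-- Stream captures changes on the raw table for incremental processing.
CREATE OR REPLACE STREAM raw_sales_stream ON TABLE raw_sales;

-- Task periodically merges newly arrived rows into the curated layer.
CREATE OR REPLACE TASK merge_sales
  WAREHOUSE = transform_wh
  SCHEDULE = '5 MINUTE'
WHEN SYSTEM$STREAM_HAS_DATA('RAW_SALES_STREAM')
AS
  MERGE INTO curated_sales t
  USING raw_sales_stream s ON t.id = s.id
  WHEN MATCHED THEN UPDATE SET t.amount = s.amount, t.loaded_at = s.loaded_at
  WHEN NOT MATCHED THEN INSERT (id, amount, loaded_at) VALUES (s.id, s.amount, s.loaded_at);

ALTER TASK merge_sales RESUME;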
What You Bring / Skills, Capabilities
A total of 6-8 years of intensive hands-on experience with different ETL tools and platforms, including a minimum of 4 years with Snowflake.
In-depth working knowledge of Snowflake SQL Scripting.
Working knowledge of Python and PL/SQL, with good knowledge of creating stored procedures (a small procedure sketch follows this list).
Good knowledge of Snowflake architecture concepts - cloud services, virtual warehouses, and storage
Good exposure to performance tuning and optimization on Snowflake.
Expert data modeling skills on relational databases, especially for data warehousing
Experience in data warehousing and planning
Knowledge of building end-to-end data pipelines.
Master's or bachelor's degree in computer science or IT.
Experience in building data ingestion pipelines using AWS, Azure, or GCP data services.
Knowledge of Control-M, Airflow, or any other job scheduler
Solid understanding of database and data warehouse concepts
Solid experience in migration projects, migrating from on-premises systems to Snowflake
Extensive knowledge of Snowflake capabilities such as Snowpipe, SnowSQL, Streams, and Tasks
Knowledge of Snowflake AI/ML capabilities such as Cortex Search, Cortex Analyst, Document AI, and Cortex LLM Functions.
Experience in architecture and awareness of Snowflake roles, user security, and the RBAC model (an RBAC sketch follows this list)
In-depth knowledge and experience in data migration from RDBMS to Snowflake cloud data warehouse
Experience in implementing Snowflake integration with modern data stack tools
Well-versed in incremental extraction and loading, both batch and streaming
Ability to design data architecture solutions aligned with the strategic technology roadmap and emerging industry trends
Strong exposure to Data Modelling, Data Access Patterns, and SQL
Knowledge of cost management, infrastructure planning, and disaster recovery
Good experience in Data Governance and a good understanding of DevOps practices
Understanding of Data Virtualization
Experience working on big data platforms
Effective communication (verbal and written) and interpersonal skills
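As a small illustration of the Snowflake SQL Scripting and stored-procedure skills listed above, here is a minimal sketch of a scripting procedure. The procedure, table, and column names are hypothetical, and the retention logic is only an example.

-- Hypothetical Snowflake Scripting procedure; all names are placeholders.
CREATE OR REPLACE PROCEDURE purge_stale_rows(retention_days NUMBER)
RETURNS STRING
LANGUAGE SQL
AS
$$
DECLARE
  deleted_rows NUMBER DEFAULT 0;
BEGIN
  -- Remove rows older than the supplied retention window.
  DELETE FROM curated_sales
   WHERE loaded_at < DATEADD(day, -:retention_days, CURRENT_TIMESTAMP());
  deleted_rows := SQLROWCOUNT;
  RETURN 'Deleted ' || deleted_rows || ' rows older than ' || retention_days || ' days';
END;
$$;

CALL purge_stale_rows(90);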
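To illustrate the Snowflake RBAC model referenced above, a minimal sketch of a functional role with least-privilege grants follows. All role, warehouse, database, schema, and user names are placeholders.

-- Hypothetical functional role with least-privilege access.
CREATE ROLE IF NOT EXISTS analyst_role;
GRANT USAGE ON WAREHOUSE reporting_wh       TO ROLE analyst_role;
GRANT USAGE ON DATABASE analytics_db        TO ROLE analyst_role;
GRANT USAGE ON SCHEMA analytics_db.curated  TO ROLE analyst_role;
GRANT SELECT ON ALL TABLES    IN SCHEMA analytics_db.curated TO ROLE analyst_role;
GRANT SELECT ON FUTURE TABLES IN SCHEMA analytics_db.curated TO ROLE analyst_role;
-- Roll the functional role up to SYSADMIN and assign it to a user.
GRANT ROLE analyst_role TO ROLE SYSADMIN;
GRANT ROLE analyst_role TO USER jane_doe;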