This role involves building, managing and optimizing data pipelines, views and models, whilst working with our teams to move these (and additional models) effectively into production for key data and analytics consumers. The role also offers the opportunity to get involved in end visualisation requirements, working closely with clients to help them maximise value from their data.
Skills & Experience | Essential
Experienced in ETL/ELT techniques for integrating data into Data Warehouse solutions.
Solid understanding of Relational Databases and Data Warehouse methodology, including integration tooling such as Azure Data Factory.
Knowledge of various architectures and methodologies, such as metadata management, performance management and handling of data quality issues.
Ability to create and manage data environments in Snowflake.
Ability to manage and monitor the data integration process.
Experience working in a cloud architecture with data lakes.
Excellent SQL skills.
Ability to develop and operate efficient data pipelines which are scalable and reliable.
Tooling & skills
Working experience with Snowflake and Azure.
Experience with C#, SQL and Python.
Understanding of the Kimball dimensional model is desirable.
Snowflake SnowPro certification is desirable.
Snowflake
We are keen on implementing best-practice solutions and staying at the forefront of new releases so that our customers benefit from the latest capabilities. Snowflake ships significant updates every few months, so it is important that our data engineering team keeps pace.
Azure
Azure Data Factory - for big data ingest
Azure Functions (C#) - for dynamically scaling, high-velocity workloads: middleware, ingest or APIs