Translating architecture designs into technical implementations.
Building dynamic, metadata-driven data ingestion patterns using Azure Data Factory and Databricks (a minimal sketch follows this list).
Building and maintaining the Enterprise Data Warehouse (using Kimball methodology).
Building and maintaining business-focused data products and data marts.
Building and maintaining Azure Analysis Services databases and cubes.
Ad hoc data analysis and data wrangling using Azure Synapse Analytics and Databricks.
Implementing and delivering across all stages of the data engineering lifecycle using Investec Data Experience (IDX) templates on the One Data Platform (ODP).
Playing a supporting and mentoring role for junior data engineers.
Developing, implementing, and maintaining relevant documentation, guidelines, checklists, and policies to promote continuous integration, ensure and improve data security, and reduce the possibility of human error.
Sharing support and operational duties within the team.
Working closely with end-users to understand their business and their data requirements.
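To picture the metadata-driven ingestion responsibility above, here is a minimal, hypothetical sketch only. It assumes a control table named config.ingestion_control with source_path, file_format, load_mode, and target_table columns; none of these names come from the role itself. In practice Azure Data Factory would typically drive the loop (e.g. a Lookup plus ForEach) and invoke a Databricks notebook per source.

```python
# Minimal, hypothetical sketch of a metadata-driven ingestion loop.
# Table and column names (config.ingestion_control, source_path, file_format,
# load_mode, target_table) are illustrative assumptions, not part of the role spec.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("metadata-driven-ingestion").getOrCreate()

# One row per source to ingest, maintained as configuration rather than code.
control_rows = spark.table("config.ingestion_control").collect()

for row in control_rows:
    # Read the raw source as described by the metadata (e.g. a CSV in ADLS Gen2).
    source_df = (
        spark.read.format(row["file_format"])   # "csv", "parquet", "json", ...
        .option("header", "true")
        .load(row["source_path"])               # e.g. an abfss:// path
    )

    # Land the data in the lakehouse table named by the metadata.
    (
        source_df.write.format("delta")
        .mode(row["load_mode"])                 # "append" or "overwrite"
        .saveAsTable(row["target_table"])
    )
```

The point of the pattern is that onboarding a new source becomes a new row in the control table rather than a new pipeline.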
Requirements & Skills:
Knowledge of distributed data mesh deployment methodology.
6+ years of experience as a data engineer.
Experience in building robust and performant ETL processes.
Excellent data analysis and exploration skills using T-SQL.
Demonstrated expertise in developing and optimising large-scale data processing applications using Apache Spark. Proficient in writing complex Spark jobs to process and analyse vast datasets efficiently (an illustrative sketch follows this list).
Skilled in leveraging Pandas for data manipulation and analysis tasks. Capable of transforming raw data into actionable insights by utilising Pandas' powerful data structures and functions.
Extensive experience with SQL Server and SSIS.
Knowledge and experience of data warehouse modelling methodologies (Data Vault 2.0, Kimball).
Experience in one or more of the following Azure services: Azure Data Factory, Databricks, Azure Synapse Analytics, ADLS Gen2.
Understanding of and experience with Azure Bicep templates for automating the deployment of pipelines.
Proficiency in Python and SQL (stored procedures, functions).
Understanding of systems development and project management approaches.
Ability to build and maintain Analysis Services databases and cubes (both multidimensional and tabular).
Experience in using source control, preferably Git.
Basic knowledge of Infrastructure as Code (IaC): scripting and automation via Azure CLI, PowerShell, and Bash, and deployments via JSON ARM and Bicep templates.
Understanding of and experience with DevOps deployment pipelines.
Effective communication and collaboration skills to work in cross-functional teams, participate in code reviews, and communicate technical concepts to non-technical stakeholders.
Willingness to learn new technologies, keep up with industry trends, and adapt to evolving development practices and tools.
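For the Spark and Pandas expertise listed above, the following is an indicative (not prescriptive) sketch of how the two tools typically split the work; table and column names are purely hypothetical.

```python
# Illustrative only: aggregate a large dataset with Spark, then analyse the small
# result locally with Pandas. Table and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("spark-pandas-example").getOrCreate()

# Heavy lifting in Spark: aggregate transactions per customer per month.
monthly = (
    spark.table("dw.fact_transactions")          # hypothetical fact table
    .withColumn("month", F.date_trunc("month", F.col("transaction_date")))
    .groupBy("customer_id", "month")
    .agg(
        F.sum("amount").alias("total_amount"),
        F.count("*").alias("transaction_count"),
    )
)

# The aggregated result is small enough to pull into Pandas for ad hoc analysis.
monthly_pd = monthly.toPandas()

# Pandas: pivot to customers x months and inspect month-on-month change.
pivot = monthly_pd.pivot_table(
    index="customer_id", columns="month", values="total_amount", aggfunc="sum"
)
mom_change = pivot.pct_change(axis="columns")
print(mom_change.describe())
```

The design choice being illustrated: keep the heavy aggregation in Spark and bring only the reduced result into Pandas, which keeps driver memory use bounded while still allowing fast, flexible exploration.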