We are looking for a passionate and experienced Data Engineer to work with our other 70+ software, data, and DevOps engineers to guide and assist our clients' data modernization journeys.
This is a contract opportunity to work with a team that supports companies with ambitious missions - clients who are creating new, innovative products, often in uncharted markets. We work as embedded members and leaders of our clients' development and data teams. We bring experienced senior engineers, leading-edge technologies and mindsets, and creative thinking. We show our clients how to move to modern frameworks for data infrastructure and processing, and we help them reach their full potential with the power of data.
In this role, you'll be the primary day-to-day point of contact for our clients as you modernize their data infrastructure, architecture, and pipelines.
You'll be responsible for:
Consulting with clients on cloud-first strategies for core bet-the-company data initiatives.
Providing thought leadership on both process and technical matters.
Becoming a real champion and trusted advisor to our clients on Data & Analytics Platforming and Engineering.
Designing, developing, deploying, and supporting the data modernization and transformation of our clients' end-to-end data strategy, including infrastructure, collection, transmission, processing, and analytics.
Mentoring and educating client teams to keep them up to speed with the latest approaches, tools, and skills, and setting them up for continued success post-delivery.
What we're looking for:
Experience with Microsoft Azure data engineering services, such as Azure Data Factory, Databricks, Data Lake, and Synapse.
A strong understanding of, and some delivery experience with, Data Mesh, Delta Lake, and Lakehouse architecture is a plus.
Understanding of security and corporate governance issues related to cloud-first data architecture, as well as accepted industry solutions.
Experience in enabling continuous delivery for development teams using scripted cloud provisioning and automated tooling.
Strong knowledge of Kafka and Spark and extensive experience using them in a production environment.
Strong knowledge of Confluent Platform and Confluent Cloud.
Strong knowledge of Scala or Python.
Strong knowledge of securing Azure platforms using RBAC, Key Vault, and Azure Security Center.
Strong Infrastructure as Code (IaC) experience using Terraform, PowerShell, Bash, and ARM templates.
Experience working with Agile development teams.
Sound business judgment and demonstrated leadership.
Willingness to learn and grow in the Data & AI space.
Nice to have:
Hands-on experience with machine learning, data science, and AI frameworks such as TensorFlow and PyTorch.
Hands-on experience applying generative AI, vector databases, and retrieval-augmented generation (RAG) in creative and innovative ways.
Understanding of Data Governance and Data Quality practices and tools.
Hands-on experience with Data Governance or data cataloging tools such as Purview, Collibra, Unity Catalog, or Atlan.
Employment Type: Full Time, Permanent
Functional Areas: Analytics & Business Intelligence