Experion Technologies
GCP Data Engineer
Job Description
Role - Principal Data Engineer (GCP/AWS/Azure)
Total Experience - 9-15 years
Work Location - Remote
Relevant Experience - 7+ years
Expected date of onboarding - Immediate to 45 days
Job Purpose
We are seeking a dynamic and highly skilled Principal Data Engineer or Architect with extensive experience building enterprise-scale data platforms to lead these foundational efforts. This role demands someone who not only possesses a profound understanding of the data engineering landscape but is also at the forefront of the field. The ideal candidate will contribute significantly to platform development with a diverse skill set while remaining very hands-on with coding and actively shaping the future of our data ecosystem.
Job Description / Duties & Responsibilities
As a Principal Engineer or Architect, you will be responsible for the ideation, architecture, design, and development of a new enterprise data platform. You will collaborate with other cloud and security architects to ensure seamless alignment with our overarching technology strategy.
Architect and design core components using a microservices architecture, abstracting platform and infrastructure intricacies.
Create and maintain essential data platform SDKs and libraries, adhering to industry best practices.
Design and develop connector frameworks and modern connectors to source data from disparate systems, both on-premises and in the cloud.
Design and optimize data storage, processing, and querying performance for large-scale datasets using industry best practices while keeping costs in check.
Architect and design security patterns and best practices.
Design and develop data quality frameworks and processes to ensure the accuracy and reliability of data.
Collaborate with data scientists, analysts, and cross-functional teams to design data models, database schemas, and data storage solutions.
Design and develop advanced analytics and machine learning capabilities on the data platform.
Design and develop observability and data governance frameworks and practices.
Stay up to date with the latest data engineering trends, technologies, and best practices.
Drive the deployment and release cycles, ensuring a robust and scalable platform.
Job Specification / Skills and Competencies
7+ years of proven experience in modern cloud data engineering, broad exposure to the wider data landscape, and solid software engineering experience.
Prior experience architecting and building successful enterprise-scale data platforms in a greenfield environment is a must.
Proficiency in building end-to-end data platforms and data services in cloud platforms (AWS, Azure, or GCP):
AWS: Redshift, S3, Data Pipeline, Glue, EMR, Athena, plus Python and SQL.
Azure: Azure Data Factory (ADF), Azure Databricks, Synapse Analytics, Data Lake Storage, plus Python and SQL.
GCP: BigQuery, Cloud Functions, Cloud Run, Dataform, Dataflow, Dataproc, Airflow, Pub/Sub, plus Python and SQL.
Experience with microservices architectures - Kubernetes, Docker, and Cloud Run.
Experience building semantic layers.
Proficiency in architecting, designing, and developing batch and real-time streaming infrastructure and workloads.
Solid experience architecting and implementing metadata management, including data catalogues, data lineage, data quality, and data observability for big data workflows.
Hands-on experience with Cloud Ecosystem (AWS, Azure, GCP) and data lakehouse architectures.
Strong understanding of data modeling, data architecture, and data governance principles.
Strong experience with DataOps principles and test automation.
Strong experience with observability tooling such as Grafana and Datadog.
Nice to have:
Experience with Data Mesh architecture.
Experience building Semantic layers for data platforms.
Experience building scalable IoT architectures.
Employment Type: Full Time, Permanent