Cloud Data Engineer - Python/BigQuery (5-8 yrs)
Sun Consultants
Job Role: Cloud Data Engineer.
Notice Period: Immediate to 20 days.
Location: Bengaluru, Karnataka.
Responsibilities:
- Lead data management and engineering projects within GCP environments, including migrating, analyzing, and managing data structures, as well as designing, developing, testing, and implementing GCP data solutions.
- Guide customers on all aspects of data handling on GCP, including ingestion, storage, processing, analysis, and visualization.
- Collaborate with technical and business leads to translate business requirements into effective GCP solutions and implementations.
- Partner with data scientists, software engineers, and other stakeholders to support data acquisition, solution design, implementation, and ongoing maintenance.
- Design and implement data pipelines for batch, micro-batch, and real-time data streams (a brief illustrative sketch follows this list).
- Ensure data quality by implementing data quality rules and test cases.
- Work closely with the GCP Security Team to ensure datastores are protected.
- Liaise with clients to gather requirements and ensure successful project completion.
- Conduct client workshops to identify data sources, flows, and requirements, and to shape clients' future data strategy through future-state architectures, roadmaps, and implementation plans.
- Develop logical data models and schemas for various business domains and industries.
- Demonstrate expertise in data modeling, database design, cloud architecture, and data infrastructure management.
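For illustration only, here is a minimal sketch of the kind of batch pipeline step described above: loading a CSV extract from Cloud Storage into BigQuery and enforcing a simple data quality rule with the google-cloud-bigquery Python client. The project, bucket, table, and column names are hypothetical placeholders, not details from this posting.

    from google.cloud import bigquery

    # Hypothetical identifiers used only for illustration.
    PROJECT_ID = "example-project"
    SOURCE_URI = "gs://example-bucket/orders/orders.csv"
    TABLE_ID = "example-project.sales.orders"

    client = bigquery.Client(project=PROJECT_ID)

    # Batch ingestion: load the CSV extract from Cloud Storage into BigQuery.
    load_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,
        autodetect=True,
        write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
    )
    client.load_table_from_uri(SOURCE_URI, TABLE_ID, job_config=load_config).result()

    # Data quality rule: the business key must never be null.
    dq_sql = f"""
        SELECT COUNTIF(customer_id IS NULL) AS null_keys, COUNT(*) AS total_rows
        FROM `{TABLE_ID}`
    """
    row = next(iter(client.query(dq_sql).result()))
    if row.null_keys > 0:
        raise ValueError(f"DQ rule failed: {row.null_keys} of {row.total_rows} rows have a null customer_id")
    print(f"DQ passed: {row.total_rows} rows loaded with no null customer_id values")

In a production pipeline this logic would typically run inside an orchestrator such as Cloud Composer or a Dataflow job rather than a standalone script.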
Qualifications:
- 5+ years of experience in data engineering, cloud architecture, or data infrastructure.
- 3+ years of experience working with GCP or similar cloud platforms (Azure, AWS).
- Experience with GCP managed data services (Cloud Bigtable, Cloud Spanner, Cloud SQL, BigQuery).
- Experience with relational databases, legacy database/application migration, and cloud-based databases (BigQuery, Snowflake).
- Proven ability to develop logical data models, ETL/ELT processes, and related documentation.
- Mastery of data modeling concepts, large-scale database implementations, and design patterns.
- Experience working with structured, semi-structured, and unstructured data sources.
- Strong analytical, problem-solving, and troubleshooting skills.
- Excellent written and oral communication skills.
- Hands-on experience with Python and Git (a short illustrative example follows this list).
- Experience with Infrastructure as Code tools (Terraform, Ansible, etc.).
- Experience deploying a data governance program is a plus.
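As a small illustration of the hands-on Python/BigQuery expectation above, the sketch below runs a parameterised BigQuery query with the official Python client. The project, dataset, and column names are assumptions made for the example only.

    import datetime
    from google.cloud import bigquery

    # Hypothetical project and table names, for illustration only.
    client = bigquery.Client(project="example-project")

    sql = """
        SELECT customer_id, COUNT(*) AS order_count
        FROM `example-project.sales.orders`
        WHERE order_date >= @cutoff
        GROUP BY customer_id
        ORDER BY order_count DESC
        LIMIT 10
    """
    job_config = bigquery.QueryJobConfig(
        query_parameters=[
            bigquery.ScalarQueryParameter("cutoff", "DATE", datetime.date(2024, 1, 1))
        ]
    )

    # Print the ten most active customers since the cutoff date.
    for row in client.query(sql, job_config=job_config).result():
        print(f"{row.customer_id}: {row.order_count} orders")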
Functional Areas: Other