Data Engineer - SQL/ETL/Google Cloud Platform (4-10 yrs)
SM IT Services
Posted 2 months ago
Company: MNC
Location: Chennai
Experience: 4-10 yrs
Position Overview :
We are seeking a highly skilled GCP Data Engineer to join our team. The ideal candidate will have a strong background in data engineering, a deep understanding of Google Cloud Platform (GCP) services, and hands-on experience with SQL, Python, BigQuery, Dataflow/Airflow, and Dataproc. As a GCP Data Engineer, you will be responsible for designing, building, and maintaining scalable data pipelines and ensuring the efficient processing and storage of large datasets.
Key Responsibilities :
- Design, develop, and maintain scalable and robust data pipelines using GCP services.
- Work with BigQuery to design and optimize data warehouses and analytical solutions.
- Develop data processing workflows using Dataflow and/or Apache Beam.
- Implement and manage ETL processes and data orchestration with Airflow.
- Leverage Dataproc for big data processing tasks using Hadoop, Spark, and other technologies.
- Write efficient SQL queries for data extraction, transformation, and loading (ETL).
- Utilize Python for data manipulation, automation, and integration tasks.
- Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver solutions.
- Ensure data quality, integrity, and security across all data platforms.
- Monitor, troubleshoot, and optimize data processing workflows for performance and reliability.
- Stay up-to-date with the latest trends and best practices in data engineering and GCP services.
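The pipeline responsibilities above follow the classic extract-transform-load pattern. As a minimal sketch of that pattern in plain Python (no GCP dependencies; the records, field names, and in-memory "warehouse" are illustrative stand-ins, not part of this posting):

```python
# Minimal extract-transform-load sketch: pull raw records, clean and
# aggregate them, then load the result into a target store (here a dict
# standing in for a warehouse table such as one in BigQuery).

def extract():
    # Stand-in for reading from a source system (e.g. Cloud Storage, Pub/Sub).
    return [
        {"user": "a", "amount": "10.5"},
        {"user": "b", "amount": "4.0"},
        {"user": "a", "amount": "2.5"},
    ]

def transform(rows):
    # Cast string amounts to floats and aggregate per user,
    # as an analytical SQL query might.
    totals = {}
    for row in rows:
        totals[row["user"]] = totals.get(row["user"], 0.0) + float(row["amount"])
    return totals

def load(totals, target):
    # Stand-in for writing results to the destination table.
    target.update(totals)

warehouse = {}
load(transform(extract()), warehouse)
print(warehouse)  # {'a': 13.0, 'b': 4.0}
```

In a production GCP pipeline, each of these stages would typically map to a managed service: extraction from Cloud Storage or Pub/Sub, transformation in Dataflow or Dataproc, and loading into BigQuery, with Airflow orchestrating the stages.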
Qualifications :
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Proven experience as a Data Engineer with a focus on GCP.
- Proficiency in SQL and experience with relational databases.
- Strong programming skills in Python.
- Hands-on experience with BigQuery for data warehousing and analytics.
- Experience with data processing frameworks such as Dataflow and/or Apache Beam.
- Knowledge of Airflow for data pipeline orchestration.
- Familiarity with Dataproc and big data technologies like Hadoop and Spark.
- Understanding of data modeling, ETL processes, and data warehousing concepts.
- Strong problem-solving skills and attention to detail.
- Excellent communication and collaboration skills.
Preferred Skills :
- Experience with other GCP services such as Cloud Storage, Pub/Sub, and Cloud Functions.
- Knowledge of data security and compliance best practices.
- Familiarity with machine learning workflows and tools on GCP.
- Experience with version control systems like Git.
Functional Areas: Software/Testing/Networking