Google Cloud Platform Data Engineer (4-12 yrs)
Skilltasy
Posted 3 days ago
Responsibilities :
- Design, build, and maintain efficient and reliable data pipelines using GCP services such as Dataflow, DataProc, Data Fusion, and Dataform, or tools such as dbt.
- Implement ETL/ELT processes to ingest, transform, and load data from various sources into BigQuery and other data stores.
- Optimize data pipelines for performance, scalability, and cost-effectiveness.
- Develop and optimize complex SQL queries and stored procedures in BigQuery.
- Design and implement data models and schemas in BigQuery for analytical and reporting purposes.
- Troubleshoot and resolve BigQuery performance issues.
- Utilize Google Cloud Storage (GCS) for data storage and retrieval.
- Orchestrate data workflows using Composer (Airflow) for scheduling and monitoring (a minimal DAG sketch follows this list).
- Leverage DataProc for large-scale data processing using Spark and Hadoop.
- Write clean, efficient, and well-documented code in Python, PySpark, or Java.
- Develop scripts for automating data processing and infrastructure management.
- Implement data quality checks and validation processes.
- Ensure data security and compliance with relevant regulations.
- Adhere to data engineering best practices and coding standards.
- Work closely with data scientists, analysts, and other stakeholders to understand data requirements.
- Communicate effectively with technical and non-technical audiences.
- Participate in code reviews and knowledge sharing sessions.
- Contribute to agile development processes and participate in sprint planning, stand-ups, and retrospectives.
- Diagnose and resolve data-related issues.
- Provide support for production data pipelines.
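To illustrate the orchestration and ETL/ELT work described above, here is a minimal sketch of a Composer (Airflow) DAG that loads files from GCS into a BigQuery staging table and then runs a transformation query. It is illustrative only and not part of the role description; all project, bucket, dataset, and table names are hypothetical placeholders.

```python
# Illustrative sketch only; all project, bucket, dataset, and table names
# below are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="daily_sales_pipeline",        # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Ingest raw CSV files from GCS into a BigQuery staging table.
    load_raw = GCSToBigQueryOperator(
        task_id="load_raw_to_staging",
        bucket="example-raw-bucket",                          # hypothetical bucket
        source_objects=["sales/{{ ds }}/*.csv"],
        destination_project_dataset_table="analytics.stg_sales",
        source_format="CSV",
        skip_leading_rows=1,
        autodetect=True,
        write_disposition="WRITE_TRUNCATE",
    )

    # Transform the staged data into a reporting table with standard SQL.
    transform = BigQueryInsertJobOperator(
        task_id="transform_to_reporting",
        configuration={
            "query": {
                "query": (
                    "SELECT order_date, SUM(amount) AS total_amount "
                    "FROM analytics.stg_sales GROUP BY order_date"
                ),
                "destinationTable": {
                    "projectId": "example-project",           # hypothetical project
                    "datasetId": "analytics",
                    "tableId": "daily_sales",
                },
                "writeDisposition": "WRITE_TRUNCATE",
                "useLegacySql": False,
            }
        },
    )

    load_raw >> transform
```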
Required Skills and Experience :
- 4 to 12 years of experience in data engineering, with at least 2 years focused on GCP.
- Strong proficiency in BigQuery, Dataflow, and DataProc (an illustrative PySpark-on-DataProc sketch follows this list).
- Excellent programming skills in Python, PySpark, or Java.
- Strong SQL and BigQuery SQL skills.
- Hands-on experience with Google Cloud Storage (GCS).
- Experience with Composer (Airflow) for workflow orchestration.
- Experience with Data Fusion, dbt, or Dataform.
- Solid understanding of ETL/ELT concepts and data warehousing principles.
- Experience in building and maintaining data pipelines.
- Experience handling large datasets.
- Experience working in Agile development environments.
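As a rough illustration of the DataProc, PySpark, GCS, and BigQuery skills listed above, the following is a minimal sketch of a PySpark job of the kind typically submitted to a DataProc cluster: it reads raw CSV data from GCS, applies a simple data-quality filter, aggregates, and writes the result to BigQuery via the spark-bigquery connector. All paths and table names are hypothetical, and the connector is assumed to be available on the cluster.

```python
# Illustrative sketch only; all paths and table names are hypothetical.
# Assumes the spark-bigquery connector is available on the DataProc cluster.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("sales-aggregation").getOrCreate()

# Read raw CSV data from a GCS bucket (hypothetical path).
raw = spark.read.option("header", True).csv("gs://example-raw-bucket/sales/")

# Simple data-quality filter and daily aggregation.
daily = (
    raw.filter(F.col("amount").isNotNull())
       .groupBy("order_date")
       .agg(F.sum(F.col("amount").cast("double")).alias("total_amount"))
)

# Write the result to BigQuery; the connector stages data through a temporary GCS bucket.
(
    daily.write.format("bigquery")
         .option("table", "analytics.daily_sales")             # hypothetical table
         .option("temporaryGcsBucket", "example-temp-bucket")  # hypothetical bucket
         .mode("overwrite")
         .save()
)
```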
Preferred Qualifications :
- Google Cloud Professional Data Engineer certification.
Functional Areas: Other