Data Engineer - Google Cloud Platform (5-8 yrs)
Brainwork TechnoSolutions
Job Summary :
We are seeking a highly skilled and experienced GCP Data Engineer to join our team.
The ideal candidate will have a strong background in Google Cloud Platform (GCP), particularly with BigQuery, and proficiency in Python and SQL.
You will be responsible for designing, building, and maintaining data pipelines, data marts, and business-critical reports, as well as contributing to our data governance initiatives.
This role requires a proactive problem-solver with a passion for data technologies and a proven track record of delivering high-quality solutions.
Responsibilities :
Data Pipeline Development & Implementation :
- Design, develop, and implement robust and scalable data pipelines using Python, SQL, BigQuery, and Airflow (or similar orchestration tools).
- Build and optimize data pipelines for efficient data ingestion, transformation, and loading.
- Automate data workflows and ensure data quality and reliability.
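As a rough illustration of the pipeline work described above, the sketch below shows an extract-transform-load step in plain Python with simple data-quality rules (null rejection, deduplication). It uses only the standard library as a stand-in for the real GCP stack, and the field names (`user_id`, `amount`) are invented for illustration, not taken from the posting:

```python
import csv
import io

def extract(raw_csv: str) -> list[dict]:
    """Parse raw CSV text into row dicts (stand-in for a Cloud Storage read)."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(rows: list[dict]) -> list[dict]:
    """Data-quality rules: drop rows with a missing user_id, deduplicate on user_id."""
    seen: set[str] = set()
    clean: list[dict] = []
    for row in rows:
        uid = row.get("user_id", "").strip()
        if not uid or uid in seen:
            continue  # reject null keys and duplicate records
        seen.add(uid)
        clean.append({"user_id": uid, "amount": float(row["amount"])})
    return clean

def load(rows: list[dict]) -> int:
    """Stand-in for a BigQuery load job; here it just reports the row count."""
    return len(rows)

raw = "user_id,amount\n1,10.5\n1,10.5\n,3.0\n2,7.25\n"
print(load(transform(extract(raw))))  # prints 2 (one duplicate and one null row dropped)
```

In a real deployment each function would become an Airflow task (Cloud Composer), with the same quality checks enforced between ingestion and the BigQuery load.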
Data Warehousing & Data Marts :
- Design and build data marts to support business intelligence and reporting needs.
- Implement data warehousing best practices and ensure data integrity.
- Optimize data models and schemas for performance and scalability.
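A common modeling pattern behind the data-mart work above is the star schema: splitting flat records into a fact table plus dimension tables with surrogate keys. A toy standard-library sketch (the `customer`/`amount` fields are illustrative assumptions, not from the posting):

```python
def build_star_schema(records: list[dict]) -> tuple[dict, list[dict]]:
    """Split flat records into a customer dimension (natural key -> surrogate key)
    and a fact table that references the dimension by surrogate key."""
    dim_customer: dict[str, int] = {}
    fact_sales: list[dict] = []
    for rec in records:
        name = rec["customer"]
        if name not in dim_customer:
            dim_customer[name] = len(dim_customer) + 1  # assign next surrogate key
        fact_sales.append({"customer_id": dim_customer[name],
                           "amount": rec["amount"]})
    return dim_customer, fact_sales

flat = [{"customer": "Acme", "amount": 100},
        {"customer": "Globex", "amount": 50},
        {"customer": "Acme", "amount": 75}]
dims, facts = build_star_schema(flat)
print(len(dims), len(facts))  # prints 2 3 -- two dimension rows, three fact rows
```

Keeping dimensions narrow and facts keyed by small integers is what lets a warehouse such as BigQuery scan and join these tables efficiently.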
Reporting & Analytics :
- Build various business-critical reports to provide insights to stakeholders.
- Develop and maintain data visualizations and dashboards.
- Collaborate with business stakeholders to understand reporting requirements and deliver actionable insights.
Data Governance :
- Implement and enforce data governance policies and procedures.
- Ensure data security and compliance with relevant regulations.
- Manage data quality and metadata.
GCP Infrastructure & Migration :
- Utilize GCP services effectively, particularly BigQuery, for data storage and processing.
- Participate in data migration projects, ensuring smooth transitions and data integrity.
- Optimize GCP resources for cost efficiency and performance.
Collaboration & Communication :
- Collaborate with various business stakeholders to understand data requirements and deliver solutions.
- Communicate effectively with team members and stakeholders, providing clear and concise updates.
- Work in a fast-paced environment and adapt to changing priorities.
Required Skills :
- BigQuery (Mandatory)
- Cloud Storage
- Cloud Composer (Airflow) or similar orchestration tools (Preferred)
- Python (Mandatory)
- SQL (Mandatory)
- Data Warehousing Concepts
- ETL/ELT Processes
- Data Modeling
- Data Quality Management
- Metadata Management
Other :
- Experience with migration projects (Plus)
- Knowledge of data visualization tools (e.g., Looker, Tableau) (Plus)
- Understanding of DevOps practices (Plus)
Qualifications :
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 5-8 years of experience as a Data Engineer.
- Proven experience with GCP services, especially BigQuery.
- Strong proficiency in Python and SQL.
- Experience with data warehousing and ETL/ELT processes.
- Experience with data pipeline orchestration tools (Airflow or similar).
- Knowledge of data governance principles and practices.
- Excellent problem-solving and analytical skills.
- Strong communication and collaboration skills.
- Ability to work independently in a remote environment.
Functional Areas: Software/Testing/Networking