Data Warehouse Engineer - BigQuery (7-12 yrs)
Talent Pipeline
Posted 2 months ago
Flexible timing
Job Description:
Responsibilities:
- Architect, design, and maintain a scalable data warehouse using BigQuery to support business intelligence and analytics needs.
- Develop and optimize ETL/ELT pipelines using dbt to ensure accurate and timely data flow across systems (see the brief sketch after this list).
- Integrate Salesforce and Marketing Cloud data into the data warehouse to provide actionable insights.
- Design, develop, and maintain dashboards and reports using Google Looker Studio to visualize key business metrics.
- Collaborate with cross-functional teams to gather requirements and deliver data solutions that meet organizational needs.
- Ensure data quality, integrity, and security across all systems and processes.
- Proactively monitor, troubleshoot, and enhance the performance of the data warehouse and associated pipelines.
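As an illustration of the dbt-on-BigQuery work described above, a minimal incremental dbt model might look like the following sketch. The source, table, and column names (raw_salesforce, opportunities, close_date, amount, stage_name) are assumptions made for the example, not details from this posting; close_date is assumed to be a timestamp, and the model presumes a matching source definition in the project's YAML.

-- models/marts/fct_daily_revenue.sql
-- Illustrative incremental dbt model materialized as a date-partitioned BigQuery table.
{{ config(
    materialized='incremental',
    unique_key='revenue_date',
    partition_by={'field': 'revenue_date', 'data_type': 'date'}
) }}

select
    date(close_date)  as revenue_date,
    count(*)          as closed_opportunities,
    sum(amount)       as total_revenue
from {{ source('raw_salesforce', 'opportunities') }}
where stage_name = 'Closed Won'
{% if is_incremental() %}
  -- on incremental runs, only reprocess the most recent days of data
  and date(close_date) >= date_sub(current_date(), interval 3 day)
{% endif %}
group by 1

Running dbt run --select fct_daily_revenue would build or merge this table in BigQuery; this is the kind of pipeline step the role owns end to end.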
Requirements:
- Proven experience designing and implementing data warehouses on Google BigQuery.
- Expertise in building and maintaining ETL/ELT pipelines using dbt.
- Proficiency with Google Looker Studio for data visualization.
- Strong experience with Salesforce ETL and Marketing Cloud integration.
- Solid understanding of SQL and data modeling best practices.
- Excellent problem-solving and analytical skills.
- Strong communication skills to effectively collaborate with technical and non-technical stakeholders.
Preferred Qualifications:
- Experience with cloud platforms like Google Cloud Platform (GCP).
- Familiarity with Python or other scripting languages for data manipulation.
- Knowledge of data governance and compliance standards.
- Experience in a healthcare or medical services environment.
Functional Areas: Other