Kipi.ai - Senior Data Specialist - Data Warehousing (5-6 yrs)
Flexible timing
Position Overview :
We are looking for an experienced Senior Data Specialist to join our team.
The ideal candidate will have hands-on experience in modern data engineering practices, particularly with Snowflake, DBT Core, and various data engineering tools. You will play a crucial role in designing, building, and maintaining scalable data pipelines to enable data analytics, reporting, and business intelligence.
The role requires expertise in data integration, automation, and workflow optimization for performance and scalability.
Key Responsibilities :
- Design, build, and optimize data models and pipelines using Snowflake, DBT Core, and SQL.
- Develop scalable and efficient ETL/ELT processes to support large-scale data ingestion, transformation, and integration.
- Implement, maintain, and optimize data pipelines using Airflow for scheduling and orchestration (see the Airflow/dbt sketch after this list).
- Integrate CI/CD processes for data pipeline automation and continuous deployment, ensuring high-quality, error-free pipeline deployment (leveraging DataOps practices).
- Use Terraform to define and manage Infrastructure as Code (IaC), ensuring seamless provisioning and management of data engineering resources on cloud platforms.
- Work with AWS services, including Lambda, to support serverless computing and scalable, event-driven solutions (a minimal Lambda sketch follows this list).
- Design and implement REST APIs and services to integrate various data sources and systems efficiently (see the REST API sketch after this list).
- Collaborate with external teams for seamless data integration across systems, ensuring consistency and reliability in data flows.
- Work with cloud platforms like AWS, Azure, or Google Cloud Platform to design scalable data solutions.
- Ensure effective integration and data processing using Kafka for real-time data streaming and messaging (see the Kafka producer sketch after this list).
- Develop and manage ETL workflows using tools such as Fivetran to simplify data pipeline management and reduce manual intervention.
- Automate the extraction, transformation, and loading of data into various systems, ensuring high-quality data for analytics and reporting.
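
As a rough illustration of the Airflow orchestration and DBT Core work described above, below is a minimal DAG sketch that runs a placeholder ingestion task and then a dbt Core build. The DAG name, schedule, and dbt project paths are illustrative assumptions, not details from this posting.

```python
# Minimal Airflow DAG sketch: orchestrate ingestion, then a dbt Core run.
# The dbt project path, schedule, and task names are assumptions.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator


def ingest_raw_data(**context):
    # Placeholder for an extraction step (e.g. pulling files into a raw stage).
    print("Ingesting raw data for run", context["ds"])


with DAG(
    dag_id="daily_elt_pipeline",          # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
    tags=["elt", "dbt", "snowflake"],
) as dag:
    ingest = PythonOperator(
        task_id="ingest_raw_data",
        python_callable=ingest_raw_data,
    )

    dbt_run = BashOperator(
        task_id="dbt_run",
        # Assumes dbt Core is installed on the worker and a profile is configured.
        bash_command="dbt run --project-dir /opt/dbt/analytics --profiles-dir /opt/dbt",
    )

    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt/analytics --profiles-dir /opt/dbt",
    )

    ingest >> dbt_run >> dbt_test
```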
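
For the serverless, event-driven AWS Lambda work, here is a minimal handler sketch that reacts to an S3 object-created event, assuming boto3 and an S3 trigger; bucket names and the downstream action are placeholders.

```python
# Minimal AWS Lambda sketch: react to an S3 "object created" event.
# Bucket names and the downstream action are illustrative assumptions.
import json
import urllib.parse

import boto3

s3 = boto3.client("s3")


def handler(event, context):
    """Entry point configured as the Lambda handler for an S3 trigger."""
    records = event.get("Records", [])
    for record in records:
        bucket = record["s3"]["bucket"]["name"]
        key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])

        # Inspect the newly landed object before handing it downstream.
        head = s3.head_object(Bucket=bucket, Key=key)
        print(f"New object s3://{bucket}/{key} ({head['ContentLength']} bytes)")

        # A real pipeline might enqueue the file for a Snowflake COPY INTO,
        # publish a message to Kafka, or start a downstream workflow here.

    return {"statusCode": 200, "body": json.dumps({"processed": len(records)})}
```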
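
For the REST API responsibility, the sketch below uses FastAPI; the framework choice is an assumption (the posting only asks for REST APIs), and the endpoint path and payload model are hypothetical.

```python
# Minimal REST API sketch using FastAPI (framework choice is an assumption).
# Endpoint path and payload model are placeholders.
from typing import Any, Dict, List

from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="Data Ingestion API")


class IngestRequest(BaseModel):
    source: str
    table: str
    rows: List[Dict[str, Any]]


@app.post("/v1/ingest")
def ingest(payload: IngestRequest) -> dict:
    """Accept a small batch of rows and acknowledge receipt.

    A real implementation would validate the payload and hand it to a
    pipeline (e.g. stage to S3/Snowflake or publish to Kafka).
    """
    return {"source": payload.source, "table": payload.table, "accepted": len(payload.rows)}


# Run locally with: uvicorn app:app --reload   (assuming this file is app.py)
```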
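
For the Kafka streaming responsibility, here is a minimal producer sketch using the confluent-kafka client (a library assumption; the posting only names Kafka). The broker address, topic, and message shape are placeholders.

```python
# Minimal Kafka producer sketch using confluent-kafka (library is an assumption).
# Broker address and topic name are placeholders.
import json

from confluent_kafka import Producer

producer = Producer({"bootstrap.servers": "localhost:9092"})


def delivery_report(err, msg):
    # Called once per message to confirm delivery or surface an error.
    if err is not None:
        print(f"Delivery failed: {err}")
    else:
        print(f"Delivered to {msg.topic()} [{msg.partition()}] @ offset {msg.offset()}")


def publish_event(event: dict, topic: str = "orders.raw") -> None:
    """Serialize an event as JSON and publish it to the given topic."""
    producer.produce(
        topic,
        key=str(event.get("order_id", "")),
        value=json.dumps(event).encode("utf-8"),
        callback=delivery_report,
    )
    producer.poll(0)   # serve delivery callbacks


if __name__ == "__main__":
    publish_event({"order_id": 42, "status": "created"})
    producer.flush()   # block until outstanding messages are delivered
```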
Must-Have Skills :
- Expertise in designing and managing data models and pipelines using Snowflake as a data warehousing solution (see the Snowflake connector sketch after this list).
- Strong experience using DBT Core (Data Build Tool) for data transformations, including building models, running tests, and automating workflows.
- Advanced proficiency in SQL for writing complex queries, data manipulation, and optimization.
- Strong programming skills in Python for writing scripts, developing APIs, and building data engineering solutions.
- Proficiency in using Git for version control and collaboration within data engineering teams.
- Hands-on experience with Apache Airflow for orchestrating and automating data workflows and pipeline management.
- Experience setting up continuous integration and continuous deployment (CI/CD) pipelines for data engineering workflows to ensure code quality and operational efficiency.
- Proficiency in using Terraform for infrastructure automation, enabling Infrastructure as Code (IaC) for scalable and reproducible environments.
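
To illustrate the Snowflake, SQL, and Python combination listed above, below is a minimal sketch that runs a parameterised query through the Snowflake Python connector; the account, credentials, warehouse, and table names are placeholders.

```python
# Minimal sketch: query Snowflake from Python with the official connector.
# Account, credentials, and object names are placeholders.
import os

import snowflake.connector


def fetch_daily_orders(run_date: str):
    """Return aggregated order counts for a given date (illustrative query)."""
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="ANALYTICS_WH",   # hypothetical warehouse
        database="ANALYTICS",
        schema="MARTS",
    )
    try:
        cur = conn.cursor()
        cur.execute(
            """
            SELECT order_status, COUNT(*) AS order_count
            FROM fct_orders
            WHERE order_date = %s
            GROUP BY order_status
            ORDER BY order_count DESC
            """,
            (run_date,),
        )
        return cur.fetchall()
    finally:
        conn.close()


if __name__ == "__main__":
    for status, count in fetch_daily_orders("2024-01-01"):
        print(status, count)
```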
Functional Areas: Software/Testing/Networking