Data Engineer - Python/SQL (3-9 yrs)
Spigot Software
Posted 2 months ago
Flexible timing
Role Description:
The data engineering role involves creating and managing technical solutions on a modern data stack (Snowflake, dbt, Fivetran, Azure): architecting, building, and managing data flows/pipelines and constructing data models within Snowflake for use in downstream processes and analysis.
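As an illustration of the kind of dbt-on-Snowflake modelling work this role involves, a minimal dbt staging model might look like the following sketch; the source, table, and column names are hypothetical and not taken from this posting:

```sql
-- models/staging/stg_orders.sql
-- Minimal dbt staging model: reads a raw table (e.g. one loaded by Fivetran)
-- and materializes a cleaned view in Snowflake for downstream models.
{{ config(materialized='view') }}

select
    order_id,
    customer_id,
    order_date,
    amount
from {{ source('raw_ecommerce', 'orders') }}  -- hypothetical Fivetran-loaded source
where order_id is not null
```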
Role Responsibilities:
- Design, develop, and maintain scalable data pipelines and analytics solutions using dbt, Snowflake, and related technologies
- Collaborate with stakeholders to gather requirements and translate business needs into technical solutions
- Develop efficient code with unit testing and code documentation
- Ensure the accuracy and integrity of data and applications through analysis, coding, documentation, testing, and problem solving
- Optimize and fine-tune data models, SQL queries, and transformations for performance and scalability (see the incremental model sketch after this list)
- Design, develop, and maintain scalable data models and transformations using dbt in conjunction with Snowflake, ensuring effective transformation and loading of data from diverse sources into the data warehouse or data lake
- Integrate Fivetran connectors to streamline data ingestion from various sources into Snowflake
- Develop custom Python scripts and functions to automate data workflows and enhance system capabilities
- Provide technical expertise and support to resolve data-related issues and troubleshoot system failures
- Collaborate with the API development team to integrate data pipelines with external systems and applications
- Contribute to the development of web-based data visualization solutions and dashboards
- Communicate project status to all project stakeholders
- Manage, monitor, and ensure the security and privacy of data to satisfy business needs
- Contribute to the automation of modules, wherever required
- Stay updated on emerging trends and technologies in data engineering, cloud computing, and analytics domains
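As a hedged sketch of the performance-oriented transformation work described above, a dbt incremental model in Snowflake might look like this (model and column names are hypothetical):

```sql
-- models/marts/fct_orders.sql
-- Incremental materialization: on each scheduled run dbt processes only rows
-- that are new since the last run, keeping large Snowflake tables performant.
{{ config(materialized='incremental', unique_key='order_id') }}

select
    order_id,
    customer_id,
    order_date,
    amount
from {{ ref('stg_orders') }}

{% if is_incremental() %}
  -- only pick up rows newer than what already exists in the target table
  where order_date > (select max(order_date) from {{ this }})
{% endif %}
```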
Role Requirements:
- Bachelor's degree in Computer Science, Engineering, or a related field
- At least 5 years of proven experience as a data engineer, ETL developer, or in a similar role, with a focus on dbt and Snowflake
- Strong proficiency in SQL and database concepts, with hands-on experience in the Snowflake data warehouse
- Subject-matter expertise in data warehouse and data lake concepts (dimensional modelling, change data capture, slowly changing dimensions, etc.); a slowly changing dimension sketch follows this list
- Proficiency in programming languages such as Python for data manipulation, automation, and scripting
- Knowledgeable in relational databases, nonrelational databases, data pipelines (ELT/ETL), and file stores
- Knowledgeable in performance tuning and optimization
- Experience with cloud platforms like Azure and tools like Azure OCR for data extraction and processing
- Familiarity with data integration tools like Fivetran
- Knowledge of API development principles and experience integrating data pipelines with external systems
- Proficient in written, verbal, and presentation communication (English)
- Ability to work effectively in a fast-paced, collaborative environment and manage multiple priorities
- Excellent analytical, problem-solving, and communication skills
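For the slowly changing dimension requirement above, here is a minimal sketch using dbt's snapshot feature to maintain a Type 2 dimension in Snowflake; the table and column names are hypothetical:

```sql
-- snapshots/customers_snapshot.sql
-- Type 2 slowly changing dimension via dbt snapshots: dbt adds dbt_valid_from /
-- dbt_valid_to columns and closes out a row whenever a tracked column changes.
{% snapshot customers_snapshot %}

{{
    config(
        target_schema='snapshots',
        unique_key='customer_id',
        strategy='check',
        check_cols=['email', 'address']
    )
}}

select * from {{ source('raw_ecommerce', 'customers') }}  -- hypothetical source

{% endsnapshot %}
```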
Functional Areas: Software/Testing/Networking