Softpath Technologies - Snowflake Developer - Python/ETL (7-9 yrs)
Job Title : Snowflake Developer with Python
Location : Chennai / Bangalore / Pune / Mumbai
Experience : 7+ years (with at least 5 years in Snowflake)
Job Description :
We are seeking a highly skilled and experienced Snowflake Developer with Python to join our team. The ideal candidate will have strong experience in Data Warehousing, ETL processes, and Business Intelligence projects, with proven expertise in Snowflake and Python/PySpark development. You will be responsible for developing and managing data pipelines, writing complex stored procedures, and ensuring optimal performance of data processes across the data warehouse environment.
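For a concrete flavour of the pipeline work described above, here is a minimal sketch using the snowflake-connector-python package. Every connection parameter, stage, and table name below is a placeholder, not a detail from this posting:

import snowflake.connector

# Minimal pipeline step: load a staged file, then build a reporting table.
# All connection parameters and object names are placeholders.
conn = snowflake.connector.connect(
    account="my_account",
    user="etl_user",
    password="***",
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="STAGING",
)
try:
    cur = conn.cursor()
    # Ingest raw data from an external stage, then aggregate for reporting.
    cur.execute("COPY INTO raw_orders FROM @orders_stage FILE_FORMAT = (TYPE = CSV)")
    cur.execute("""
        INSERT INTO reporting.daily_orders
        SELECT order_date, COUNT(*) AS order_count, SUM(amount) AS revenue
        FROM raw_orders
        GROUP BY order_date
    """)
finally:
    conn.close()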
Key Responsibilities :
- Snowflake Development : Design, develop, and optimize data solutions in Snowflake. The candidate must have an in-depth understanding of Snowflake architecture and its features, such as data sharing, zero-copy cloning, and Snowflake's integration with other cloud services (see the cloning sketch after this list).
- ETL and Data Modeling : Lead the ETL development process to efficiently manage and transform data into Snowflake. You will be expected to work on standard Data Warehouse (DWH) concepts, ensuring smooth data flow and transformation between multiple data sources.
- Python / PySpark Development : Utilize Python and PySpark to create automated data processing scripts, build complex data pipelines, and optimize the performance of large datasets (a minimal PySpark sketch follows this list).
- Stored Procedures : Implement and optimize complex stored procedures in Snowflake, focusing on performance and scalability. Ensure efficient execution of data transformations and business logic within the Snowflake environment (see the stored-procedure sketch after this list).
- Oracle and PL/SQL : Leverage your expertise in Oracle databases, writing complex PL/SQL queries and optimizing data processing tasks. You should be able to migrate legacy systems into Snowflake and troubleshoot and optimize them along the way.
- Performance Tuning : Monitor and optimize performance across all systems, from Snowflake to ETL processes, with a focus on improving query performance, reducing resource consumption, and scaling operations.
- Unix Shell Scripting : Write and maintain Unix shell scripts to automate tasks, manage system jobs, and work with data files and log management within the data pipelines.
- DevOps for AWS : (Good to Have) Experience with AWS services such as S3, Lambda, and Glue for data integration and processing. Create and maintain DevOps templates for efficient deployment and management of AWS services (a boto3 sketch follows this list).
- Version Control : Work with GitHub for version control, and collaborate with development teams to ensure effective tracking and management of code changes.
- CI/CD Pipelines : Experience in setting up and managing CI/CD pipelines using Jenkins for automated testing and deployment.
- Communication : Communicate effectively with cross-functional teams, including business analysts, data scientists, and project managers, to ensure alignment on data requirements and project goals.
- Troubleshooting & Support : Provide ongoing troubleshooting and support for all data processes. Identify root causes of performance issues and resolve them quickly.
- Collaboration and Documentation : Collaborate with other teams on data architecture, design decisions, and best practices. Ensure all code is well-documented for ease of future maintenance and enhancements.
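As referenced in the Snowflake Development item above, zero-copy cloning gives you an instant, writable copy of a table, schema, or database without duplicating storage. A minimal sketch, reusing the cursor from the earlier connector snippet; the object names are placeholders:

# Zero-copy clone: a dev sandbox of production data with no data copied.
# Storage is consumed only as the clone diverges from its source.
cur.execute("CREATE OR REPLACE DATABASE analytics_dev CLONE analytics")

Because clones share the source's underlying micro-partitions, creation is near-instant regardless of data volume.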
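For the Python/PySpark item, a minimal transformation sketch; the paths, columns, and filter condition are illustrative assumptions rather than anything specified in this posting:

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Minimal PySpark aggregation pipeline; paths and column names are placeholders.
spark = SparkSession.builder.appName("orders_etl").getOrCreate()

orders = spark.read.parquet("s3://example-bucket/raw/orders/")
daily = (
    orders
    .filter(F.col("status") == "COMPLETE")
    .groupBy("order_date")
    .agg(F.count("*").alias("order_count"), F.sum("amount").alias("revenue"))
)
daily.write.mode("overwrite").parquet("s3://example-bucket/curated/daily_orders/")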
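For the stored-procedure item, one common pattern is defining a Snowflake Scripting procedure and calling it from Python. The retention logic and names below are illustrative placeholders, again reusing the cursor from the first snippet:

# Define and invoke a simple Snowflake Scripting procedure from Python.
# The table, cutoff logic, and names are illustrative placeholders.
cur.execute("""
CREATE OR REPLACE PROCEDURE purge_old_rows(days INTEGER)
RETURNS STRING
LANGUAGE SQL
AS
$$
BEGIN
    -- Inside SQL statements, Snowflake Scripting binds variables with a colon.
    DELETE FROM raw_orders
    WHERE order_date < DATEADD('day', -1 * :days, CURRENT_DATE());
    RETURN 'purged rows older than ' || days || ' days';
END;
$$
""")
cur.execute("CALL purge_old_rows(90)")
print(cur.fetchone()[0])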
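For the optional AWS item, a minimal boto3 sketch of staging an extract to S3 ahead of a Snowflake COPY INTO; the bucket, key, and file names are placeholders:

import boto3

# Upload a local extract to S3 so Snowflake can ingest it from an external stage.
# Bucket, key, and filename are placeholders; credentials come from the
# standard AWS credential chain (environment, profile, or instance role).
s3 = boto3.client("s3")
s3.upload_file(
    Filename="daily_orders.csv",
    Bucket="example-etl-bucket",
    Key="raw/orders/daily_orders.csv",
)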
Required Skills :
- Snowflake : Minimum of 5 years of hands-on experience with Snowflake, including its architecture, performance tuning, query optimization, and capabilities in cloud environments (see the query-history sketch after this list).
- Data Warehousing (DWH) and ETL : Over 7 years of experience in Data Warehousing and ETL processes, developing robust solutions to manage large volumes of data.
- Python / PySpark : At least 3 years of experience with Python and PySpark, developing data transformation scripts, automating tasks, and handling large-scale data processing.
- Stored Procedures : Strong experience in writing, optimizing, and debugging complex stored procedures in Snowflake.
- PL/SQL and Oracle : Proficiency in Oracle databases and complex PL/SQL queries, working with large data sets, and ensuring the optimization of data processing systems.
- Unix Shell Scripting : Experience in writing Unix shell scripts for automating administrative tasks, managing data files, and system processes.
- AWS Services : (Good to have) Familiarity with AWS services (S3, Lambda, Glue, etc.), deploying data solutions, and creating DevOps templates for these services.
- DevOps Tools : Proficiency with GitHub and Jenkins for version control and building automated deployment pipelines for seamless data operations.
- Communication & Analytical Skills : Strong written and verbal communication skills for interacting with business stakeholders and technical teams. Ability to analyze complex data issues and provide efficient solutions.
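For the performance-tuning expectations above, a common first step is pulling the most expensive recent queries from the ACCOUNT_USAGE share. A minimal sketch reusing the earlier cursor; the 7-day window and limit are arbitrary placeholders, and these views require elevated privileges and lag real time:

# Surface the ten slowest queries from the past week.
# Window size and LIMIT are placeholders; TOTAL_ELAPSED_TIME is in milliseconds.
cur.execute("""
    SELECT query_id, total_elapsed_time / 1000 AS seconds, query_text
    FROM snowflake.account_usage.query_history
    WHERE start_time > DATEADD('day', -7, CURRENT_TIMESTAMP())
    ORDER BY total_elapsed_time DESC
    LIMIT 10
""")
for query_id, seconds, text in cur.fetchall():
    print(f"{query_id}: {seconds:.1f}s  {text[:80]}")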
Desirable Skills :
- Snowflake Certification : A Snowflake certification (such as SnowPro Core or SnowPro Advanced: Data Engineer) is highly desirable and will be considered a strong asset.
- Experience with Data Integration Tools : Familiarity with other cloud-based data integration tools like AWS Glue, Talend, or Apache Airflow is a plus.
Qualifications :
- Bachelor's or Master's degree in Computer Science, Engineering, Information Technology, or a related field.
- A minimum of 7 years of experience in the IT industry, with a strong focus on data warehousing, ETL, and BI projects.
- At least 5 years of hands-on experience with Snowflake and in-depth knowledge of its architecture, development practices, and advanced features.
Functional Areas: Other