Data Engineer - Snowflake DB (3-7 yrs)
Smartedge IT Services
Posted 2 months ago
Fixed timing
Job Summary:
We are seeking a highly skilled and motivated Data Engineer to join our dynamic team. The ideal candidate will have hands-on experience in designing, building, and maintaining scalable data pipelines and architectures using Snowflake, Python, and Databricks. This role requires a strong foundation in data engineering practices, cloud technologies, and advanced data processing workflows.
Key Responsibilities:
1. Data Pipeline Development:
- Design, develop, and optimize ETL/ELT processes to ingest, transform, and load large datasets into Snowflake.
- Automate data workflows using Databricks and Python.
2. Data Architecture & Modeling:
- Build and maintain scalable, reliable, and efficient data architectures for real-time and batch data processing.
- Develop data models and schemas in Snowflake to support business intelligence and analytics requirements.
3. Collaboration:
- Work closely with data analysts, data scientists, and business stakeholders to understand data needs and implement effective solutions.
- Collaborate with cross-functional teams to design data integration strategies and ensure data consistency across platforms.
4. Performance Optimization:
- Monitor and improve the performance of data pipelines and Snowflake environments to ensure seamless data accessibility.
- Implement best practices for data governance, security, and compliance.
5. Tool Integration:
- Leverage Databricks for advanced data transformations and machine learning pipelines.
- Integrate third-party APIs and tools for enhanced data processing capabilities.
Required Skills & Qualifications:
Education: Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
Technical Expertise:
- Proficient in Snowflake architecture, data modeling, and performance optimization.
- Strong programming skills in Python, including libraries like Pandas, NumPy, and PySpark.
- Experience with Databricks for data engineering and analytics workflows.
- Solid understanding of SQL and database systems.
Cloud Experience: Familiarity with cloud platforms such as AWS, Azure, or GCP.
Data Workflow Tools: Experience with tools like Apache Airflow or other orchestration platforms is a plus.
Problem-Solving: Strong analytical and problem-solving skills with the ability to troubleshoot data issues effectively.
Preferred Qualifications:
- Knowledge of big data technologies (e.g., Hadoop, Spark).
- Experience with CI/CD pipelines for data deployment.
- Certification in Snowflake or Databricks is a plus.
Why Join Us?
- Opportunity to work on cutting-edge data technologies.
- Collaborative and innovative work environment.
- Competitive salary, benefits, and professional growth opportunities.
Functional Areas: Software/Testing/Networking