Synapse XTL - Data Engineer - Python Programming (4-6 yrs)
Company : Synapse - Digital Transformation & Cloud Solutions
Experience : 4-6 years
About the Job :
Synapse is a digital transformation company specializing in managed cloud operations, application development, and data-driven solutions.
We are a highly passionate and innovative technology team, constantly pushing the boundaries to deliver enterprise-level solutions.
Our automated multi-cloud deployment and monitoring systems enable real-time threat monitoring and data visualization at scale.
We are looking for a Data Engineer (Python) to design, develop, and maintain scalable data pipelines and infrastructure while ensuring seamless data flow into our Snowflake data warehouse.
If you have expertise in ETL processes, big data technologies, and data visualization, this role is perfect for you!
Roles and Responsibilities :
Develop and maintain scalable data pipelines using Python and related technologies.
Extract, transform, and load (ETL) data from various sources such as SQL/NoSQL databases, APIs, and cloud storage (AWS S3, Azure Blob Storage, Google Cloud Storage).
Implement data quality checks and monitoring to ensure accuracy and integrity.
Design and maintain data warehouses and data lakes, optimizing for performance and scalability.
Collaborate with data scientists, analysts, and other stakeholders to define data requirements and implement efficient technical solutions.
Document data pipelines and engineering processes for transparency and knowledge sharing.
Participate in code reviews and engineering best practices to enhance system reliability and performance.
Troubleshoot and resolve data-related issues, and stay current with emerging trends in data engineering.
Contribute to the design and implementation of data governance policies and best practices.
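To give a concrete picture of the day-to-day work described above, here is a minimal, illustrative ETL sketch in Python: extract records, apply a data-quality check during transformation, and load the clean rows into SQLite standing in for the warehouse. All function and table names here are hypothetical examples, not part of this posting or any Synapse system.

```python
# Minimal ETL sketch (illustrative only). SQLite stands in for a
# warehouse such as Snowflake; extract() stands in for an API or
# cloud-storage source. Names are hypothetical.
import sqlite3

def extract():
    # Stand-in for pulling raw rows from an API, database, or object store.
    return [
        {"id": 1, "amount": "19.99"},
        {"id": 2, "amount": "5.00"},
        {"id": 3, "amount": None},  # bad record: should fail the quality check
    ]

def transform(rows):
    # Cast types and enforce a simple data-quality rule: drop null amounts.
    clean = []
    for row in rows:
        if row["amount"] is None:
            continue  # quality check: reject rows with missing amounts
        clean.append((row["id"], float(row["amount"])))
    return clean

def load(rows, conn):
    # Load the validated rows into the target table.
    conn.execute("CREATE TABLE IF NOT EXISTS orders (id INTEGER, amount REAL)")
    conn.executemany("INSERT INTO orders VALUES (?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract()), conn)
count = conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0]
print(count)  # only the rows that passed the quality check
```

In practice the same extract/transform/load structure would be orchestrated by a tool such as Airflow and pointed at real sources (S3, Blob Storage, GCS) and a real warehouse.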
Skills and Qualifications :
Required :
- 4-6 years of experience in Data Engineering with strong Python programming skills.
- Hands-on experience with ETL development, data integration, and data transformation.
- Proficiency in SQL and NoSQL databases for data extraction and optimization.
- Experience with cloud storage and computing platforms (AWS, Azure, or Google Cloud).
- Strong understanding of data pipeline orchestration and automation tools.
- Experience in building and optimizing data warehouses or data lakes.
- Knowledge of data governance, data security, and compliance best practices.
- Excellent problem-solving skills, communication, and teamwork abilities.
Preferred :
- Experience with Snowflake, Apache Spark, Airflow, or dbt.
- Familiarity with CI/CD pipelines for data engineering.
- Exposure to big data frameworks and data streaming technologies.
Functional Areas: Software/Testing/Networking