Data Engineer (4-6 yrs)
Zifcare
Flexible timing
Designation : Data Engineer
Experience : 4+ years
Location : Udaipur, Rajasthan
Job Description :
Key Responsibilities :
- Must have strong technical skills in data engineering, including development using tools and technologies such as cloud-based ETL/ELT services, distributed computing platforms, cloud storage, orchestration tools, and data warehouses.
- Extensive hands-on experience in the implementation of Data Lakes and Data Warehouses using industry standard tools and technologies.
- Design, develop, and optimize scalable and reliable data pipelines using programming languages and cloud services.
- Implement ETL/ELT processes to extract, transform, and load data from various structured and unstructured data sources into centralized repositories (e.g., data lakes or data warehouses).
- Collaborate with cross-functional teams to gather and analyze business requirements, and translate them into scalable data solutions.
- Monitor, troubleshoot, and enhance data workflows for improved performance, cost-efficiency, and scalability.
- Ensure data quality, consistency, and accuracy by implementing robust validation and governance practices.
- Follow data security best practices to comply with organizational and regulatory policies.
- Automate repetitive data engineering tasks using programming frameworks, scripting, or workflow automation tools.
- Leverage CI/CD pipelines for deploying and maintaining data workflows in a cloud environment.
Required Skills and Qualifications :
- Professional Experience : 5+ years of experience in data engineering or a related field.
- Strong proficiency in programming languages such as Python, with experience in libraries like pandas, Spark (PySpark), or equivalent.
- Hands-on experience with cloud platforms and their data engineering tools, such as :
1. ETL/ELT Services : AWS Glue, Azure Data Factory, GCP Dataflow, or Apache NiFi.
2. Storage Solutions : Amazon S3, Azure Blob Storage, GCP Cloud Storage, or HDFS.
3. Data Warehousing/Querying : Redshift, Azure Synapse, BigQuery, or Snowflake.
4. Serverless Compute : Lambda, Azure Functions, GCP Cloud Functions.
5. Data Streaming Services : Kafka, AWS Kinesis, Azure Event Hubs, or GCP Pub/Sub.
6. Data Processing : Experience with big data frameworks like Hadoop, Apache Spark, or equivalent.
7. Workflow Orchestration : Hands-on experience with tools such as Apache Airflow, Prefect, Azure Data Factory Pipelines, or GCP Workflows.
- Soft Skills : Strong communication and collaboration skills to work effectively with cross-functional teams.
Functional Areas: Software/Testing/Networking