Data Engineer (6-10 yrs)
Response Informatics
Experience : 6-10 Years
Location : Hyderabad
Notice Period : Immediate to Serving
Key Skills :
- Strong experience in Python and PySpark coding, with solid experience in AWS Glue and streaming technologies (Spark/Kafka).
- Hands-on programming experience.
- Implementation experience with Kafka, Kinesis, Spark, AWS Glue, and AWS Lake Formation.
- Experience with performance optimization in batch and real-time processing applications.
- Expertise in data governance and data security implementation.
- Experience with scheduling tools such as Airflow.
Key Responsibilities :
- Design, develop, and implement data pipelines and ETL processes using Python, PySpark, and AWS services (Glue, S3, EMR, etc.).
- Develop and maintain real-time data streaming applications using technologies like Kafka, Kinesis, and Spark Streaming.
- Optimize data processing performance for both batch and real-time applications.
- Implement and maintain data governance and security best practices across all data pipelines.
- Work with scheduling tools like Airflow to orchestrate and monitor data pipelines.
- Collaborate with data scientists and analysts to understand business requirements and translate them into technical solutions.
- Troubleshoot and resolve data quality issues and performance bottlenecks.
- Stay updated on the latest advancements in data engineering technologies and best practices.
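The pipelines described above follow the classic extract-transform-load shape. As a toy illustration only (plain Python standing in for PySpark/Glue; the `id`/`amount` fields and function names are invented for this sketch, not part of the role's actual codebase):

```python
import csv
import io
import json

def extract(raw_csv: str):
    # Extract: parse raw CSV records (in Glue this would typically be a
    # DynamicFrame read from S3 rather than an in-memory string).
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(rows):
    # Transform: drop rows with a missing amount and derive a new field,
    # much as a PySpark job would via filter()/withColumn().
    out = []
    for row in rows:
        if row["amount"]:
            out.append({"id": row["id"],
                        "amount_cents": int(float(row["amount"]) * 100)})
    return out

def load(rows):
    # Load: serialize to JSON lines, a stand-in for writing to S3/Redshift.
    return "\n".join(json.dumps(r) for r in rows)

raw = "id,amount\n1,9.99\n2,\n3,4.50\n"
print(load(transform(extract(raw))))
```

In a real Glue or EMR job each stage would operate on distributed DataFrames rather than Python lists, but the three-stage structure is the same.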
Required Skills & Experience :
- Proven experience in developing and optimizing data pipelines using Python and PySpark.
- Hands-on experience with AWS services such as Glue, S3, EMR, Kinesis, and Lake Formation.
- Strong understanding of stream processing concepts and experience with technologies like Kafka, Kinesis, and Spark Streaming.
- Solid understanding of data warehousing, data modeling, and ETL/ELT concepts.
- Experience in optimizing data processing performance for both batch and real-time applications.
- Expertise in implementing data governance and security best practices, including data access control, data masking, and encryption.
- Experience with scheduling tools like Airflow for orchestrating and monitoring data pipelines.
- Excellent communication and collaboration skills to effectively work with cross-functional teams.
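The stream-processing experience asked for above usually centers on windowed aggregation. A minimal pure-Python sketch of the tumbling-window counting pattern (a simplified stand-in for what Spark Structured Streaming does with `groupBy(window(...))`; the event shape here is illustrative):

```python
from collections import defaultdict

def tumbling_window_counts(events, window_secs=60):
    # Assign each (timestamp, key) event to a fixed-size window and count
    # occurrences per (window, key) pair, mirroring a tumbling-window
    # aggregation in Spark Streaming or a Kafka Streams topology.
    counts = defaultdict(int)
    for ts, key in events:
        window_start = ts - (ts % window_secs)
        counts[(window_start, key)] += 1
    return dict(counts)

events = [(0, "click"), (10, "view"), (59, "click"), (61, "click")]
print(tumbling_window_counts(events))
```

A production system adds what this sketch omits: out-of-order events, watermarks for late data, and fault-tolerant state, which is where the real engineering effort in these roles lies.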
Preferred Skills :
- Experience with containerization technologies like Docker and Kubernetes.
- Experience with cloud-native data warehousing solutions like Amazon Redshift or Snowflake.
- Experience with NoSQL databases such as Cassandra or MongoDB.
- Experience with data visualization tools like Tableau or Power BI.
Functional Areas: Software/Testing/Networking