Data Engineer - AWS Platform (6-20 yrs)
Devlats
Posted 12 hours ago
Job Description:
We are seeking a talented and motivated AWS Data Engineer to join our dynamic team. The ideal candidate will have a strong background in data engineering and cloud computing, and a passion for building scalable, robust data solutions on AWS. You will be responsible for designing, developing, and maintaining data pipelines, ensuring data accuracy, and optimizing cloud infrastructure for data processing.
Key Responsibilities:
- Data Pipeline Development: Design, build, and maintain scalable data pipelines using AWS services such as AWS Glue, AWS Lambda, and Amazon EMR.
- Data Storage & Processing: Implement and optimize data storage solutions using Amazon S3, Redshift, and DynamoDB.
- ETL Development: Develop efficient ETL processes for extracting, transforming, and loading data from various sources.
- Data Analytics Support: Collaborate with data analysts and scientists to provide clean, reliable, and accessible data.
- Automation & Optimization: Automate repetitive tasks and optimize data workflows to improve performance and cost efficiency.
- Monitoring & Debugging: Implement monitoring solutions to ensure the reliability and accuracy of data pipelines.
- Security & Compliance: Ensure that all data solutions comply with security and data governance policies.
Required Skills:
- Hands-on experience with AWS services such as Glue, Lambda, S3, Redshift, DynamoDB, and EMR.
- Proficiency in programming languages like Python, Scala, or Java.
- Strong SQL skills and experience with database systems.
- Experience with data pipeline tools and frameworks (e.g., Apache Spark, Apache Airflow).
- Familiarity with CI/CD practices and tools for data workflows.
- Understanding of data security and compliance standards.
- Excellent problem-solving and communication skills.
Preferred Qualifications:
- AWS certifications (e.g., AWS Certified Data Analytics, AWS Certified Solutions Architect).
- Experience with real-time data processing frameworks like Apache Kafka.
- Familiarity with machine learning workflows and tools.
- Knowledge of infrastructure-as-code tools like Terraform or CloudFormation.
Functional Areas: Software/Testing/Networking