DataOps & DevOps Developer (5-15 yrs)
Staffzo Consulting
posted 5d ago
Job Description:
- Design and implement data transformation pipelines using Python, Spark, AWS Glue, Lambda, and other relevant tools.
- Ensure data quality and accuracy throughout the transformation process.
- Develop and maintain Terraform modules to provision and manage AWS data services, including Glue, Lambda, Lake Formation, and others.
- Implement robust and scalable infrastructure solutions using Terraform.
- Create and maintain CI/CD pipelines to automate the deployment of Terraform code and infrastructure changes.
- Integrate with version control systems (e.g., Git) and orchestration tools (e.g., Jenkins, GitLab CI) for seamless deployments.
- Collaborate with AWS architects, team leads, data engineers, and other stakeholders to understand and define infrastructure and functional requirements for the data workstream.
- Identify and recommend suitable AWS resources to meet the project's data processing and storage needs.
- Collaborate with DevOps engineers to design, build, and maintain the data workstream environment, ensuring security, scalability, and performance.
- Troubleshoot and resolve any infrastructure-related issues within the data environment.
- Provide ongoing support and maintenance for the data infrastructure and pipelines.
- Monitor system performance, identify and resolve bottlenecks, and implement optimizations.
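To illustrate the transform-and-validate responsibilities above, here is a minimal, library-free Python sketch of a pipeline step with a data-quality gate. In practice this logic would run inside a Glue or Spark job; all field names and validation rules here are hypothetical examples, not part of the role's actual stack.

```python
# Minimal sketch of a transform step with a data-quality check,
# standing in for the kind of logic a Glue/Spark job would run.
# All record fields and rules are hypothetical.

def transform(record: dict) -> dict:
    """Normalize one raw event record into the target schema."""
    return {
        "user_id": int(record["user_id"]),
        "amount_usd": round(float(record["amount"]), 2),
        "event_date": record["ts"][:10],  # keep the date part of an ISO timestamp
    }

def validate(record: dict) -> bool:
    """Data-quality gate: reject rows that would corrupt downstream tables."""
    return record["user_id"] > 0 and record["amount_usd"] >= 0

def run_pipeline(raw_records):
    """Transform all records, splitting clean rows from quarantined ones."""
    good, quarantined = [], []
    for raw in raw_records:
        try:
            row = transform(raw)
        except (KeyError, ValueError):
            quarantined.append(raw)  # malformed input: keep raw for inspection
            continue
        (good if validate(row) else quarantined).append(row)
    return good, quarantined
```

Quarantining failed rows rather than dropping them mirrors the "ensure data quality and accuracy" responsibility: bad records stay auditable instead of silently disappearing.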
Required Skills & Experience:
- Deep understanding and hands-on experience with core AWS data services and related technologies, including EMR, Glue, Spark, Data Lake, Lake Formation, IAM, KMS, Lambda, SQS, Athena, Kinesis Firehose/Kafka, S3, EventBridge, Aurora PostgreSQL, DynamoDB, Redis, Snowflake, and other relevant services.
- Extensive experience in deploying and managing infrastructure using Terraform.
- Proficiency in writing, testing, and maintaining Terraform modules.
- Strong experience in designing and implementing complex data transformations using ETL services like Glue and Spark.
- Proficiency in Python for data manipulation, ETL processes, and data analysis.
- Understanding of data modeling principles and experience with Data Definition Language (DDL).
- Knowledge of AWS security best practices and experience with implementing security controls within the AWS environment.
- Excellent communication and collaboration skills to effectively work with cross-functional teams.
- Ability to clearly articulate technical concepts to both technical and non-technical audiences.
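As a small illustration of the data-modeling and DDL skill listed above, a Python sketch that renders a CREATE TABLE statement from a dict-based column model. The table, column names, and SQL types are made up for the example; a real pipeline would target a specific dialect such as Athena or PostgreSQL.

```python
# Hypothetical mini-model: column name -> SQL type, plus a primary key.
# Illustrates turning a simple data model into DDL; names/types are examples.

def render_create_table(table: str, columns: dict, primary_key: str) -> str:
    """Render a CREATE TABLE statement from a simple column model."""
    col_lines = [f"    {name} {sql_type}" for name, sql_type in columns.items()]
    col_lines.append(f"    PRIMARY KEY ({primary_key})")
    body = ",\n".join(col_lines)
    return f"CREATE TABLE {table} (\n{body}\n);"

# Example usage with invented columns:
ddl = render_create_table(
    "orders",
    {"order_id": "BIGINT", "amount_usd": "NUMERIC(12,2)"},
    primary_key="order_id",
)
```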
Functional Areas: Other