Should have a minimum of 5+ years of experience in Data Engineering and Data Analytics platforms.
Should have a strong hands-on design and engineering background in AWS across a wide range of AWS services, with demonstrable experience on large engagements.
Should be involved in requirements gathering and in transforming requirements into functional and technical designs.
Maintain and optimize the data infrastructure required for accurate extraction, transformation, and loading of data from a wide variety of data sources.
Design, build, and maintain batch and real-time data pipelines in production.
Develop ETL/ELT (extract, transform, load) data pipelines to extract and manipulate data from multiple sources.
Automate data workflows such as data ingestion, aggregation, and ETL processing. Should have good experience with different types of data ingestion: file-based, API-based, and streaming sources, as well as heterogeneous databases (OLTP, OLAP, ODS, etc.).
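For illustration, a minimal file-based ETL workflow of the kind described above might look like the following sketch. All specifics (the CSV schema, the `customer_totals` table, and the use of SQLite as a stand-in warehouse) are assumptions for the example, not requirements of the role.

```python
# Minimal file-based ETL sketch: extract from CSV text, transform
# (aggregate per customer), and load into an in-memory SQLite table.
# Schema and table names are illustrative assumptions.
import csv
import io
import sqlite3


def extract(csv_text: str) -> list[dict]:
    """Extract: parse raw CSV rows into dictionaries."""
    return list(csv.DictReader(io.StringIO(csv_text)))


def transform(rows: list[dict]) -> dict[str, float]:
    """Transform: aggregate order amounts per customer."""
    totals: dict[str, float] = {}
    for row in rows:
        totals[row["customer"]] = totals.get(row["customer"], 0.0) + float(row["amount"])
    return totals


def load(totals: dict[str, float], conn: sqlite3.Connection) -> None:
    """Load: write the aggregated results into a warehouse-style table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS customer_totals (customer TEXT PRIMARY KEY, total REAL)"
    )
    conn.executemany("INSERT INTO customer_totals VALUES (?, ?)", totals.items())
    conn.commit()


raw = "customer,amount\nacme,10.5\nacme,4.5\nglobex,7.0\n"
conn = sqlite3.connect(":memory:")
load(transform(extract(raw)), conn)
result = dict(conn.execute("SELECT customer, total FROM customer_totals ORDER BY customer"))
```

In production, each stage would typically be orchestrated (e.g. by a scheduler) and pointed at real sources and a real warehouse rather than in-memory stand-ins.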
Transform raw data in data warehouses into consumable datasets for both technical and non-technical stakeholders.
Strong experience designing and implementing data lake, data warehouse, and data lakehouse architectures.
Ensure data accuracy, integrity, privacy, security, and compliance through quality control procedures.
Monitor data system performance and implement optimization strategies.
Leverage data controls to maintain data privacy, security, compliance, and quality for allocated areas of ownership.