Design, maintain, and optimize data infrastructure for data collection, management, transformation, and access, focusing on scalability, reliability, and cost-effectiveness.
Stay hands-on with data engineering tasks, including data pipeline development, ELT processes, and data integration, and serve as the go-to expert for complex technical challenges.
Implement and manage cloud infrastructure and automated workflows using AWS services (e.g., Step Functions, Batch, Glue, Athena, Lambda, EC2, EventBridge, ECS, Redshift), while optimizing existing orchestration solutions (an illustrative orchestration sketch follows this list).
Monitor PostgreSQL performance and troubleshoot to identify and resolve issues with query performance, bottlenecks, and availability (see the monitoring sketch after this list).
Use Python and AWS cloud services to automate data retrieval and processing tasks.
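To give a flavor of the orchestration and automation work described above, here is a minimal sketch of triggering an AWS Step Functions workflow from Python with boto3. The state machine ARN, dataset name, and input payload are hypothetical placeholders, not part of any existing system.

```python
# Minimal sketch: kick off a Step Functions workflow that runs an ELT job.
# The ARN and payload below are illustrative assumptions only.
import json
import boto3

sfn = boto3.client("stepfunctions")

def trigger_nightly_load(dataset: str) -> str:
    """Start a Step Functions execution for the given dataset and return its ARN."""
    response = sfn.start_execution(
        stateMachineArn="arn:aws:states:us-east-1:123456789012:stateMachine:nightly-elt",
        input=json.dumps({"dataset": dataset, "reload": False}),
    )
    return response["executionArn"]

if __name__ == "__main__":
    print(trigger_nightly_load("orders"))
```

A scheduled EventBridge rule or a small Lambda function could call a trigger like this to automate routine data retrieval and processing.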
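For the PostgreSQL monitoring duty, a minimal sketch of surfacing slow statements is shown below. It assumes the pg_stat_statements extension is enabled and that a connection string is supplied via a DATABASE_URL environment variable; both are assumptions for illustration.

```python
# Minimal sketch: list the statements with the highest mean execution time.
# Assumes pg_stat_statements is installed and DATABASE_URL is set.
import os
import psycopg2

def slowest_queries(limit: int = 10):
    """Return (query, calls, mean_exec_time, total_exec_time) rows, slowest first."""
    conn = psycopg2.connect(os.environ["DATABASE_URL"])
    try:
        with conn.cursor() as cur:
            cur.execute(
                """
                SELECT query, calls, mean_exec_time, total_exec_time
                FROM pg_stat_statements
                ORDER BY mean_exec_time DESC
                LIMIT %s
                """,
                (limit,),
            )
            return cur.fetchall()
    finally:
        conn.close()
```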
Process Improvement and Efficiency
Identify opportunities for process improvement in data workflows, with a focus on automation and scalability.
Build and manage data warehouses, data lakes, and other data storage solutions to support large-scale data operations and analytics (see the data lake query sketch after this section).
Document technical architectures, best practices, and operational procedures for orchestration workflows and automated infrastructure.
Strengthen problem-solving skills by participating in root cause analysis, gap analysis, and performance evaluations.
Exhibit strong time management skills and attention to detail, with the ability to manage multiple tasks and priorities in a dynamic environment.
Show eagerness to learn and apply new data analysis techniques, tools, and methodologies.
Thrive in a fast-paced, evolving work environment while taking on new challenges.
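As an illustration of working with an S3-backed data lake, here is a minimal sketch of submitting an Athena query from Python. The database, table, and results-bucket names are hypothetical assumptions.

```python
# Minimal sketch: run a query against an S3-backed data lake via Athena.
# Database, table, and output bucket names are illustrative placeholders.
import boto3

athena = boto3.client("athena")

def run_lake_query(sql: str) -> str:
    """Submit a query to Athena and return its execution id for later polling."""
    response = athena.start_query_execution(
        QueryString=sql,
        QueryExecutionContext={"Database": "analytics_lake"},
        ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
    )
    return response["QueryExecutionId"]

if __name__ == "__main__":
    print(run_lake_query("SELECT event_date, count(*) FROM events GROUP BY 1"))
```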
Collaboration Support
Work closely with other team members to support ongoing data extraction and data pipeline needs.
Contribute to internal projects by documenting data workflows and helping with ad-hoc data pull requests.
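For the ad-hoc data pull requests mentioned above, a minimal sketch of a parameterized extract to CSV is shown below. The orders table, date filter, and DATABASE_URL environment variable are assumptions for illustration.

```python
# Minimal sketch: one-off extract of a date-filtered table to CSV.
# Table name and connection string are illustrative assumptions.
import os
import pandas as pd
from sqlalchemy import create_engine

def pull_orders(start_date: str, path: str = "orders_extract.csv") -> str:
    """Export orders created on or after start_date to a CSV file and return its path."""
    engine = create_engine(os.environ["DATABASE_URL"])
    df = pd.read_sql(
        "SELECT * FROM orders WHERE created_at >= %(start)s",
        engine,
        params={"start": start_date},
    )
    df.to_csv(path, index=False)
    return path
```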