Turn the company's product and technical vision into a tangible roadmap
Create and maintain exceptional system documentation
Own the data pipelines: architect and deploy new solutions, ensure high uptime, and scale them to handle increased traffic
Continuously improve the software engineering practices and mentor junior engineers
Write tests and practice peer code review, pair programming, and continuous deployment across all code and systems
Job Requirements:
Bachelor's/Master's degree in Engineering or Computer Science (or equivalent experience)
3+ years of relevant experience as a data engineer
Extensive experience working with Airflow, data pipelines, big data, and distributed systems
Experience with Ruby, Python, JavaScript, SQL, and R
Familiarity with GCP and Clean Architecture is desirable
Experience working with large distributed infrastructure consisting of hundreds of systems, including scrapers, data-processing clusters, and database and API servers