- Bachelor's degree or higher in Computer Science, Engineering, or a related field.
- 10+ years of experience in data engineering, with a strong focus on designing and building data pipelines and infrastructure.
- Proficiency in SQL and Python, with the ability to translate complex requirements into efficient code.
- Experience with data workflow development and orchestration tools such as dbt and Airflow.
- Solid understanding of distributed computing principles and experience with cloud-based data platforms such as AWS, GCP, or Azure.
- Strong analytical and problem-solving skills, with the ability to troubleshoot complex data issues effectively.
- Excellent communication and collaboration skills, with the ability to work effectively in a cross-functional team environment.
- Experience with data tooling, data governance, business intelligence, and data privacy is a plus.