Job Responsibilities:
Create and deploy an ETL ingestion pipeline to import hundreds of millions of healthcare claims into a Snowflake-based data warehouse
Think through features from the customer's perspective and deliver with end users in mind
Work closely with Product, Engineering, Finance, and Operations teams to successfully deliver data and analytics projects
Collaborate with stakeholders to develop an understanding of the data sources that power various parts of the business
Job Requirements:
Bachelor's/Master's degree in Engineering or Computer Science (or equivalent experience)
3+ years of relevant experience as a data engineer
Proficiency with Snowflake, dbt, Fivetran, relational databases, and Python
Core competency in applying software development lifecycle practices to database work, including testing, CI/CD, and other automation
Demonstrated experience building and maintaining high-throughput ETL pipelines, ideally for a fragmented, non-digital-native industry like healthcare
Nice to have: prior experience building big data pipelines and warehouses that enable machine learning workflows