As a Senior Data Engineer, you will be responsible for building and maintaining the entire data infrastructure at Spendflo. This includes designing the data models, creating and optimizing ETL pipelines, and ensuring the scalability, reliability, and performance of our data systems. You will work closely with cross-functional teams to enable data-driven decision-making and provide actionable insights through the reporting layer.
Key Responsibilities:
Own the design, implementation, and management of the data architecture, ensuring scalability, reliability, and performance
Build and optimize end-to-end ETL pipelines to extract, transform, and load data from multiple sources
Develop and maintain data models that support business needs and ensure data integrity
Manage the reporting layer, working with product managers, analysts, and business stakeholders to ensure that data is presented in a meaningful and actionable way
Design and implement data quality processes to ensure the accuracy and consistency of data across the organization
Collaborate with engineering, data science, and product teams to ensure smooth data integration and alignment with business objectives
Optimize data storage and retrieval, ensuring performance at scale and minimal latency
Ensure proper documentation of data models, ETL processes, and reporting systems
Stay up to date with the latest data engineering tools, technologies, and best practices
Mentor and guide junior data engineers on the team, helping them grow their skills and deepen their understanding of best practices
Required Qualifications:
5+ years of experience as a Data Engineer, with a strong background in building data pipelines and managing large-scale data systems
Extensive experience with ETL frameworks and data processing tools (e.g., Dagster, dbt)
Strong experience in designing and optimizing data models and architectures for performance and scalability
Proficiency with data storage technologies (e.g., Amazon Redshift, Snowflake, BigQuery, PostgreSQL)
Expertise in SQL and other data manipulation languages, with the ability to write complex queries and optimize them for performance
Strong knowledge of cloud platforms (AWS, GCP, or Azure) and related data engineering services
Familiarity with modern reporting and business intelligence tools (e.g., Looker, Tableau, Power BI)
Experience in managing data for analytics, reporting, and machine learning purposes
Ability to work in a fast-paced, dynamic environment and collaborate with multiple stakeholders across teams
Strong communication skills, with the ability to explain technical concepts to non-technical stakeholders