- Strong experience with data pipeline and workflow management tools, ETL processes, and data warehousing solutions.
- Experience with data transformation tools such as dbt.
- Experience with stream-processing systems, particularly Kafka, and familiarity with its ecosystem, including connectors and stream-processing frameworks.
- Knowledge of Snowflake and its integration with data pipelines, as well as experience with snapshotting data for history tracking.
- Familiarity with cloud services and databases, particularly Azure services and PostgreSQL.
- Strong programming skills in languages such as Python and Scala for scripting and automation.
- Excellent problem-solving skills and the ability to work independently or as part of a team.
- Strong communication skills to collaborate effectively with team members and stakeholders.