As a Senior Integration Engineer, you will have the opportunity to build critical integration infrastructure that surfaces data and insights across the company. The Integration team sits at the intersection of business processes and the platform. You will be responsible for designing, building, and maintaining the integration infrastructure and systems that support Talkdesk's SaaS platforms. The role may include designing and implementing integration solutions, developing and deploying data pipelines, and creating and maintaining data visualization tools.
Core Responsibilities
Design and develop robust and scalable integration pipelines to support data integration using Kafka, Fivetran, Snowflake, Airflow, Starburst, and dbt.
Help lead the implementation and maintenance of the data platform solutions, ensuring data integrity, performance, and security.
Collaborate with cross-functional teams including data scientists, analysts, and software engineers to understand data requirements and deliver high-quality solutions.
Evaluate and implement standard practices for data ingestion, ETL processes, and data quality frameworks.
Optimize and tune data processing workflows and SQL queries for improved performance and efficiency.
Contribute to ad hoc projects executed by the business analytics team by structuring integrations and data, conducting analyses, and synthesizing presentations or other report-outs.
Stay up to date with industry trends and advancements in integration, continuously improving the team's technical knowledge and skills.
Collaborate with infrastructure and operations teams to ensure reliable and scalable data storage, processing, and monitoring solutions.
Develop an in-depth understanding of industry integration platforms such as Boomi and Starburst.
What You Bring To The Team
Bachelor's degree in Computer Science, Engineering, or a related field. Advanced degree preferred.
Proven experience (5+ years) in data engineering, designing and implementing data pipelines, and building data infrastructure.
Strong expertise in working with Snowflake, Airflow, Starburst, and dbt, including data modeling, ETL, and data quality assurance.
Proficiency in SQL and experience with optimizing and tuning queries for performance.
Solid understanding of data warehousing concepts, dimensional modeling, and data integration techniques.
Experience with cloud platforms (e.g., AWS, Azure, GCP) and cloud-based data technologies.
Strong programming skills in Python or other scripting languages for data manipulation and automation.
Excellent problem-solving and troubleshooting abilities with a keen attention to detail.
Strong communication skills with the ability to collaborate effectively with cross-functional teams and partners.
Experience in database maintenance, schema design and relational modeling.
Ability to communicate clearly about complex technical topics.
Comfortable working in a fast-paced and highly collaborative environment.
Bonus Points
Experience with streaming data processing frameworks (e.g., Apache Kafka, Apache Flink).
Familiarity with containerization technologies (e.g., Docker, Kubernetes).
Knowledge of distributed computing frameworks (e.g., Spark, Hadoop).
Experience with data governance, data security, and compliance practices.
Understanding of DevOps principles and experience with CI/CD pipelines.
Work Environment and Physical Requirements
Primarily office-environment work with extended periods of sitting or standing and computer-based tasks. Limited lifting; equipment usage is limited to computer-related equipment (keyboard, mouse, etc.).