Principal Data Engineer - Data Pipeline (10-15 yrs)
t3 Strategic Partners
posted 23hr ago
The Principal Data Engineer is a senior technical leader within the data engineering team, responsible for designing and implementing scalable data pipelines, architectures, and frameworks to address complex data challenges.
This role ensures the company's data infrastructure aligns with its business objectives, mentors team members, and collaborates with cross-functional teams to build efficient, secure, and scalable systems.
Key Responsibilities :
Data Architecture and Infrastructure Development :
- Robust Design : Architect high-performance, scalable data systems to facilitate seamless access and analysis.
- Pipeline Development : Build, optimize, and manage ETL workflows for ingesting data from diverse sources.
- Cloud Solutions : Implement data warehousing and big data platforms on AWS, GCP, or Azure.
Optimization of Data Pipelines :
- Efficient Processing : Develop and refine pipelines to handle large-scale structured and unstructured data.
- Automation : Automate data workflows to ensure real-time availability and reduce downtime.
Advanced Data Processing :
- Framework Integration : Use platforms like Spark, Kafka, and Hadoop for real-time and batch processing.
- Model Deployment : Collaborate with data science teams to operationalize machine learning models.
Leadership and Mentorship :
- Team Guidance : Provide mentorship to junior and mid-level engineers, ensuring adherence to best practices.
- Technical Leadership : Set standards for coding, system architecture, and technology selection.
Cross-Functional Collaboration :
- Stakeholder Engagement : Partner with teams such as data science, product, and customer success to transform business requirements into scalable solutions.
- Data Availability : Facilitate smooth data flow to support analytics, reporting, and decision-making.
Performance and Security :
- Data Integrity : Ensure systems comply with data security standards and maintain data accuracy.
- Scalability : Optimize storage and retrieval processes to enhance performance and reduce costs.
Innovation and Continuous Improvement :
- Emerging Technologies : Stay current with trends in data engineering and adopt cutting-edge practices.
- System Enhancements : Drive innovation to improve the efficiency and reliability of the data ecosystem.
Essential Skills and Qualifications :
- Technical Proficiency : Expertise in data engineering tools (e.g., Spark, Kafka), cloud platforms (AWS, GCP, Azure), and ETL processes.
- Problem-Solving : Strong analytical skills to resolve complex data challenges effectively.
- Leadership Abilities : Experience mentoring engineers and guiding teams through technical challenges.
- Collaborative Approach : Proven ability to align technical solutions with business goals.
- Data Security Knowledge : Understanding of compliance standards and best practices for data governance.
- This position is ideal for seasoned professionals eager to combine technical innovation with strategic impact.
- 10+ years of experience in data engineering or related fields.
- Strong experience in designing and building scalable data pipelines and architectures.
- Extensive hands-on experience with cloud platforms such as AWS, GCP, or Azure.
- Proven experience with big data technologies (e.g., Hadoop, Spark, Kafka, Hive).
- Strong experience with data warehousing solutions (e.g., Redshift, Snowflake, BigQuery) and data modelling.
- Must have designed and built at least two scalable systems.
Technical Skills :
- Proficiency in programming languages such as Python (including PySpark) or Scala.
- Strong experience designing robust APIs and microservices.
- Experience with data pipeline orchestration tools (e.g., Apache Airflow, Kubeflow).
- Deep understanding of ETL/ELT processes and data modelling.
Problem Solving & Innovation :
- Strong analytical skills with the ability to solve complex technical challenges.
- Experience in troubleshooting data systems and optimizing their performance.
- Ability to evaluate and recommend new technologies and tools to improve data systems.
Collaboration & Communication :
- Excellent communication skills, with the ability to explain complex technical issues to both technical and non-technical stakeholders.
- Ability to work in a collaborative, cross-functional environment with multiple stakeholders.
- Strong interpersonal skills and experience mentoring other engineers.
Good To Have :
- Hands-on experience with machine learning pipelines and working with data science teams.
- Experience with data governance frameworks and compliance (e.g., GDPR, CCPA).
- Experience with real-time data streaming and processing technologies (e.g., Kafka, Kinesis, Flink).
- Good knowledge of SQL and database technologies.
- Familiarity with containerization (e.g., Docker) and orchestration (e.g., Kubernetes) is a plus.
- Experience building data pipelines for CPG/Retail industries.
Functional Areas: Other