Mentor and provide technical guidance to junior data engineers in the team.
Contribute back to the engineering community by writing and publishing blogs, articles, or papers at respected engineering conferences.
Design and develop the data platform to efficiently and cost-effectively address various data needs across the business.
Build software across our entire cutting-edge data platform, including event-driven and batch data processing, storage, and serving through various data integration techniques such as APIs, Lambda, and file feeds.
Devise new ways to make our data platform more scalable, resilient, and reliable, then work across our team to put your ideas into action.
Help us stay ahead of the curve by working closely with data architects, stream processing specialists, API developers, our DevOps team, and analysts to design systems that can scale elastically.
Work closely with data analysts and business stakeholders to make data easily accessible and understandable to them.
Ensure data quality by implementing reusable data quality frameworks.
Work closely with various other data engineering teams to roll out new capabilities.
Work closely with the AI/ML team to operationalize and utilize models for business needs.
Develop and enforce data engineering, security, and data quality standards through automation.
Participate in supporting the platform 24x7.
Be passionate about growing a team - hire and mentor engineers and analysts.
Be responsible for cloud costs and improving efficiency.
What to Bring:
Bachelor's degree in computer science or a similar discipline.
8+ years of experience in software engineering.
8+ years of experience in data engineering.
Ability to work in a fast-paced, high-pressure, agile environment.
Expertise in building and managing large-volume data processing platforms (both streaming and batch) is a must.
Expertise in OLAP databases such as Snowflake or Redshift.
Expertise in SQL, Spark SQL, Hive, etc.
Expertise in stream processing systems such as Kafka, Kinesis, Pulsar, or similar.
Expertise in building microservices and managing containerized deployments, preferably using Kubernetes.
Expertise in distributed data processing frameworks such as Apache Spark, Flink, or similar.
NoSQL experience (Apache Cassandra, DynamoDB, or similar) is a huge plus.
Experience with analytics tools such as Looker and Tableau is preferred.
Cloud (AWS) experience is preferred.
Ability to learn and teach new languages and frameworks.
Drive to master emerging technologies and share experiences with team members.
Ability to understand goals, strategies, and needs of the business as they relate to infrastructure and application development.
Expert knowledge of security best practices for cloud applications.
Expertise with CI/CD pipelines, preferably GitHub Actions workflows.
Terraform development experience.
Proficiency in C# and Python/Java.
Experience working with containerization platforms such as Docker and EKS.