The Data Platform team is uniquely positioned at the intersection of security, big data, and cloud computing. We are responsible for providing ultra-low-latency access to global security insights and intelligence data for our customers, enabling them to act in near real-time. We're looking for a seasoned engineer to help us build next-generation data export pipelines that provide near real-time streaming of security insights and intelligence data via open-source messaging and RESTful APIs that allow programmatic access to the data.
You will be part of a growing team of renowned industry experts in the exciting space of data and cloud analytics.
Your contributions will have a major impact on our global customer base and across the industry through our market-leading products.
You will solve complex, interesting challenges, and improve the depth and breadth of your technical and business skills.
What you will be doing
Building a next-generation data streaming service that provides near real-time streaming of security insights and intelligence data.
Architecting highly scalable data systems and solving real-time stream challenges at scale.
Partnering with industry experts in security and big data, as well as product and engineering teams, to conceptualize, design, and build innovative solutions to hard problems on behalf of our customers.
Evaluating open-source technologies to find the best fit for our needs, and contributing to some of them!
Participating in the development of related automated test infrastructure and CI/CD integration.
Helping other teams architect their systems on top of the data platform and influencing their system design.
This is a great opportunity to work with smart people in a fun and collaborative environment.
Required skills and experience
8+ years of industry experience building highly scalable distributed data systems.
Programming experience in Python, Java, or Go.
Excellent data structure and algorithm skills.
Proven track record of good development practices such as automated testing and measuring code coverage.
Experience with Kubernetes, Docker, Helm, etc.
Experience with distributed datastores such as Druid, MongoDB, Cassandra, BigQuery, or similar.
Experience building CI/CD pipelines is a plus.
Excellent written and verbal communication skills.
Flexibility to work and collaborate with teams across time zones.
Bonus points for contributions to the open source community.
Education
BSCS or equivalent required; MSCS or equivalent strongly preferred.