Building highly available and secure authentication and API services.
Maintaining and evolving mission-critical internal databases and services.
Optimizing and operating high volume auto-scaling streaming data services.
Instrumenting streaming data services for visibility into utilization per customer.
Writing and maintaining documentation about internal and public services.
Requirements.
Expertise in one or more systems or high-level programming languages (e.g. Python, Go, Java, C++) and the eagerness to learn more.
Experience running scalable (thousands of requests per second) and reliable (three nines, i.e. 99.9% availability) systems.
Experience with developing complex software systems scaling to substantial data volumes or millions of users with production quality deployment, monitoring, and reliability.
Experience with large-scale distributed storage and database systems (SQL or NoSQL, e.g. MySQL, Cassandra).
Ability to decompose complex business problems and lead a team in solving them.
Experience building and maintaining large-scale and/or real-time data processing pipelines using Kafka, Hadoop, Hive, Storm, or ZooKeeper.
8+ years of experience.