Lead the design and engineering of a scalable, secure, and cost-optimized Streaming Data Platform
Lead the evolution from the current on-prem infrastructure to a hybrid cloud infrastructure with Kafka as the core technology
Lead platform upgrades and installations: perform the procedures in Development environments and collaborate with the Operations team in Production
Develop automation of platform components (infrastructure as code)
Integrate platform metrics into the Syniverse monitoring stack
Perform database performance analysis covering queries, DB schema design, and aggregation layers; collaborate with Development teams to deliver results and recommendations
Perform capacity analysis that feeds directly into infrastructure design and purchasing plans
Communicate results and recommendations to the leadership team
Qualifications:
Bachelor's Degree in Computer Science or equivalent
10+ years of relevant experience managing database, analytics, and streaming platforms
7+ years of programming with Java, Python, or Go
5+ years of Linux experience in both scripting and administrative roles
4+ years of Kafka experience, with a preference for the Confluent Platform:
Deploying and managing Brokers, ZooKeeper, Schema Registry, ksqlDB, and Connectors
Setting up mirroring of Kafka topic data across clusters, data centers, and cloud
Cloud skills:
Production experience with public cloud; AWS strongly preferred
Production experience deploying Kafka in cloud and on-prem
Production experience deploying cloud-native database services
Experience performing assessments of cloud and on-prem data systems
DBA experience, with a preference for PostgreSQL or a similar relational database
Production experience tuning queries and optimizing performance on large-scale analytic databases
Experience developing automation scripts for infrastructure and platforms; Ansible strongly preferred
Demonstrated understanding of data model design and its tradeoffs
Demonstrated ability to work across teams, including Operations, Development, DBA, and Infrastructure administrators
Demonstrated clear communication with peers and leadership
Desired:
Production experience with Impala (Cloudera) or Snowflake, including data ingestion, schema design, and tuning
Experience setting up Grafana and Prometheus for metrics capture
Development of stream-processing applications using the Confluent stack (ksqlDB and Kafka Streams)
Experience in a DBA role with TimescaleDB
Data pipeline experience using Python, Spark Streaming, StreamSets, Kafka Connect, and ksqlDB
Experience deploying containerized workloads on Kubernetes