Kafka Administrator (5-7 yrs)
Steer Lean Consulting
Flexible timing
About the job:
Skills Required:
- 5+ years of experience with Confluent Kafka, including 3+ years with Confluent Cloud, Control Center, REST Proxy, HAProxy, Confluent Kafka Connect, and Confluent Kafka security features.
- 2+ years of experience with monitoring tools (Prometheus, Grafana, or Splunk) and engineering cloud migration solutions.
- Recent experience as a Kafka administrator.
- Strong experience building and administering an Apache/Confluent Kafka messaging platform.
- Strong experience designing and building highly available, high-volume messaging infrastructure with Apache Kafka on AWS (e.g., stretch cluster, active/active, or active/passive) using MirrorMaker or other replication tools.
- Must have experience with Apache NiFi and cloud platforms.
- Good experience with Schema Registry, Kafka connectors (source and sink), and KSQL; hands-on setup and administration of Kafka brokers, ZooKeeper, topics, and connectors (a minimal administration sketch follows this list).
- Strong experience setting up monitoring and management tooling.
- Working knowledge of integrating monitoring and management tools and of managing data growth.
- Experience working with modern IDEs (such as Visual Studio Code or IntelliJ).
- Core Kafka administration.
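
As a concrete illustration of the routine topic administration named above, here is a minimal sketch using the confluent-kafka Python client; the broker address, topic name, and settings are illustrative placeholders, not details from this posting:

```python
# Minimal topic-administration sketch with the confluent-kafka client.
# All names and values below are hypothetical examples.
from confluent_kafka.admin import AdminClient, NewTopic

admin = AdminClient({"bootstrap.servers": "broker-1:9092"})  # hypothetical broker

# Create a topic sized for availability (RF=3), with durability enforced
# through min.insync.replicas and a one-week retention window.
topic = NewTopic(
    "orders.events",  # hypothetical topic name
    num_partitions=12,
    replication_factor=3,
    config={"min.insync.replicas": "2", "retention.ms": "604800000"},
)

# create_topics() is asynchronous: it returns a dict of topic -> future.
for name, future in admin.create_topics([topic]).items():
    try:
        future.result()  # block until the broker confirms creation
        print(f"created {name}")
    except Exception as exc:
        print(f"failed to create {name}: {exc}")
```
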
Responsibilities:
- Architect, implement, and manage a scalable Kafka infrastructure on GKE.
- Develop and maintain comprehensive performance metrics dashboards for the Kafka ecosystem.
- Collaborate with cross-functional teams to optimise Kafka API integration.
- Conduct in-depth assessments of existing AWS Kafka deployments for migration planning.
- Develop and execute a robust migration strategy encompassing data migration and consumer repointing.
- Optimise Kafka performance and scalability for high-throughput data processing.
- Implement robust monitoring and alerting for system health and performance.
- Guarantee data integrity and security throughout the migration process.
- Collaborate with engineering and data teams for seamless integration and optimal performance.
- Design and implement a multi-tenant GKE environment to accommodate diverse workloads.
- Deploy and configure Kafka Strimzi on the underlying infrastructure.
- Identify and implement efficient data ingestion pipelines (e.g., Kafka Connect, Dataflow); a connector-registration sketch follows this list.
- Conduct thorough fault simulation testing to mitigate potential risks.
- Stay updated on Kafka developments and services to recommend and implement improvements.
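
To make the Kafka Connect ingestion responsibility concrete, the following is a hedged sketch of registering a source connector through Kafka Connect's REST API (POST /connectors); the endpoint URL, connector name, and JDBC settings are hypothetical, not taken from this role:

```python
# Sketch: register a JDBC source connector via the Kafka Connect REST API.
# The endpoint, connector name, and database details are placeholders.
import json
import requests

CONNECT_URL = "http://connect.example.internal:8083"  # hypothetical endpoint

connector = {
    "name": "jdbc-orders-source",  # hypothetical connector name
    "config": {
        "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
        "connection.url": "jdbc:postgresql://db.example.internal:5432/orders",
        "mode": "incrementing",
        "incrementing.column.name": "id",
        "topic.prefix": "pg.",
        "tasks.max": "2",
    },
}

resp = requests.post(
    f"{CONNECT_URL}/connectors",
    headers={"Content-Type": "application/json"},
    data=json.dumps(connector),
    timeout=30,
)
resp.raise_for_status()
print(resp.json())  # Connect echoes back the registered configuration
```
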
Additional Qualifications:
- Experience with Kafka monitoring and management tools such as Confluent Control Center, Kafka Manager, or similar (a consumer-lag check sketch follows this list).
- Solid understanding of distributed systems, data pipelines, and stream processing.
- Excellent communication and collaboration skills.
- Ability to work independently and manage multiple tasks in a fast-paced environment.
- Knowledge of containerization technologies such as Docker and Kubernetes.
- Familiarity with related technologies such as Apache ZooKeeper, Apache Flink, or Apache Spark.
- Hands-on working experience with Strimzi-managed Kafka.
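
As an example of the kind of monitoring check this role calls for, here is a minimal consumer-lag sketch with the confluent-kafka Python client, comparing each partition's committed offset against the broker's high watermark; the broker, group, and topic names are placeholders:

```python
# Sketch: per-partition consumer lag = high watermark - committed offset.
# All connection details below are hypothetical.
from confluent_kafka import Consumer, TopicPartition

consumer = Consumer({
    "bootstrap.servers": "broker-1:9092",  # hypothetical broker
    "group.id": "orders-consumer",         # group whose lag we inspect
    "enable.auto.commit": False,
})

topic = "orders.events"  # hypothetical topic
metadata = consumer.list_topics(topic, timeout=10)
partitions = [TopicPartition(topic, p) for p in metadata.topics[topic].partitions]

for tp in consumer.committed(partitions, timeout=10):
    low, high = consumer.get_watermark_offsets(tp, timeout=10)
    committed = tp.offset if tp.offset >= 0 else low  # negative offset: no commit yet
    print(f"partition {tp.partition}: lag = {high - committed}")

consumer.close()
```
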
Functional Areas: Other