We are seeking an experienced Python Developer to join our team and help develop and maintain our software products. The ideal candidate has a solid understanding of Python programming and practical experience with stream processing technologies. You will be responsible for developing, optimizing, and maintaining real-time data applications, ensuring high performance and scalability.
Keywords: Python, FastAPI, Kafka
Department: Engineering (IND)
Job Responsibilities:
Develop, test, and maintain scalable Python applications.
Design, implement, and optimize real-time data processing pipelines.
Collaborate with cross-functional teams to define, design, and ship new features.
Build and manage event-driven architectures and streaming pipelines (illustrated in the sketch after this list).
Troubleshoot and debug applications to optimize performance and reliability.
Optimize and maintain existing codebases, ensuring high performance and reliability.
Integrate third-party services and APIs into Python applications.
Conduct code reviews and contribute to improving the quality of code and best practices.
Keep the software stack up to date with current industry standards and technologies.
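For context, here is a minimal, purely illustrative sketch of the kind of event-driven streaming consumer this role works with. It assumes the aiokafka client library; the topic, broker address, and consumer-group name are hypothetical placeholders, not a prescribed design.

```python
# Illustrative sketch only: an asyncio-based Kafka consumer using aiokafka.
# Topic, broker address, and group id are hypothetical.
import asyncio

from aiokafka import AIOKafkaConsumer


async def consume_events() -> None:
    consumer = AIOKafkaConsumer(
        "orders",                           # hypothetical topic
        bootstrap_servers="localhost:9092", # hypothetical broker
        group_id="order-processors",        # hypothetical consumer group
    )
    await consumer.start()
    try:
        async for message in consumer:
            # Replace with real processing logic (validation, enrichment, etc.).
            print(message.topic, message.partition, message.offset, message.value)
    finally:
        await consumer.stop()


if __name__ == "__main__":
    asyncio.run(consume_events())
```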
Primary Skills:
4-6 years of experience in Python development, including third-party frameworks and libraries such as FastAPI.
Hands-on experience with stream processing frameworks (Kafka, Apache Flink, etc.), with a strong understanding of Kafka in particular.
Experience working with message brokers.
Hands-on experience with asynchronous programming and event-driven systems (see the sketch after this list).
Familiarity with microservices architecture.
Solid understanding of data structures and algorithms.
Understanding of containerization tools such as Docker and Kubernetes.
Experience with version control systems like Git.
Experience with relational and NoSQL databases.
BE/B.Tech degree in engineering.
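As a hedged illustration of the FastAPI, Kafka, and asynchronous-programming skills listed above, the sketch below exposes a single async endpoint that publishes incoming JSON to Kafka. The aiokafka producer, the "events" topic, and the broker address are assumptions for the example, not a prescribed implementation.

```python
# Illustrative sketch only: a FastAPI service that publishes incoming JSON
# events to Kafka. Topic and broker address are hypothetical.
import json
from contextlib import asynccontextmanager

from aiokafka import AIOKafkaProducer
from fastapi import FastAPI

producer: AIOKafkaProducer | None = None


@asynccontextmanager
async def lifespan(app: FastAPI):
    # Start the Kafka producer on application startup and stop it on shutdown.
    global producer
    producer = AIOKafkaProducer(bootstrap_servers="localhost:9092")
    await producer.start()
    yield
    await producer.stop()


app = FastAPI(lifespan=lifespan)


@app.post("/events")
async def publish_event(payload: dict) -> dict:
    # Serialize the request body and publish it to the hypothetical "events" topic.
    await producer.send_and_wait("events", json.dumps(payload).encode("utf-8"))
    return {"status": "queued"}
```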
Secondary Skills:
Experience with RESTful APIs development and integration.
Exposure to CI/CD pipelines and automated testing.
Exposure to monitoring and observability tools such as Prometheus, Grafana, etc.
Knowledge of cloud platforms such as AWS, Azure, or GCP.
What We Offer: