Senior Python Software Engineer
HashedIn by Deloitte
Proud winner of ABECA 2024 - AmbitionBox Employee Choice Awards
Posted 16 days ago · Flexible timing
2-7 years · Pune, Gurgaon / Gurugram, Bangalore / Bengaluru · 30 vacancies
POSITION: Software Engineer II
LOCATION: Bangalore, Pune, Hyderabad, Kolkata, Mumbai, Gurugram, Chennai, Delhi NCR
SKILLS & REQUIREMENTS
• Good knowledge of a web framework, preferably Python Flask or similar.
• Experience with cloud-based CI/CD services, preferably AWS CodeBuild and CodePipeline.
• Experience with various other AWS services like EC2, S3, Lambda, Step Functions, Glue, SNS, SQS, Secrets Manager, etc.
• Strong grasp of SQL fundamentals: reading and writing SQL queries, familiarity with database interaction tools (such as pgAdmin), columnar databases, and database optimization techniques including indexing. Knowledge of AWS Aurora is good to have.
• Experience with Big Data frameworks like Hadoop and Spark is a plus.
• Good knowledge of API development and testing including but not limited to HTTP, RESTful
services, Postman, and allied cloud-based services like API Gateway.
• Strong coding, debugging, and problem-solving abilities with good knowledge of Python. Experience with packaging tools such as pip and setuptools is good to have.
• Should have an eye for architecture, understanding the trade-offs between architectural choices at both a theoretical and an applied level.
• A technical background in data, with a deep understanding of issues in areas such as data acquisition, ingestion and processing, data management, distributed processing, and high availability, is required.
• Quality delivery is the highest priority. Should be familiar with industry best practices and standards for building and delivering performant, scalable APIs.
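As a concrete illustration of the indexing point above, the sketch below uses Python's built-in sqlite3 as a lightweight stand-in for PostgreSQL or Aurora (the table and index names are invented for this example) to show how adding an index changes a query plan from a full table scan to an index search:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)")
conn.executemany(
    "INSERT INTO orders (customer_id, total) VALUES (?, ?)",
    [(i % 100, i * 1.5) for i in range(1000)],
)

# Without an index, filtering on customer_id forces a full table scan.
plan_before = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer_id = 42"
).fetchall()

# With an index on the filtered column, the engine seeks directly to matching rows.
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
plan_after = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer_id = 42"
).fetchall()

print(plan_before[0][-1])  # e.g. a SCAN over orders
print(plan_after[0][-1])   # e.g. a SEARCH using idx_orders_customer
```

The same reasoning carries over to pgAdmin's EXPLAIN view on PostgreSQL, where the plan changes from a sequential scan to an index scan.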
RESPONSIBILITIES
• Work at the intersection of infrastructure and software engineering by designing and deploying data and pipeline management frameworks built on top of open-source components, including Hadoop, Hive, Spark, HBase, Kafka streaming, Tableau, and Airflow, as well as cloud-based data engineering services like S3, Redshift, Athena, Kinesis, etc.
• Collaborate with various teams to build and maintain the most innovative, reliable, secure,
and cost-effective distributed solutions
• Design and develop the backend for efficient CRUD operations on datasets that span more than a million records and continue to grow.
• Deliver the most complex and valuable components of an application on time and to the
specifications
• Play the role of a leader or individual contributor who influences a sizable portion of an account, or a small project in its entirety, consistently combining practical experience with theoretical knowledge to make balanced technical decisions
• Recognize inconsistencies in requirements, schedule accurately, and track progress, providing visibility and proactively alerting the team and reporting authority
• Demonstrate excellent work breakdown and estimation: write clear and concise specifications for outsourced work, create a work breakdown structure that uses existing services to deliver a functional implementation, and support the development team with significant product decisions; be seen as a major contributor to the architecture, feature set, etc., of product releases
• Actively participate in customer communication, presentations, and the handling of critical issues
• Lead assigned client and company resources in performing their roles on time and within budget
• Act as an individual contributor who is a role model for the application of the team's software development and deployment processes, and contribute to best practices and methodologies for the greater team
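The CRUD-at-scale responsibility above can be sketched with keyset (seek) pagination, a common technique for efficient reads over tables that span millions of rows, since it avoids the growing cost of large OFFSET scans. This minimal example uses the stdlib sqlite3 module with an invented table and column names as stand-ins:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE records (id INTEGER PRIMARY KEY, payload TEXT)")
conn.executemany(
    "INSERT INTO records (payload) VALUES (?)",
    [(f"row-{i}",) for i in range(10_000)],
)

def fetch_page(conn, after_id=0, page_size=100):
    """Return one page of rows with id greater than `after_id`.

    The primary-key predicate lets the engine seek straight to the start
    of the page instead of skipping `after_id` rows with OFFSET.
    """
    return conn.execute(
        "SELECT id, payload FROM records WHERE id > ? ORDER BY id LIMIT ?",
        (after_id, page_size),
    ).fetchall()

page1 = fetch_page(conn)
# The caller passes the last id seen as the cursor for the next page.
page2 = fetch_page(conn, after_id=page1[-1][0])
```

On PostgreSQL the same query shape works unchanged, backed by the primary-key index.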
Technical Skills (Minimum):
• Proficiency in Python.
• Experience with a Python web framework like Django or Flask.
• Experience with RDBMS like PostgreSQL, MySQL or Oracle.
• Experience with AWS basic services like EC2, S3 and RDS.
Technical Skills (Good to Have):
• Experience with AWS serverless services like Lambda, Step Functions, EMR/Glue, API
Gateway, etc.
• Advanced RDBMS knowledge: optimizing queries using query plans, indexes, and PL/SQL.
• Experience in Big Data frameworks like Spark and Hadoop.
• Experience working with CI/CD systems, preferably AWS CodePipeline and CodeBuild.
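The serverless items above can be sketched as a minimal AWS Lambda handler for an API Gateway proxy integration, invoked locally with a stubbed event as one would in a unit test; the event fields shown are a trimmed version of the proxy event format:

```python
import json

def lambda_handler(event, context):
    """Minimal Lambda handler behind API Gateway (proxy integration).

    Reads an optional `name` query-string parameter and returns a JSON body
    in the statusCode/headers/body shape the proxy integration expects.
    """
    name = (event.get("queryStringParameters") or {}).get("name", "world")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"hello, {name}"}),
    }

# Local invocation with a stubbed event; in AWS, API Gateway builds this event.
resp = lambda_handler({"queryStringParameters": {"name": "HashedIn"}}, None)
print(resp["statusCode"], resp["body"])
```

Wiring this handler to a route is then an API Gateway (or Step Functions) configuration concern rather than a code change.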
Employment Type: Full Time, Permanent