APPIT Software - AWS Data & Solution Engineer (10-12 yrs)

Company : APPIT Software Solutions

Experience : 10-12 years

Location : Chennai

Posted : 2 months ago

Job Description

Company Description :

We are APPIT Software Inc., a technology consulting firm specializing in Oracle solutions. Our focus is on delivering IT and engineering solutions to help customers optimize their business systems and protect their IT investments. With a global presence and a development center in India, we offer Consultancy, Implementation, Upgradation, Customization, and Support services.

Role Description :

This is a full-time on-site role as an AWS Data & Solutions Engineer located in Chennai. As part of the team, you will be responsible for designing, implementing, and maintaining AWS data solutions, ensuring scalability, security, and performance.

Your tasks will involve collaborating with cross-functional teams to develop data architecture, ETL processes, and data pipelines to support business requirements.

Title : Sr. AWS Data & Solutions Engineer.

Locations : India.

Type : Full-time.

Experience : 10 to 12 years.

Functions : Consulting, Finance, Information Technology, Data Engineering & Analytics.

Industries : Capital Markets, Investment Banking, Alternative Investments, Financial Services, Management Consulting, Information Technology and Services, Business Travel, Healthcare.

Job Description :


We are looking for a Sr. AWS Data & Solutions Engineer with primary skills in Python and PySpark development who can design and build solutions for one of our Fortune 500 client programs, which aims to build an Enterprise Data Lake on the AWS Cloud platform and data pipelines by developing several AWS data integration, engineering, and analytics resources.

There is no requirement for machine learning skills. This is a high-visibility, fast-paced key initiative that will integrate data across internal and external sources, provide analytical insights, and integrate with the customer's critical systems.

Key Responsibilities :

- Design, build, and unit test applications on the Spark framework in Python.

- Build Python- and PySpark-based applications that work with data in relational databases (e.g., Oracle), NoSQL databases (e.g., DynamoDB, MongoDB), and filesystems (e.g., S3, HDFS).

- Build AWS Lambda functions on the Python runtime, leveraging the pandas, json, boto3, requests, and avro libraries (a minimal Lambda sketch appears after this list).

- Build PySpark-based data pipeline jobs on AWS Glue ETL, requiring in-depth knowledge of AWS Glue DynamicFrames and connection options (see the Glue ETL sketch after this list).

- Build Python-based event-driven integration with Kafka topics, leveraging Confluent Kafka libraries (see the Kafka sketch after this list).

- Design and build generic, reusable utility applications in Python.

- Build Python programs that are shared across Glue ETL jobs and Lambda functions.

- Optimize performance for data access requirements by choosing the appropriate native Hadoop file formats (Avro, Parquet, ORC, etc.) and compression codecs (see the file-format sketch after this list).

- Design and build S3 buckets, storage tiers, and lifecycle policies as the strategic storage layer for the data lake.

- Optimize performance of Spark applications in Hadoop using configurations around SparkContext, Spark SQL, DataFrames, and pair RDDs.

- Set up Glue crawlers to catalog Oracle DB tables, MongoDB collections, and S3 objects (see the crawler and Athena sketch after this list).

- Configure Athena tables and SQL views based on Glue-cataloged datasets.

- Monitor, troubleshoot, and debug failures using AWS CloudWatch and Datadog.

- Solve complex data-driven scenarios and triage defects and production issues.

- Participate in code releases and production deployments.

- Create documentation for user adoption, deployments, and runbooks, and support client users with enablement or any issues encountered.

- Perform code reviews with the team and enable them to develop code for complex scenarios.

- Participate in the agile development process, and document and communicate issues and bugs relative to data standards in scrum meetings.

- Work collaboratively with onsite and offshore teams.

- Voice opinions across multiple teams, driving the entire initiative with strong leadership.
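
To make the Lambda responsibility above concrete, here is a minimal sketch of a Python-runtime Lambda handler using boto3, json, and pandas. The bucket names, object key layout, and event shape (an S3 put notification) are illustrative assumptions, not details from this posting.

```python
import json

import boto3
import pandas as pd

s3 = boto3.client("s3")


def handler(event, context):
    """Read a CSV object referenced by an S3 event, summarize it with pandas,
    and write the summary to a hypothetical curated bucket."""
    record = event["Records"][0]["s3"]
    bucket = record["bucket"]["name"]
    key = record["object"]["key"]

    # Fetch the raw object and load it into a DataFrame.
    obj = s3.get_object(Bucket=bucket, Key=key)
    df = pd.read_csv(obj["Body"])

    # Persist a simple profile of the data alongside the raw zone.
    summary = df.describe(include="all").to_json()
    s3.put_object(
        Bucket="curated-data-lake-bucket",  # hypothetical target bucket
        Key=f"summaries/{key}.json",
        Body=summary.encode("utf-8"),
    )
    return {"statusCode": 200, "body": json.dumps({"rows": len(df)})}
```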
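
The Glue ETL bullet can be illustrated with a small PySpark Glue job built around DynamicFrames; the catalog database, table name, column mappings, and S3 output path are hypothetical placeholders.

```python
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.transforms import ApplyMapping
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

# Resolve standard Glue job arguments (JOB_NAME is supplied by Glue at run time).
args = getResolvedOptions(sys.argv, ["JOB_NAME"])

sc = SparkContext()
glue_context = GlueContext(sc)
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read a table from the Glue Data Catalog into a DynamicFrame
# ("raw_db" / "orders" are hypothetical catalog names).
source = glue_context.create_dynamic_frame.from_catalog(
    database="raw_db",
    table_name="orders",
)

# Rename/retype columns with ApplyMapping, then write Parquet to S3.
mapped = ApplyMapping.apply(
    frame=source,
    mappings=[
        ("order_id", "string", "order_id", "string"),
        ("amount", "string", "amount", "double"),
    ],
)

glue_context.write_dynamic_frame.from_options(
    frame=mapped,
    connection_type="s3",
    connection_options={"path": "s3://curated-data-lake-bucket/orders/"},
    format="parquet",
)

job.commit()
```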
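
For the Kafka integration bullet, a minimal sketch using the confluent-kafka Python client might look like the following; the broker address, topic name, and consumer group are assumed placeholders.

```python
import json

from confluent_kafka import Consumer, Producer

# Broker address is a hypothetical placeholder.
conf = {"bootstrap.servers": "broker:9092"}

producer = Producer(conf)


def delivery_report(err, msg):
    """Log delivery success or failure for each produced message."""
    if err is not None:
        print(f"Delivery failed: {err}")
    else:
        print(f"Delivered to {msg.topic()} [{msg.partition()}]")


# Produce a JSON-encoded event to a hypothetical topic.
event = {"order_id": "A-100", "amount": 42.5}
producer.produce(
    "orders-events",
    value=json.dumps(event).encode("utf-8"),
    callback=delivery_report,
)
producer.flush()

# Consume from the same topic.
consumer = Consumer(
    {**conf, "group.id": "data-lake-ingest", "auto.offset.reset": "earliest"}
)
consumer.subscribe(["orders-events"])
msg = consumer.poll(timeout=10.0)
if msg is not None and msg.error() is None:
    print(json.loads(msg.value()))
consumer.close()
```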
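
The file-format and Spark-tuning bullets could be exercised with a short PySpark snippet along these lines, which writes Parquet with a chosen compression codec under a couple of illustrative configurations; the S3 paths and partition column are assumptions.

```python
from pyspark.sql import SparkSession

# Spark session tuned with a few common configurations; values are illustrative.
spark = (
    SparkSession.builder
    .appName("file-format-demo")
    .config("spark.sql.shuffle.partitions", "200")
    .config("spark.sql.parquet.compression.codec", "snappy")
    .getOrCreate()
)

# Read raw JSON, then persist it as columnar Parquet partitioned by date
# (paths and the partition column are hypothetical).
raw = spark.read.json("s3://raw-data-lake-bucket/orders/")
(
    raw.write
    .mode("overwrite")
    .partitionBy("order_date")
    .parquet("s3://curated-data-lake-bucket/orders_parquet/")
)

# The same data could be written as Avro or ORC instead, e.g.:
# raw.write.format("avro").save(...)   # requires the spark-avro package
# raw.write.orc(...)
```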
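
For the Glue crawler and Athena bullets, a boto3 sketch like the one below would catalog an S3 prefix and then query the resulting table; the crawler name, IAM role ARN, database, table, and query output location are hypothetical.

```python
import boto3

glue = boto3.client("glue")
athena = boto3.client("athena")

# Create a crawler that catalogs objects under an S3 prefix.
glue.create_crawler(
    Name="orders-s3-crawler",
    Role="arn:aws:iam::123456789012:role/GlueCrawlerRole",  # hypothetical role
    DatabaseName="curated_db",
    Targets={"S3Targets": [{"Path": "s3://curated-data-lake-bucket/orders_parquet/"}]},
)
glue.start_crawler(Name="orders-s3-crawler")

# In practice you would wait for the crawler run to finish before querying.
# Once the table is cataloged, query it from Athena.
athena.start_query_execution(
    QueryString="SELECT order_id, amount FROM curated_db.orders_parquet LIMIT 10",
    ResultConfiguration={"OutputLocation": "s3://athena-query-results-bucket/"},
)
```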

Qualifications :

- Bachelor's degree or equivalent in computer science or a related field, and a minimum of 10+ years of experience.

- AWS certification in one of: Solutions Architect, Data Engineer, or Data Analytics Specialty.

- 3+ years of hands-on experience in Python and PySpark programming.

- 2+ years of hands-on experience with AWS S3, Glue ETL & Catalog, Lambda functions, Athena, and Kafka.

- 1+ years of hands-on experience with Confluent Kafka integration.

- Hands-on experience working with different file formats, i.e., Avro, Parquet, ORC, JSON, and XML.

- Hands-on experience with the Python pandas, requests, and boto3 modules.

- Hands-on experience writing complex SQL queries.

- Hands-on experience using REST APIs.

- Financial Services industry experience.

- Preferred: expertise in Snowflake, AWS Redshift, and DynamoDB.

- Ability to use AWS services, predict application issues, and design proactive resolutions.

- Required to be part of production rollouts for successful implementation of workflows and Collibra products.

- Technical coordination skills to drive requirements and technical design with multiple teams.

- Aptitude to help build the skillset within the organization.


Functional Areas: Other

What APPIT Software Solutions employees say about work life (based on 21 employees) : 53% flexible timing, 93% Monday to Friday, 78% no travel, 100% day shift.

APPIT Software Solutions Benefits : Soft Skill Training, Job Training, Free Transport, Work From Home, Team Outings, Child Care, and more.

