Job Title: Data Engineer (ETL, Big Data, Hadoop, Spark, GCP)
Corporate Title: AVP
Location: Pune, Magarpatta
Role Description
The Senior Engineer is responsible for developing and delivering elements of engineering solutions to accomplish business goals. Awareness of the Bank's important engineering principles is expected. The role calls for strong root cause analysis skills, developed through addressing enhancements and fixes to products, and for building reliability and resiliency into solutions through early testing, peer reviews and automation of the delivery lifecycle.
The successful candidate should be able to work independently on medium to large sized projects with strict deadlines, work in a cross-application, mixed-technology environment, and demonstrate a solid hands-on development track record within an agile methodology. The role demands working alongside a geographically dispersed team.
The position is part of the build-out of the AFC Tech internal development team in India. The overall team will primarily deliver improvements in anti-financial crime capabilities that are major components of the regulatory portfolio, addressing various regulatory commitments to mandated monitors.
What we'll offer you
As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
Best in class leave policy
Gender neutral parental leaves
100% reimbursement under childcare assistance benefit (gender neutral)
Sponsorship for industry-relevant certifications and education
Employee Assistance Program for you and your family members
Comprehensive Hospitalization Insurance for you and your dependents
Accident and Term life Insurance
Complimentary health screening for those aged 35 years and above
Your key responsibilities
Analysing data sets, designing and coding stable and scalable data ingestion workflows, and integrating them into existing workflows.
Working with team members and stakeholders to clarify requirements and provide the appropriate ETL solution.
Work as a senior developer developing analytics algorithms on top of ingested data.
Work as a senior developer on various data-sourcing efforts in Hadoop and on GCP.
Own unit testing, UAT deployment, end-user sign-off and production go-live.
Ensuring new code is tested, both at unit level and at system level.
Design, develop and peer review new code/functionality.
Operate as a team member of an Agile Scrum team.
Your Skills and Experience
More than 8 years of coding experience in reputed organizations.
Hands-on experience with BitBucket and Jenkins.
Proficient in Hadoop, Python, Spark, SQL, Unix, Pentaho, Hive.
Basic understanding of on-premises, edge and GCP data security.
Hands-on development experience with large ETL/Big Data systems; GCP experience is a big plus.
Hands-on experience with Cloud Build, Artifact Registry, Cloud DNS, Cloud Load Balancing, etc.
Hands-on experience with Dataflow, Cloud Composer, Cloud Storage, Dataproc, etc.
Basic understanding of data quality dimensions such as consistency, completeness, accuracy and lineage.
Hands-on business and systems knowledge gained in a regulatory delivery environment.
Banking experience, regulatory and cross-product knowledge.
Passionate about test-driven development.
Data visualisation experience in Tableau.
How we'll support you
Training and development to help you excel in your career
Coaching and support from experts in your team
A culture of continuous learning to aid progression
A range of flexible benefits that you can tailor to suit your needs
About us and our teams
Please visit our company website for further information:
https://www.db.com/company/company.htm
We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively.
Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group.
We welcome applications from all people and promote a positive, fair and inclusive work environment.