AWS Tech Lead - Data Engineering
Hero FinCorp
posted 13hr ago
Flexible timing
Key Responsibilities :
- Should be able to define the data pipeline architecture and develop/review the process design
- Design data models and develop/maintain data pipelines on Databricks over the AWS cloud platform (see the sketch after this list)
- Hands-on experience with cloud-platform big data technologies such as Databricks, EC2, EMR, Redshift, S3 and Kafka
- Translate business needs into technical specifications and frameworks
- Manage development and support data engineers working on the business data marts
- Manage/conceptualize the data quality process to ensure data correctness
- Manage the data governance tool (e.g. BigID)
- Experience designing, building and maintaining data architecture and warehousing using Databricks over AWS
- Should have experience in visualization/dashboard tools such as Power BI, QlikSense and Tableau
- Design and manage the data archival policy according to business requirements
- Develop sub-marts using SQL and OLAP (window) functions, built on top of the comprehensive marts, to fulfil immediate/ad-hoc needs of business users
- Monitor the performance of ETL and mart-refresh processes, identify problem areas and open projects to fix performance bottlenecks
- Analytical mindset with a problem-solving approach
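The Databricks/AWS pipeline responsibility above can be pictured with a minimal PySpark sketch. This is only an illustration: the S3 path, column names and target table are hypothetical placeholders, not part of the role description, and it assumes a Databricks environment where Delta Lake is available.

    # Minimal PySpark sketch of one pipeline step: read raw files from S3,
    # apply light cleansing, and write a Delta table for downstream marts.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("loan-txn-pipeline").getOrCreate()

    raw = (spark.read
           .option("header", "true")
           .csv("s3://example-bucket/raw/loan_transactions/"))   # hypothetical path

    cleaned = (raw
               .withColumn("txn_amount", F.col("txn_amount").cast("double"))
               .withColumn("txn_date", F.to_date("txn_date", "yyyy-MM-dd"))
               .withColumn("txn_month", F.date_format("txn_date", "yyyy-MM"))
               .dropDuplicates(["txn_id"]))

    (cleaned.write
     .format("delta")
     .mode("overwrite")
     .saveAsTable("analytics.loan_transactions"))                # hypothetical table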
Other Responsibilities (If Any) :
- Availability during month-end deck generation, which may occasionally fall on weekends/holidays.
Eligibility Criteria for the Job :
Education : B.E/B.Tech in any specialization, BCA, M.Tech in any specialization, MCA
- Age No Bar
- Work Experience
- 10+ years of experience in data engineering on Databricks platforms over AWS/Azure
- Exposure to working on BFSI-domain / big data warehouse projects
- Exposure to managing multiple sources of information, both structured and unstructured data
- Manage a data lake environment supporting point-in-time analysis (SCD Type 2; see the sketch after this list), multiple refreshes during the day, event-based refresh and near-real-time refresh
- Should have exposure to managing environments with real-time dashboard and data mart requirements
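The SCD Type 2 / point-in-time requirement above can be illustrated with a small Delta Lake sketch on Databricks. The staging table, dimension table, keys and columns below are hypothetical, and a real pipeline would also handle change detection and late-arriving records; this is a simplified two-step version, not the company's actual process.

    # Illustrative SCD Type 2 upkeep on a Delta dimension table, simplified to
    # two steps: expire the currently active rows for keys present in the
    # updates feed, then append the incoming versions as the new current rows.
    from delta.tables import DeltaTable
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.getOrCreate()

    updates = spark.table("staging.customer_updates")             # hypothetical staging table
    dim = DeltaTable.forName(spark, "analytics.customer_dim")     # hypothetical dimension table

    # Step 1: close out the active rows for customers that have a new version.
    (dim.alias("t")
        .merge(updates.alias("s"),
               "t.customer_id = s.customer_id AND t.is_current = true")
        .whenMatchedUpdate(set={"is_current": "false",
                                "end_date": "s.effective_date"})
        .execute())

    # Step 2: append the incoming versions as the new current rows.
    (updates
        .withColumn("is_current", F.lit(True))
        .withColumn("start_date", F.col("effective_date"))
        .withColumn("end_date", F.lit(None).cast("date"))
        .write.format("delta").mode("append")
        .saveAsTable("analytics.customer_dim"))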
Primary Skill :
- Must have orchestrated pipelines using any of the cloud platforms, preferably Databricks
- Expert in Python and PySpark coding
- Expert in writing complex SQL commands using OLAP (window) functions (see the example after this list)
- Working experience in the BFSI domain
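As an example of the complex SQL / OLAP expertise listed above, here is a small window-function query run through Spark SQL; the table and column names are hypothetical, reusing the illustrative table from the earlier sketch.

    # Illustrative OLAP (window-function) query via Spark SQL: monthly spend per
    # customer, ranked within each month, with a running total per customer.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()

    monthly = spark.sql("""
        SELECT customer_id,
               txn_month,
               SUM(txn_amount) AS monthly_spend,
               RANK() OVER (PARTITION BY txn_month
                            ORDER BY SUM(txn_amount) DESC) AS spend_rank,
               SUM(SUM(txn_amount)) OVER (PARTITION BY customer_id
                                          ORDER BY txn_month) AS running_total
        FROM analytics.loan_transactions        -- hypothetical table
        GROUP BY customer_id, txn_month
    """)
    monthly.show(10)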
Technical Skills :
- Must have orchestrated at least 3 projects using any of the cloud platforms (GCP, Azure, AWS, etc.)
- Must have orchestrated at least 2 projects using Databricks over AWS/Azure
- Must have worked on a cloud PaaS/SaaS database/DWH such as AWS Redshift, BigQuery or Snowflake
- Expertise in programming languages such as Python, PySpark, Scala or SQL/PL-SQL
- Hands-on Python/Spark experience from a data engineering perspective is a must
- Experience in at least one of the major ETL tools (Talend + TAC, SSIS, Informatica) will be an added advantage
- Experience with GitHub and Jenkins CI/CD or similar DevOps tools
- Experience with ETL tooling and migrating ETL code from one technology to another will be a benefit
- Hands-on experience in Visualization/Dashboard tools (PowerBI, Tableau, Qliksense)
Management Skills :
- Ability to handle multiple tasks and projects simultaneously in an organized and timely manner
- Lead the team of engineers and manage their tasks toward achieving project goals
- Provide daily project updates and be able to identify project risks and mitigations
- Manage fortnightly updates for senior management on both sides
- Maintain an up-to-date project plan and project documentation
- Resource planning
Soft Skills :
- Good communication skills, verbal and written.
- Attention to detail.
- Positive attitude and confidence.
Employment Type: Full Time, Permanent
8-12 Yrs
Bangalore / Bengaluru, Delhi/NCR