Hero FinCorp - AWS Developer - Data Engineering (3-6 yrs)
Flexible timing
Key Responsibilities:
- Designing and implementing highly performant data ingestion and transformation pipelines from multiple sources using Databricks and Spark (a minimal sketch follows this list)
- Building streaming and batch processes in Databricks
- Spark performance tuning and optimisation
- Providing technical guidance on complex problems and Spark DataFrames
- Developing scalable and re-usable frameworks for ingestion and transformation of large data sets
- Designing and implementing data quality systems and processes.
- Integrating the end-to-end data pipeline to take data from source systems to target data repositories, ensuring the quality and consistency of data is maintained at all times
- Working with other members of the project team to support delivery of additional project components (Reporting tools, API interfaces, Search)
- Evaluating the performance and applicability of multiple tools against customer requirements
- Working within an Agile delivery / DevOps methodology to deliver proof of concept and production implementation in iterative sprints.
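For illustration only, here is a minimal PySpark sketch of the kind of batch ingestion-and-transformation pipeline described above; the bucket path, table names, and column names are hypothetical placeholders, not actual Hero FinCorp systems.

# Minimal batch ingestion/transformation sketch (PySpark on Databricks).
# All paths, tables, and columns are hypothetical examples.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("ingestion_sketch").getOrCreate()

# Ingest raw CSV files from a (hypothetical) source bucket.
raw = (
    spark.read
    .option("header", "true")
    .option("inferSchema", "true")
    .csv("s3://example-source-bucket/loans/")
)

# Transform: normalize a column name, cast types, stamp the load date,
# and drop rows that fail a basic quality check.
clean = (
    raw.withColumnRenamed("Loan_Amount", "loan_amount")
    .withColumn("loan_amount", F.col("loan_amount").cast("double"))
    .withColumn("ingest_date", F.current_date())
    .filter(F.col("loan_amount").isNotNull())
)

# Land the result in a Delta table, the usual target format on Databricks.
clean.write.format("delta").mode("append").saveAsTable("analytics.loans_clean")

The same DataFrame logic can be reused in a Structured Streaming job for the streaming variant of the pipeline.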
Other Responsibilities (If Any): Availability during month-end deck generation, which may occasionally fall on weekends/holidays.
Eligibility Criteria for the Job
Education
- B.E/B.Tech in any specialization, BCA, M.Tech in any specialization, MCA
Age: No Bar
Work Experience:
- 3+ years of experience in data engineering on Databricks platforms over AWS/Azure
Primary Skill:
- Direct experience building data pipelines using Spark on Databricks.
- Building data integrations with Python.
- Hands-on experience designing and delivering solutions using the Azure Data Analytics platform.
- Experience building data warehouse solutions using ETL/ELT tools such as Informatica or Talend.
- Comprehensive understanding of data management best practices, including demonstrated experience with data profiling, sourcing, and cleansing routines that use typical data quality functions: standardization, transformation, rationalization, linking, and matching (a small sketch of these steps follows this list).
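As a concrete, purely hypothetical example of those data quality functions, the sketch below standardizes, deduplicates, and links customer records in PySpark; the table and column names are illustrative assumptions.

# Hypothetical data-quality sketch: standardize, deduplicate, link/match.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dq_sketch").getOrCreate()
customers = spark.table("raw.customers")   # hypothetical source table

# Standardization: trim whitespace, unify case, keep only digits in phones.
std = (
    customers
    .withColumn("name", F.upper(F.trim(F.col("name"))))
    .withColumn("phone", F.regexp_replace(F.col("phone"), r"[^0-9]", ""))
)

# Deduplication on the standardized business key.
deduped = std.dropDuplicates(["name", "phone"])

# Linking/matching: left-join against a (hypothetical) master table.
master = spark.table("mdm.customer_master")
linked = deduped.join(master, on=["name", "phone"], how="left")

# Profiling: count records that failed to match a master customer_id.
print("Unmatched records:", linked.filter(F.col("customer_id").isNull()).count())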
Technical Skills:
- Must have orchestrated at least 3 projects on a cloud platform (GCP, Azure, AWS, etc.).
- Must have orchestrated at least 2 projects using Databricks on AWS/Azure.
- Must have worked on a cloud PaaS/SaaS database/DWH such as AWS Redshift, BigQuery, or Snowflake (see the JDBC load sketch after this list).
- Expertise in programming languages such as Python, PySpark, Scala, or SQL/PL-SQL.
- Hands-on Python/Spark experience from a data engineering perspective is a must.
- Experience with at least one major ETL tool (Talend + TAC, SSIS, Informatica) will be an added advantage.
- Experience with GitHub and Jenkins CI/CD or similar DevOps tools.
- Experience with ETL tooling and migrating ETL code from one technology to another will be a benefit.
- Hands-on experience with visualization/dashboard tools (Power BI, Tableau, QlikSense).
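To illustrate the cloud DWH point above, here is a hedged PySpark sketch that loads a DataFrame into a warehouse over plain JDBC; the URL, credentials, and table names are placeholders, and production Redshift loads would more commonly go through a dedicated connector with S3 staging.

# Hypothetical sketch: write a Spark DataFrame to a cloud DWH over JDBC.
# The URL, credentials, and table name below are placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("dwh_load_sketch").getOrCreate()
df = spark.table("analytics.loans_clean")   # table from the earlier sketch

(
    df.write.format("jdbc")
    .option("url", "jdbc:redshift://example-cluster:5439/dev")  # placeholder
    .option("dbtable", "public.loans_clean")
    .option("user", "example_user")          # placeholder credential
    .option("password", "example_password")  # placeholder credential
    .option("driver", "com.amazon.redshift.jdbc42.Driver")
    .mode("append")
    .save()
)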
Management Skills:
- Ability to handle multiple tasks and projects simultaneously in an organized and timely manner.
- Provide daily project updates and be able to identify project risks and mitigations.
Soft Skills:
- Good communication skills, verbal and written.
- Attention to detail.
- Positive attitude and confidence.
Functional Areas: Software/Testing/Networking