7-9 years
Delhi NCR, Bangalore / Bengaluru
Hero FinCorp - Data Architect - Python/Spark (7-9 yrs)
Hero FinCorp
Flexible timing
Key Responsibilities:
- Architectural Leadership: Design and implement scalable data architectures using Databricks, ensuring they meet the specific needs of financial services such as risk management, compliance, and customer analytics.
- Conduct assessments of existing data architectures, identify areas for improvement, and recommend remediation plans to optimize data management processes and systems.
- Develop data migration strategies and oversee the execution of data migration activities, ensuring data integrity and minimal disruption to client operations.
- Data Pipeline Development: Build and optimize data ingestion and processing pipelines to facilitate real-time analytics and reporting (a brief sketch follows this list).
- Collaboration with Stakeholders: Work closely with business units to understand their data requirements and translate them into technical specifications that align with business goals.
- Compliance and Governance: Ensure data solutions adhere to financial regulations and best practices in data governance, including data security and privacy standards.
- Performance Optimization: Monitor and tune the performance of data systems to ensure efficiency and reliability, addressing any bottlenecks or issues that arise.
- Mentorship and Training: Provide guidance to junior data engineers and analysts, sharing best practices and promoting a culture of continuous improvement.
Other Responsibilities: Availability during month-end deck generation, which may sometimes fall on weekends or holidays.
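To make the pipeline-development responsibility above concrete, here is a minimal PySpark Structured Streaming sketch of the kind of ingestion work involved. The source path, schema, and table locations are hypothetical, and the delta format assumes a Databricks (or Delta Lake-enabled) runtime; this is an illustration, not part of the role's actual codebase.

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("ingest-sketch").getOrCreate()

    # Hypothetical source: JSON events landing in cloud object storage.
    events = (spark.readStream
              .format("json")
              .schema("event_id STRING, account_id STRING, amount DOUBLE, ts TIMESTAMP")
              .load("s3://example-bucket/landing/events/"))

    # Light transformation before the data is exposed for analytics and reporting.
    curated = events.withColumn("ingest_date", F.to_date("ts"))

    # Append into a Delta table; the checkpoint makes the stream restartable.
    query = (curated.writeStream
             .format("delta")
             .option("checkpointLocation", "s3://example-bucket/checkpoints/events/")
             .outputMode("append")
             .start("s3://example-bucket/curated/events/"))
    # query.awaitTermination()  # block here in a real job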
Eligibility Criteria for the Job
Education: B.E./B.Tech in any specialization, BCA, M.Tech in any specialization, MCA
Age: No bar
Work Experience:
- Proven experience of at least 7 years as a Data Architect, Data Solutions Architect, or similar role in a consulting environment.
- Strong hands-on experience with cloud services such as ADF/Lambda and ADLS/S3, including security, monitoring, governance, and compliance.
- Hands-on experience designing and building Databricks-based solutions on any cloud platform, as well as designing and building solutions powered by dbt models integrated with Databricks.
- Must have strong end-to-end solution design skills on cloud platforms.
- Knowledge of data engineering concepts and related cloud services.
- Experience with Python and Spark, and with establishing development best practices (illustrated after this list).
- Knowledge of Docker and Kubernetes.
- Experience with claims-based authentication (SAML/OAuth/OIDC), MFA, RBAC, SSO, etc.
- Knowledge of cloud security controls including tenant isolation, encryption at rest, encryption in transit, key management, vulnerability assessments, application firewalls, SIEM, etc.
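As an illustration of the Python/Spark development best practices mentioned above, a small sketch that keeps transformation logic in a pure, unit-testable function; all table names and paths are hypothetical.

    from pyspark.sql import SparkSession, DataFrame
    from pyspark.sql import functions as F

    def clean_transactions(df: DataFrame) -> DataFrame:
        """Deduplicate and drop rows without an account id (pure function, easy to unit-test)."""
        return (df.dropDuplicates(["txn_id"])
                  .filter(F.col("account_id").isNotNull()))

    def main() -> None:
        spark = SparkSession.builder.appName("txn-clean").getOrCreate()
        raw = spark.read.parquet("s3://example-bucket/raw/transactions/")  # hypothetical path
        (clean_transactions(raw)
            .write.mode("overwrite")
            .parquet("s3://example-bucket/curated/transactions/"))

    if __name__ == "__main__":
        main()

Keeping the transformation separate from I/O means it can be exercised in tests with small in-memory DataFrames, without touching cloud storage.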
Primary Skills:
- Effectively communicate complex technical concepts and recommendations to non-technical stakeholders, including C-level executives and business leaders, to facilitate decision-making and secure buy-in.
- Lead the design, implementation, and optimization of our Databricks platform.
- Work closely with our data engineering team to ensure that our Databricks platform is optimized for performance, scalability, and reliability.
- Develop and maintain a comprehensive understanding of our data pipeline and data architecture.
- Develop and maintain documentation for our Databricks platform, including architecture diagrams, deployment guides, and operational procedures.
- Experience in a Platform architecture role using service and hosting solutions such as private/public cloud IaaS, PaaS, and SaaS platforms.
- Provide guidance and support to our data engineering team on Databricks-related issues.
- Maintain close awareness of new and emerging technologies and their potential application to service offerings and products.
Technical Skills:
- Must have orchestrated at least 3 projects on one of the major cloud platforms (GCP, Azure, AWS, etc.).
- Must have orchestrated at least 2 projects using Databricks on AWS or Azure.
- Must have worked with a cloud PaaS/SaaS database/DWH such as AWS Redshift, BigQuery, or Snowflake.
- Expertise in programming languages such as Python, PySpark, Scala, or SQL/PL-SQL.
- Lead the design and implementation of data artefacts, including data modelling, data integration, data storage, and data governance based on industry best practices and client-specific requirements.
- Hands-on experience with data modelling techniques to create enterprise data models.
- Experience implementing data management concepts (metadata management, data quality, data testing); a brief sketch follows this list.
- Hands-on Python/Spark experience from a data engineering perspective is a must.
- Experience with at least one of the major ETL tools (Talend with TAC, SSIS, Informatica) is an added advantage.
- Experience with GitHub and Jenkins CI/CD, or similar DevOps tools.
- Experience with ETL tooling and migrating ETL code from one technology to another is a benefit.
- Hands-on experience with visualization/dashboard tools (Power BI, Tableau, Qlik Sense).
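For the data-quality bullet above, a minimal sketch of the kind of check a data architect might standardize across pipelines; the metric names and key column are illustrative, not part of the posting.

    from pyspark.sql import DataFrame
    from pyspark.sql import functions as F

    def quality_report(df: DataFrame, key: str) -> dict:
        """Return basic data-quality metrics for a table keyed on `key`."""
        total = df.count()
        return {
            "row_count": total,
            "null_keys": df.filter(F.col(key).isNull()).count(),
            "duplicate_keys": total - df.dropDuplicates([key]).count(),
        }

    # Example usage (hypothetical table):
    # report = quality_report(spark.table("curated.transactions"), key="txn_id")
    # assert report["null_keys"] == 0, "primary key must not be null"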
Functional Areas: Software/Testing/Networking