Overview

The primary focus of this role is to lead data architecture activities for critical projects. The role is responsible for architecting, designing, and implementing Advanced Analytics capabilities and providing platform oversight within Azure Data Lake, Databricks, and related ETL technologies.
The role will satisfy project requirements while adhering to enterprise architecture standards. It will translate business requirements into technical specifications, including data streams, integrations, transformations, databases, and data warehouses, and implement cutting-edge solutions by building semantic and virtual data layers to provide faster, federated query execution.
Responsibilities

ACCOUNTABILITIES
- Lead data architecture for critical data and analytics projects
- Drive and deliver data architecture and solution architecture deliverables such as conceptual, logical, and physical architecture
- Partner with Enterprise Architecture (Data & Analytics) and ensure the usage of standard patterns
- Partner with project leads, IT leads, Security, and Enterprise Architecture team members in architecting end-to-end solutions
- Gain architecture alignment and sign-off, and guide the project team during implementation

Qualifications

MANDATORY TECHNICAL SKILLS

- 10+ years of experience with Teradata, the Hadoop ecosystem (e.g., Hive, Spark, Kafka, HBase), and Azure cloud technologies
- 3 to 5 years of hands-on experience architecting, designing, and implementing data ingestion pipelines for batch, real-time, and streaming workloads on the Azure cloud platform at scale
- 1 to 3 years of experience using ETL tools such as Informatica, ADF, or similar, especially with large volumes of data
- 1 to 3 years of hands-on experience with Databricks
- 3 to 5 years of working experience with Azure cloud technologies such as Spark, IoT, Synapse, Cosmos DB, Log Analytics, ADF, ADLS, Blob Storage, etc.
- 1 to 2 years of experience with distributed query tools such as Presto or similar
- 1 to 2 years of experience with data virtualization tools such as Denodo or similar
- 1 to 3 years of experience evaluating emerging technologies
- 1 to 3 years of experience with Python/PySpark/Scala for building data processing applications
- Experience extracting, querying, and joining large data sets at scale

MANDATORY NON-TECHNICAL SKILLS

- Highly analytical, motivated, decisive thought leader with solid critical thinking ability to quickly connect technical and business dots
- Strong communication and organizational skills; able to deal with ambiguity while juggling multiple priorities and projects at the same time

DIFFERENTIATING COMPETENCIES

- Experience in data wrangling, advanced analytic modeling, and AI/ML capabilities is preferred
- Finance functional domain expertise

Employment Type: Full Time, Permanent