Data Architect - Spark/Python (10-12 yrs)
Nazztec
Job Title : Data Architect
Location : Bangalore
Experience : 10+ Years
Job Description :
We are looking for an experienced Data Architect to join our team. The ideal candidate will be responsible for designing and implementing on-premise and cloud-based data architectures and delivering seamless data management solutions.
The candidate will play a crucial role in architecting MS SQL Database Management Solutions, ETL Pipelines, and Cloud Data Solutions using Azure/AWS.
Key Responsibilities :
- Provide on-premise & cloud data architectural solutions and designs to project teams.
- Design and architect MS SQL Database Management Solutions (SQL Server, SSIS for ETL, SSRS for reporting, Power BI).
- Develop architectural assessments, strategies, and roadmaps for data management.
- Design and implement data migration patterns from RDBMS to cloud data warehouses.
- Own and drive the AWS data architecture, including batch, micro-batch, and real-time data streaming.
- Apply expertise in data modeling, ELT, SSIS/ADF ETL development, Spark, Python, and advanced SQL.
- Support Agile Scrum teams in planning, scoping, and creating technical solutions.
- Collaborate with delivery teams and clients to resolve technical dependencies, issues, and risks.
- Design and configure Cloud Database & ETL solutions.
- Develop Azure/AWS Cloud Data Solutions using an API-first approach.
- Lead data model changes from on-prem RDBMS to cloud platforms.
- Deliver data platform capabilities such as data lakes, data marts, and data hubs.
- Work closely with Product Owners, Business Analysts, and cross-functional teams to develop, test, and release features.
Technical and Professional Expertise :
- Deep technical expertise in Microsoft Data Services (Cloud DB and ETL Platforms - Azure / AWS).
- Strong experience in RDBMS (MS SQL Server), ETL (SSIS, ADF), and visualization tools (Power BI, Tableau).
- Hands-on experience with Azure Data Factory (ADF) and Databricks.
- Strong knowledge of relational and NoSQL databases and data modeling (star, snowflake, dimensional modeling).
- Ability to troubleshoot and resolve complex data pipeline issues.
- Experience in documenting data architectures, data flows, and data models.
- Knowledge of data security and compliance.
- Experience in architecting and documenting at various levels (conceptual, logical, physical, data flow, and sequence diagrams).
- Ability to translate business and technical requirements into data models.
- Excellent presentation skills with the ability to engage large and small audiences.
Functional Areas: Software/Testing/Networking