Veltris - Data Architect - ETL/Azure Data Factory (15-19 yrs)
Veltris
Flexible timing
Designation: Data Architect
Notice Period: Immediate to 30 days maximum.
Location: Hyderabad/ Remote
Mandatory Skills:
- Strong understanding of data architecture and data management principles.
- In-depth knowledge of and hands-on experience with the Microsoft Azure data stack, i.e., Azure Data Factory, Azure Synapse, Azure SQL Database, Azure Functions, Logic Apps, Spark, etc.
- Strong experience with ETL/ELT pipelines, data transformation, and real-time data processing using Spark.
- Proficiency in Python and SQL, and with Spark notebooks (using Databricks).
- Strong working experience with Databricks, Unity Catalog, DLT tables, etc.
- Must have strong development and design skills in Azure Synapse databases.
- Experience in designing high-performance, distributed, and scalable database architectures.
- Experience with big data approaches, architectural concepts, and data sourcing, ingestion, curation, and storage mechanisms.
- Experience with real-time/near-real-time data warehousing concepts and design approaches.
- 8-10 years of experience in Data Modeling, Data/Information Architecture, Metadata Management, Master Data Management, Data Governance, and Data Quality.
- Experience in building and managing data lakes, warehouses, and streaming architectures.
- Experience with big data processing frameworks (Apache Spark, Hadoop, Databricks, EMR).
- Should have executed end-to-end projects focusing on Azure Synapse Delta Lake.
- Should have a thorough understanding of data lakes, Delta Lake, raw/enriched/curated layer concepts, and ETL within the Azure framework.
- Highly skilled in end-to-end data warehousing and BI (analytics) design/development.
- Create design and functional/technical specification documents or any other artifacts needed.
Good to Have:
- Strong understanding of data security best practices, encryption, and access controls.
- Experience implementing data governance frameworks, auditing, and lineage tracking.
- Familiarity with compliance requirements (GDPR, CCPA, HIPAA, SOC2).
- Experience working with healthcare data standards (FHIR, HL7, EDI transactions).
- Exposure to AI/ML data pipelines and integrating machine learning models into data workflows.
Functional Areas: Software/Testing/Networking