Smartedge IT Services
Data Architect - Python/DBT (13-15 yrs)
Job Title : Data Architect.
Location : Pune, Hyderabad, Chennai, Bangalore.
Experience : 13 - 15 Years.
Job Description :
We are seeking an experienced and highly skilled Data Architect to join our dynamic team.
The ideal candidate will have strong technical expertise in data engineering, cloud technologies, and modern data management practices.
As a Data Architect, you will play a critical role in designing, developing, and managing our data infrastructure, ensuring high performance and security across our data warehouse and analytics systems.
Key Responsibilities :
- Data Warehousing & Snowflake : Design, develop, and optimize data models within Snowflake, ensuring efficient data storage, retrieval, and performance.
- Implement best practices for managing data pipelines and optimizing Snowflake performance.
- SQL & DBT : Write and optimize complex SQL queries for data analysis and reporting.
- Utilize DBT for data transformation and orchestration to create reusable and modular workflows.
- Data Modeling : Develop and implement high-quality data models that support business intelligence, reporting, and analytics, ensuring scalability and efficiency.
- Python & Shell Scripting : Use Python for automation, data manipulation, and integration tasks.
- Leverage Shell scripting for system administration and task automation in data workflows.
- Cloud Technologies (Azure) : Leverage Azure Data Components such as Azure Data Lake, Azure Synapse, and other Azure-based tools for building, managing, and scaling data solutions.
- Data Security & Cost Management : Implement data security policies, including data masking and encryption, to safeguard sensitive information.
- Optimize data warehousing costs through proper resource management and cost controls.
- Orchestration & Automation : Utilize an orchestration tool such as Airflow, Control-M, or Tidal to automate and schedule complex data workflows, ensuring timely and efficient data processing (a minimal sketch follows this list).
- CI/CD Pipeline : Collaborate with development teams to integrate data engineering pipelines into CI/CD workflows, ensuring seamless code deployment and version control.
- Change Data Capture (CDC) : Implement CDC processes to track and capture incremental changes in the data, enabling real-time data processing and analytics (see the second sketch after this list).
- Collaboration & Communication : Collaborate with cross-functional teams including data scientists, analysts, and business stakeholders.
- Communicate technical information in a clear, effective manner.
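By way of illustration, a minimal sketch of the orchestration and DBT work described above: an Airflow DAG that schedules a nightly DBT run followed by DBT tests. The DAG id, schedule, and project path are assumptions made for the example only; Control-M or Tidal would fill the same scheduling role with their own job definitions.

# Minimal Airflow sketch: run DBT models nightly, then validate them with DBT tests.
# The dag_id, schedule, and /opt/analytics_dbt project path are assumptions for illustration.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_dbt_refresh",
    start_date=datetime(2024, 1, 1),
    schedule_interval="0 2 * * *",  # 02:00 every day
    catchup=False,
) as dag:
    # Build or refresh the DBT models in the assumed project directory.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/analytics_dbt --profiles-dir /opt/analytics_dbt",
    )

    # Run DBT tests against the refreshed models before downstream consumers read them.
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/analytics_dbt --profiles-dir /opt/analytics_dbt",
    )

    dbt_run >> dbt_test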
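A second sketch, for the CDC responsibility: a simple high-water-mark MERGE into Snowflake from Python, pulling only rows changed since the last load. Table, column, and connection details are assumptions for the example; log-based CDC tools or Snowflake streams would replace this pattern in production.

# Minimal pull-based CDC sketch: MERGE only rows whose source timestamp is newer
# than the target's current high-water mark. All names and credentials are illustrative.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",
    user="my_user",
    password="***",
    warehouse="ANALYTICS_WH",
    database="ANALYTICS",
    schema="SALES",
)
try:
    conn.cursor().execute("""
        MERGE INTO orders AS tgt
        USING (
            SELECT order_id, status, amount, updated_at
            FROM raw_orders
            WHERE updated_at > (SELECT COALESCE(MAX(updated_at), '1970-01-01'::TIMESTAMP_NTZ) FROM orders)
        ) AS src
        ON tgt.order_id = src.order_id
        WHEN MATCHED THEN UPDATE SET
            status = src.status,
            amount = src.amount,
            updated_at = src.updated_at
        WHEN NOT MATCHED THEN INSERT (order_id, status, amount, updated_at)
            VALUES (src.order_id, src.status, src.amount, src.updated_at)
    """)
finally:
    conn.close()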
Required Skills :
- Expert-level Snowflake knowledge : Deep experience working with Snowflake for data warehousing, optimization, and performance tuning.
- Expert-level SQL : Extensive experience writing complex SQL queries, optimizing for performance, and building data transformation processes.
- DBT : Practical experience using DBT for managing data transformation and orchestration in the modern data stack.
- Python (Basic to Mid-level) : Experience with Python for automation, scripting, and data manipulation.
- Data Modeling : Strong background in designing and implementing effective and scalable data models for reporting and analytics.
- Shell Scripting : Proficiency in Shell scripting to automate tasks and manage system-level data processes.
- Azure Data Components : Experience working with Azure Data Lake, Azure Synapse, and other Azure-based data services.
- Data Security & Masking : Hands-on experience with data security policies, data masking techniques, and implementing secure data practices (see the sketch after this list).
- Orchestration Tools : Experience with at least one orchestration tool (Airflow, Control-M, or Tidal) to automate and schedule data workflows.
- CI/CD Pipeline : Experience in working with CI/CD pipelines to automate the deployment of data-related code and processes.
- Change Data Capture (CDC) : Experience with CDC for real-time data processing and incremental data tracking.
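To illustrate the data security and masking requirement, a minimal sketch that creates and attaches a Snowflake column masking policy from Python via snowflake-connector-python. The account details, role name, and table are assumptions for the example, not prescribed values.

# Minimal sketch: mask an email column in Snowflake for all but a privileged role.
# Connection parameters, the PII_ADMIN role, and the customers table are illustrative only.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",
    user="my_user",
    password="***",
    warehouse="ANALYTICS_WH",
    database="ANALYTICS",
    schema="CUSTOMER",
)
try:
    cur = conn.cursor()
    # Define a column-level masking policy: privileged roles see the value, others see a mask.
    cur.execute("""
        CREATE MASKING POLICY IF NOT EXISTS email_mask AS (val STRING)
        RETURNS STRING ->
            CASE WHEN CURRENT_ROLE() IN ('PII_ADMIN') THEN val
                 ELSE '*** MASKED ***'
            END
    """)
    # Attach the policy to the sensitive column.
    cur.execute("ALTER TABLE customers MODIFY COLUMN email SET MASKING POLICY email_mask")
finally:
    conn.close()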
Functional Areas: Software/Testing/Networking