
Data Engineer For Databricks

USEReady

7-12 years

Mohali, Gurgaon / Gurugram, Bangalore / Bengaluru

Posted 13 hours ago

Job Description

Overview and Position Summary

NYU Langone Health is an academic medical center located in New York City, New York, United States. The health system consists of the NYU Grossman School of Medicine and NYU Grossman Long Island School of Medicine, both part of New York University (NYU), and more than 300 locations throughout New York City and other parts of the United States, including six inpatient facilities: Tisch Hospital; Kimmel Pavilion; NYU Langone Orthopedic Hospital; Hassenfeld Children's Hospital; NYU Langone Hospital – Brooklyn; and NYU Langone Hospital – Long Island. NYU Langone Health is one of the largest healthcare systems in the Northeast, with more than 49,000 employees.

 

The Enterprise Data and Analytics (EDA) department at NYU Langone Health leverages data to enhance decision-making, optimize operations, and improve patient outcomes. As an Azure Databricks DevOps Administrator within this department, you will be responsible for managing and maintaining the Databricks platform, ensuring optimal performance and security across workspaces, clusters, and jobs. The role also oversees user administration, manages Azure storage solutions, and implements CI/CD pipelines using Azure DevOps. Drawing on a deep understanding of Databricks architecture and proficiency in scripting languages, the administrator will automate tasks and improve the efficiency of data operations. Strong communication skills and a commitment to high-quality support will enable effective collaboration with cross-functional teams, directly contributing to the department's mission of delivering robust data solutions and insights.
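
As an illustration of the scripting-based administration described above, here is a minimal sketch, assuming the Databricks SDK for Python and a workspace reachable through the usual DATABRICKS_HOST / DATABRICKS_TOKEN configuration. The SDK choice and the reporting logic are illustrative additions, not requirements named in the posting.

```python
# Illustrative admin-automation sketch using the Databricks SDK for Python.
# Assumes the databricks-sdk package is installed and workspace credentials
# are available via environment variables or a configured profile.
from databricks.sdk import WorkspaceClient


def report_workspace_state() -> None:
    """List clusters and jobs so an administrator can spot idle or misconfigured resources."""
    w = WorkspaceClient()  # picks up DATABRICKS_HOST / DATABRICKS_TOKEN by default

    print("Clusters:")
    for cluster in w.clusters.list():
        print(f"  {cluster.cluster_name}: state={cluster.state}, workers={cluster.num_workers}")

    print("Jobs:")
    for job in w.jobs.list():
        name = job.settings.name if job.settings else "<unnamed>"
        print(f"  {job.job_id}: {name}")


if __name__ == "__main__":
    report_workspace_state()
```

The same client also exposes permissions, cluster policies, and job management, so routine audits like this can be scheduled rather than performed by hand.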

 

Job Responsibilities

· Hands-on experience with Azure cloud services, networking concepts, security for cloud and on-premises systems, deployment using Azure DevOps, Azure cloud monitoring and cost controls, and Terraform

· CI/CD Pipeline Management: Design, implement, and manage Continuous Integration/Continuous Deployment (CI/CD) pipelines using Azure DevOps and GitHub. Ensure the pipelines are efficient, reliable, and scalable (a pipeline-trigger sketch appears after this list).

· Infrastructure as Code (IaC): Automate the provisioning and management of infrastructure, with a focus on Azure Databricks and Azure Data Factory in a private network environment, using tools like Terraform and ARM templates.

· Environment Management: Create and manage development, testing, and production environments, ensuring consistency, security, and alignment with organizational requirements.

· Security: Implement security best practices throughout the CI/CD pipeline, including secrets management, secure code scanning, and compliance with security standards.

· Monitoring & Logging: Set up and maintain monitoring and logging for applications and infrastructure using Azure Monitor, Log Analytics, and related tools to ensure system reliability and performance (a Log Analytics query sketch appears after this list).

· Automation: Identify opportunities for automation to streamline processes, reduce manual errors, and improve operational efficiency.

· Policy Enforcement: Establish and enforce policies such as branch policies, pull request reviews, and pipeline approvals to maintain code quality and compliance with organizational standards.

· Manage and maintain Azure Databricks Platform, workspaces, clusters, and jobs

· Oversee user administration including access controls and permissions

· Handle library installations, runtime management, and policy enforcement

· Implement/analyze cost control measures

· Administer Unity Catalog for data governance and security

· Collaborate with data engineers, data scientists, and analysts to optimize and streamline data workflows and analytical pipelines on the Databricks platform

· Manage Azure storage solutions, including Blob Storage and Data Lake Storage

· Administer Azure Key Vault for secure storage of secrets and keys (a Key Vault sketch appears after this list)

· Configure and manage Azure Data Factory for data integration and ETL processes

· Implement and manage VNETs, firewalls, Azure policies and security best practices

· Set up budgets and alerts to monitor and control Azure costs

· Configure alerts for proactive issue detection and resolution

· Configure and manage Databricks Lakehouse Monitoring
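
To make the CI/CD item above concrete, here is a minimal sketch that queues an Azure DevOps pipeline run from Python, for example to kick off a Databricks deployment. The organization, project, pipeline id, and AZDO_PAT environment variable are placeholders; the call targets the Azure DevOps Pipelines "runs" REST endpoint with api-version 7.1.

```python
# Queue a run of an Azure DevOps YAML pipeline via the REST API.
# ORG, PROJECT, PIPELINE_ID and the AZDO_PAT env var are placeholders.
import os

import requests

ORG = "my-org"
PROJECT = "my-project"
PIPELINE_ID = 42


def queue_pipeline_run(branch: str = "refs/heads/main") -> dict:
    """Queue a pipeline run on the given branch and return the run metadata."""
    url = (
        f"https://dev.azure.com/{ORG}/{PROJECT}/_apis/pipelines/"
        f"{PIPELINE_ID}/runs?api-version=7.1"
    )
    body = {"resources": {"repositories": {"self": {"refName": branch}}}}
    # A personal access token is passed as the password of a basic-auth pair.
    resp = requests.post(url, json=body, auth=("", os.environ["AZDO_PAT"]))
    resp.raise_for_status()
    return resp.json()


if __name__ == "__main__":
    run = queue_pipeline_run()
    print(f"Queued run {run['id']} with state {run['state']}")
```

In practice the pipeline definition itself would live in the repository as YAML; a script like this is useful for ad-hoc redeploys or for chaining pipelines from other automation.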
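
For the monitoring and logging item, the following sketch queries a Log Analytics workspace with the azure-monitor-query package, assuming Data Factory diagnostic logs are routed to the ADFPipelineRun table; the workspace id and the KQL query are placeholders.

```python
# Query Log Analytics for recently failed Data Factory pipeline runs.
# Assumes azure-identity and azure-monitor-query are installed and the
# placeholder WORKSPACE_ID is replaced with a real Log Analytics workspace id.
from datetime import timedelta

from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient, LogsQueryStatus

WORKSPACE_ID = "<log-analytics-workspace-id>"


def recent_failed_pipeline_runs(hours: int = 24) -> None:
    """Print Data Factory pipeline runs that failed within the last `hours` hours."""
    client = LogsQueryClient(DefaultAzureCredential())
    query = """
    ADFPipelineRun
    | where Status == 'Failed'
    | project TimeGenerated, PipelineName, Status
    | order by TimeGenerated desc
    """
    response = client.query_workspace(
        WORKSPACE_ID, query, timespan=timedelta(hours=hours)
    )
    if response.status == LogsQueryStatus.SUCCESS:
        for table in response.tables:
            for row in table.rows:
                print(list(row))
    else:
        print(f"Query returned status {response.status}")


if __name__ == "__main__":
    recent_failed_pipeline_runs()
```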
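
For the Key Vault item, a short sketch of reading a secret (for example, a service-principal token used by deployment pipelines) with azure-keyvault-secrets; the vault URL and secret name are placeholders.

```python
# Fetch a secret from Azure Key Vault using the azure-keyvault-secrets SDK.
# VAULT_URL and SECRET_NAME are placeholders for illustration only.
from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

VAULT_URL = "https://<vault-name>.vault.azure.net"
SECRET_NAME = "databricks-sp-token"


def get_secret(name: str = SECRET_NAME) -> str:
    """Return the current value of the named secret."""
    client = SecretClient(vault_url=VAULT_URL, credential=DefaultAzureCredential())
    return client.get_secret(name).value


if __name__ == "__main__":
    value = get_secret()
    print(f"Fetched secret '{SECRET_NAME}' ({len(value)} characters)")
```

Keeping pipeline credentials in Key Vault and resolving them at run time, rather than embedding them in pipeline definitions, is the secrets-management practice referenced in the Security item above.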

 

Minimum Qualifications

· Minimum of a Bachelor's degree in Computer Science, Information Systems, or Engineering

· Experience: 7+ years of professional experience in DevOps, with a strong focus on Azure. Knowledge of data platforms such as Databricks and Data Factory (preferred)

· Proficiency in scripting languages such as Python/PySpark, PowerShell, or Bash

· Experience in automating administrative tasks and workflows

· Knowledge of security best practices and compliance requirements

· Experience with ETL processes, data pipelines, and big data technologies

· Experience with backup and restore procedures for Databricks and Azure services

· Ability to troubleshoot and resolve issues in a timely and efficient manner

· Strong verbal and written communication skills

· Ability to document processes, procedures, and configurations clearly

· Commitment to providing high-quality support to internal/external users and stakeholders

· Ability to understand and address the needs of internal and external customers

· Team player with the ability to work collaboratively with cross-functional teams

· Flexibility to adapt to changing requirements and priorities

· Continuous Learning: Eagerness to learn new technologies and continuously improve skills to stay current in a rapidly evolving field.

Preferred Qualifications

Offshore data engineer with Databricks ML (can leverage Python skills and be trained as needed).

· Deep understanding of Databricks architecture, features, and best practices

· Experience with Delta Lake and Databricks SQL for advanced data management (a Delta Lake sketch appears after this list)

· Experience in performance tuning and optimization for both Databricks and Azure environments

· Experience supporting/administering data science and ML workloads in Databricks

· Knowledge of disaster recovery planning and implementation

· Experience integrating Databricks with other data platforms and tools

· Understanding of hybrid and multi-cloud environments

· Experience implementing data protection measures and ensuring regulatory compliance
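
As referenced in the Delta Lake item above, the following PySpark sketch shows the kind of data management the role touches: an initial Delta table load followed by an incremental MERGE upsert. Table, schema, and column names are placeholders, and the snippet assumes it runs where Spark with Delta Lake is available (for example, a Databricks cluster).

```python
# Minimal Delta Lake sketch: initial load plus an incremental MERGE upsert.
# "demo.events" is a placeholder table name and assumes the schema exists.
from pyspark.sql import SparkSession
from delta.tables import DeltaTable

# On Databricks a SparkSession already exists; getOrCreate() reuses it.
spark = SparkSession.builder.getOrCreate()

# Initial load into a Delta table.
spark.createDataFrame(
    [(1, "a"), (2, "b")], ["id", "value"]
).write.format("delta").mode("overwrite").saveAsTable("demo.events")

# Incremental upsert: update rows whose id already exists, insert the rest.
updates = spark.createDataFrame([(2, "b2"), (3, "c")], ["id", "value"])
target = DeltaTable.forName(spark, "demo.events")
(
    target.alias("t")
    .merge(updates.alias("u"), "t.id = u.id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute()
)
```

MERGE is the usual Delta pattern for idempotent incremental loads, and the resulting table can then be queried directly from Databricks SQL.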


Employment Type: Full Time, Permanent


Data Engineer salary at USEReady

₹4 L/yr – ₹8 L/yr, reported by 21 employees with 2-4 years of experience; about 41% less than the average Data Engineer salary in India.

What USEReady employees are saying about work life

Based on 88 employees: flexible timing, Monday to Friday, no travel, night shift.

USEReady Benefits

Health Insurance, Work From Home, Job Training, Team Outings, Soft Skill Training, Cafeteria, +6 more
