
Lead II - Data Engineering

7-9 years

Chennai

1 vacancy

UST

Posted 3 months ago

Job Description

This role requires proficiency in developing data pipelines, including coding and testing for ingesting, wrangling, transforming, and joining data from various sources. The ideal candidate should be adept with ETL tools such as Informatica, Glue, Databricks, and DataProc, and have strong coding skills in Python, PySpark, and SQL. The position demands independence and proficiency across various data domains. Expertise in data warehousing solutions such as Snowflake, BigQuery, Lakehouse, and Delta Lake is essential, including the ability to calculate processing costs and address performance issues. A solid understanding of DevOps and infrastructure needs is also required.
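As a rough illustration of the ingest-wrangle-transform-join pattern the description centres on, here is a minimal sketch in plain Python (field names and data are invented for illustration; a real pipeline at this level would use PySpark, Glue, or Databricks DataFrames rather than dicts):

```python
# Hypothetical sketch of a wrangle -> join stage in an ETL pipeline.
# Field names ("order_id", "customer_id", etc.) are made up for this example.

def wrangle(rows):
    """Normalize raw records: strip whitespace, coerce amount to float."""
    for row in rows:
        yield {
            "order_id": row["order_id"].strip(),
            "customer_id": row["customer_id"].strip(),
            "amount": float(row["amount"]),
        }

def join_customers(orders, customers):
    """Inner-join orders to customers on customer_id."""
    by_id = {c["customer_id"]: c for c in customers}
    for o in orders:
        c = by_id.get(o["customer_id"])
        if c is not None:
            yield {**o, "region": c["region"]}

raw_orders = [
    {"order_id": " A1 ", "customer_id": "C1", "amount": "10.5"},
    {"order_id": "A2", "customer_id": "C9", "amount": "3.0"},  # no matching customer
]
customers = [{"customer_id": "C1", "region": "EMEA"}]

result = list(join_customers(wrangle(raw_orders), customers))
# result: [{"order_id": "A1", "customer_id": "C1", "amount": 10.5, "region": "EMEA"}]
```

The same shape (normalize each source, then join on a key, dropping or routing unmatched records) carries over directly to DataFrame APIs.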
Outcomes:
  1. Act creatively to develop pipelines/applications by selecting appropriate technical options, optimizing application development, maintenance, and performance through design patterns and reuse of proven solutions. Support the Project Manager in day-to-day project execution and account for the developmental activities of others.
  2. Interpret requirements and create optimal architecture and design solutions in accordance with specifications.
  3. Document and communicate milestones/stages for end-to-end delivery.
  4. Code to best standards, and debug and test solutions to ensure best-in-class quality.
  5. Tune code performance and align it with the appropriate infrastructure, understanding the cost implications of licenses and infrastructure.
  6. Create data schemas and models effectively.
  7. Develop and manage data storage solutions, including relational databases, NoSQL databases, Delta Lakes, and data lakes.
  8. Validate results with user representatives, integrating the overall solution.
  9. Influence and enhance customer satisfaction and employee engagement within project teams.
Measures of Outcomes:
  1. Adherence to engineering processes and standards
  2. Adherence to schedule/timelines
  3. Adherence to SLAs where applicable
  4. # of defects post delivery
  5. # of non-compliance issues
  6. Reduction in recurrence of known defects
  7. Quick turnaround of production bugs
  8. Completion of applicable technical/domain certifications
  9. Completion of all mandatory training requirements
  10. Efficiency improvements in data pipelines (e.g., reduced resource consumption, faster run times)
  11. Average time to detect, respond to, and resolve pipeline failures or data issues
  12. # of data security incidents or compliance breaches
Outputs Expected:
Code:
  1. Develop data processing code with guidance, ensuring performance and scalability requirements are met.
  2. Define coding standards, templates, and checklists.
  3. Review code for team and peers.

Documentation:
  1. Create/review templates, checklists, guidelines, and standards for design/process/development.
  2. Create/review deliverable documents, including design documents, architecture documents, infra costing, business requirements, source-target mappings, test cases, and results.

Configure:
  1. Define and govern the configuration management plan.
  2. Ensure compliance from the team.

Test:
  1. Review/create unit test cases, scenarios, and execution.
  2. Review test plans and strategies created by the testing team.
  3. Provide clarifications to the testing team.

Domain Relevance:
  1. Advise data engineers on the design and development of features and components, leveraging a deeper understanding of business needs.
  2. Learn more about the customer domain and identify opportunities to add value.
  3. Complete relevant domain certifications.

Manage Project:
  1. Support the Project Manager with project inputs.
  2. Provide inputs on project plans or sprints as needed.
  3. Manage the delivery of modules.

Manage Defects:
  1. Perform defect root cause analysis (RCA) and mitigation.
  2. Identify defect trends and implement proactive measures to improve quality.

Estimate:
  1. Create and provide input for effort and size estimation, and plan resources for projects.

Manage Knowledge:
  1. Consume and contribute to project-related documents, SharePoint, libraries, and client universities.
  2. Review reusable documents created by the team.

Release:
  1. Execute and monitor the release process.

Design:
  1. Contribute to the creation of design (HLD, LLD, SAD)/architecture for applications, business components, and data models.

Interface with Customer:
  1. Clarify requirements and provide guidance to the Development Team.
  2. Present design options to customers.
  3. Conduct product demos.
  4. Collaborate closely with customer architects to finalize designs.

Manage Team:
  1. Set FAST goals and provide feedback.
  2. Understand team members' aspirations and provide guidance and opportunities.
  3. Ensure team members are upskilled.
  4. Engage the team in projects.
  5. Proactively identify attrition risks and collaborate with BSE on retention measures.

Certifications:
  1. Obtain relevant domain and technology certifications.
Skill Examples:
  1. Proficiency in SQL, Python, or other programming languages used for data manipulation.
  2. Experience with ETL tools such as Apache Airflow, Talend, Informatica, AWS Glue, Dataproc, and Azure ADF.
  3. Hands-on experience with cloud platforms like AWS, Azure, or Google Cloud, particularly with data-related services (e.g., AWS Glue, BigQuery).
  4. Conduct tests on data pipelines and evaluate results against data quality and performance specifications.
  5. Experience in performance tuning.
  6. Experience in data warehouse design and cost improvements.
  7. Apply and optimize data models for efficient storage, retrieval, and processing of large datasets.
  8. Communicate and explain design/development aspects to customers.
  9. Estimate time and resource requirements for developing/debugging features/components.
  10. Participate in RFP responses and solutioning.
  11. Mentor team members and guide them in relevant upskilling and certification.
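Item 4 above, testing pipeline output against data-quality specifications, can be sketched as a handful of assertion-style checks. This is a minimal, hypothetical example with invented column names and rules; real projects would typically codify such checks in a framework such as Great Expectations or dbt tests:

```python
# Minimal, hypothetical data-quality checks on a pipeline's output rows.
# Column names ("id", "amount") and rules are invented for illustration.

def check_quality(rows):
    """Return a list of human-readable violations (empty list = pass)."""
    violations = []
    seen_ids = set()
    for i, row in enumerate(rows):
        if row.get("id") is None:
            violations.append(f"row {i}: missing id")                 # completeness
        elif row["id"] in seen_ids:
            violations.append(f"row {i}: duplicate id {row['id']}")   # uniqueness
        else:
            seen_ids.add(row["id"])
        if not (0 <= row.get("amount", -1)):
            violations.append(f"row {i}: negative or missing amount")  # validity
    return violations

good = [{"id": 1, "amount": 5.0}, {"id": 2, "amount": 0.0}]
bad = [{"id": 1, "amount": 5.0}, {"id": 1, "amount": -2.0}]

assert check_quality(good) == []
assert len(check_quality(bad)) == 2  # duplicate id and negative amount
```

Returning violations rather than raising on the first failure lets a pipeline report all quality issues in one run, which is what most quality gates need.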
Knowledge Examples:
  1. Knowledge of various ETL services used by cloud providers, including Apache PySpark, AWS Glue, GCP DataProc/Dataflow, Azure ADF, and ADLF.
  2. Proficient in SQL for analytics and windowing functions.
  3. Understanding of data schemas and models.
  4. Familiarity with domain-related data.
  5. Knowledge of data warehouse optimization techniques.
  6. Understanding of data security concepts.
  7. Awareness of patterns, frameworks, and automation practices.
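The windowing functions mentioned in item 2 can be illustrated with a small self-contained example. This sketch uses `sqlite3` from the Python standard library only so it runs anywhere; the table, columns, and data are invented, and the same `RANK() OVER (...)` syntax applies in warehouse engines like Snowflake or BigQuery:

```python
import sqlite3

# Rank each employee's salary within their department using a window
# function (requires SQLite 3.25+, bundled with modern Python builds).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE salaries (dept TEXT, name TEXT, salary INTEGER)")
conn.executemany(
    "INSERT INTO salaries VALUES (?, ?, ?)",
    [("eng", "asha", 90), ("eng", "ravi", 120), ("ops", "meera", 80)],
)
rows = conn.execute(
    """
    SELECT dept, name,
           RANK() OVER (PARTITION BY dept ORDER BY salary DESC) AS rnk
    FROM salaries
    ORDER BY dept, rnk
    """
).fetchall()
# rows: [('eng', 'ravi', 1), ('eng', 'asha', 2), ('ops', 'meera', 1)]
```

`PARTITION BY` restarts the ranking per department, which is the defining difference between window functions and plain `GROUP BY` aggregation.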
Job Summary:
As a Senior Data Engineer, you will be responsible for designing, building, and maintaining scalable data pipelines on an AWS cloud platform to support our data-driven initiatives. Your expertise in ETL data ingestion frameworks/tools will play a critical role in ensuring efficient data processing and integration.

Accountabilities:
  1. Create and maintain the data ingestion pipelines, models, and architectures required to support a growing Data Marketing business.
  2. Work with Product Management, business partners, and Data Science team members to understand their needs and create solutions that meet them.
  3. Work with Quality Engineers to validate that solutions meet requirements.
  4. Implement automation processes as opportunities present themselves.

Basic Qualifications:
  1. Familiarity with data pipeline management frameworks on cloud (AWS preferred; Azure, Google): You should have a strong understanding of the data pipeline management frameworks offered by major cloud providers like AWS, Azure, and Google. Your expertise with these platforms will enable you to design and implement robust data pipelines to extract, transform, and load data from various sources.
  2. Familiarity with ETL data ingestion frameworks/tools: You should be well versed in ETL (Extract, Transform, Load) data ingestion frameworks/tools, such as Azure Data Factory, Google Data Fusion, and SSIS. Your knowledge of these tools will facilitate seamless data integration and ensure data quality throughout the pipeline.
  3. Hands-on experience with Python: Proficiency in Python is essential for this role. You should have hands-on experience using Python for data processing scripts, data manipulation and transformation tasks, and implementing data engineering solutions.
  4. Knowledge of source control and Scrum Agile software development methodologies: A strong foundation in source control practices, such as Bitbucket, is required. You should also be familiar with Scrum Agile methodologies to collaborate effectively with cross-functional teams and deliver high-quality data engineering solutions.
  5. Familiarity with the AWS ecosystem: A deep understanding of the AWS ecosystem, including training jobs, processing jobs, and SageMaker, will be a significant advantage. This knowledge will allow you to leverage AWS services efficiently and optimize data workflows.

Preferred Qualifications:
  1. Experience with large-data solutions is highly desirable.
  2. Excellent verbal, written, and interpersonal communication skills.
  3. Experience with Scikit-learn, PyTorch, and Hugging Face, and with building transformer and sentence-transformer models: Your expertise with these popular machine learning libraries will be critical for developing and deploying transformer and sentence transformer models. Experience building and fine-tuning such models will further enhance your role as a Senior Data Engineer.

Employment Type: Full Time, Permanent


What people at UST are saying

3.7 rating, based on 13 Data Engineer reviews

Likes: Great work culture; Job security - Good; +3 more
Dislikes: Learning curve is less

Data Engineer salary at UST: ₹4 L/yr - ₹16 L/yr, reported by 127 employees with 2-6 years' experience; 16% less than the average Data Engineer salary in India.

What UST employees are saying about work life (based on 4.5k employees):
  • Flexible timing: 77%
  • Monday to Friday: 89%
  • No travel: 69%
  • Day Shift: 90%

UST Benefits

Work From Home, Health Insurance, Cafeteria, Soft Skill Training, Team Outings, Job Training, +6 more

Compare UST with

Accenture (3.8), Wipro (3.7), Cognizant (3.7), Capgemini (3.7), Genpact (3.8), IBM (4.0), DXC Technology (3.7), Sutherland Global Services (3.6), Optum Global Solutions (4.0), FIS (3.9), Virtusa Consulting Services (3.8), CGI Group (4.0), GlobalLogic (3.6), Bosch Global Software Technologies (3.9), Eviden (3.6), Atos (3.8), Nagarro (4.0), NTT Data (3.8), Hewlett Packard Enterprise (4.2), Publicis Sapient (3.5)

Similar Jobs for you

  • Data Engineer at UST: Pune, 5-7 Yrs, ₹7-9 LPA
  • Data Engineer at UST: Thiruvananthapuram, 5-7 Yrs, ₹7-9 LPA
  • Data Engineer at Swathi Business Solutions: Remote, 8-10 Yrs, ₹6-10 LPA
  • Software Engineer at UST: Thiruvananthapuram, 7-9 Yrs, ₹9-11 LPA
  • Lead at Tayana Software Solutions Pvt Ltd: Bangalore / Bengaluru, 6-8 Yrs, ₹10-15 LPA
  • Senior Data Analyst at Exponentia Team: Ahmedabad, 7-12 Yrs, ₹5-10 LPA
  • SQL Database Developer at Nuware Systems Pvt Ltd: Bangalore / Bengaluru, 4-9 Yrs, ₹7-11 LPA
  • Senior Technical Consultant at o9 SOLUTIONS, INC.: Bangalore / Bengaluru, 1-6 Yrs, ₹7-11 LPA

Lead II - Data Engineering: 7-9 Yrs, Chennai, posted 3 months ago via naukri.com

  • L1 SOC Analyst - Splunk: 2-5 Yrs, Thiruvananthapuram, 4d ago via naukri.com
  • Lead I - Business Analysis: 5-7 Yrs, Hyderabad / Secunderabad, 4d ago via naukri.com
  • Lead II - Enterprise Solutions: 7-9 Yrs, Bangalore / Bengaluru, 4d ago via naukri.com
  • Developer III - Software Engineering - Dot Net Fullstack Developer: 3-5 Yrs, Thiruvananthapuram, 4d ago via naukri.com
  • Lead I - Enterprise Solutions: 5-7 Yrs, Bangalore / Bengaluru, 4d ago via naukri.com
  • Lead I - Production Support: 5-7 Yrs, Pune, 4d ago via naukri.com
  • Lead I - Software Testing - Automation QA: 5-7 Yrs, Bangalore / Bengaluru, 4d ago via naukri.com
  • Lead II - Cloud Infrastructure Services: 7-9 Yrs, Bangalore / Bengaluru, 4d ago via naukri.com
  • Lead II - Software Engineering: 7-9 Yrs, Hyderabad / Secunderabad, 4d ago via naukri.com