
Lead I - Data Engineering

5-7 years

Pune

1 vacancy

Piktorlabs

posted 13hr ago

Job Description

This role requires proficiency in data pipeline development, including coding and testing pipelines that ingest, wrangle, transform, and join data from various sources. Must be skilled in ETL tools such as Informatica, Glue, Databricks, and DataProc, with coding expertise in Python, PySpark, and SQL. Works independently and has a deep understanding of data warehousing solutions, including Snowflake, BigQuery, Lakehouse, and Delta Lake. Capable of calculating costs and understanding performance issues related to data solutions.
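As an illustration of the pipeline stages this role covers (ingest, wrangle, transform, join), here is a minimal plain-Python sketch; the data, field names, and helper functions are invented for this example, and in practice the same shape would be expressed in PySpark or SQL:

```python
# Illustrative ingest -> wrangle -> join pipeline.
# All records, fields, and function names are hypothetical examples.

def ingest(rows):
    # Ingest: accept raw records from any source (file, API, queue).
    return list(rows)

def wrangle(rows):
    # Wrangle: drop malformed records and normalise field values.
    return [
        {"id": r["id"], "city": r.get("city", "").strip().title()}
        for r in rows
        if "id" in r
    ]

def join(left, right, key):
    # Join: simple hash join on a shared key, the same operation
    # Spark or SQL performs at scale.
    index = {r[key]: r for r in right}
    return [{**l, **index[l[key]]} for l in left if l[key] in index]

orders = ingest([{"id": 1, "city": " pune "}, {"city": "missing id"}])
clean = wrangle(orders)          # malformed second record is dropped
salaries = [{"id": 1, "lpa": 9}]
result = join(clean, salaries, "id")
# result == [{"id": 1, "city": "Pune", "lpa": 9}]
```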
Outcomes:
  1. Act creatively to develop pipelines and applications by selecting appropriate technical options, optimizing application development, maintenance, and performance through design patterns and reuse of proven solutions. Interpret requirements to create optimal architecture and design, developing solutions in accordance with specifications.
  2. Document and communicate milestones/stages for end-to-end delivery.
  3. Code according to best coding standards; debug and test solutions to deliver best-in-class quality.
  4. Perform performance tuning of code and align it with the appropriate infrastructure to optimize efficiency.
  5. Validate results with user representatives, integrating the overall solution seamlessly.
  6. Develop and manage data storage solutions, including relational databases, NoSQL databases, and data lakes.
  7. Stay updated on the latest trends and best practices in data engineering, cloud technologies, and big data tools.
  8. Influence and improve customer satisfaction through effective data solutions.
Measures of Outcomes:
  1. Adherence to engineering processes and standards
  2. Adherence to schedule / timelines
  3. Adherence to SLAs where applicable
  4. # of defects post delivery
  5. # of non-compliance issues
  6. Reduced recurrence of known defects
  7. Quick turnaround of production bugs
  8. Completion of applicable technical/domain certifications
  9. Completion of all mandatory training requirements
  10. Efficiency improvements in data pipelines (e.g., reduced resource consumption, faster run times).
  11. Average time to detect, respond to, and resolve pipeline failures or data issues.
  12. Number of data security incidents or compliance breaches.
Outputs Expected:
Code Development:
  1. Develop data processing code independently, ensuring it meets performance and scalability requirements.
  2. Define coding standards, templates, and checklists.
  3. Review code for team members and peers.

Documentation:
  1. Create and review templates, checklists, guidelines, and standards for design, processes, and development.
  2. Create and review deliverable documents, including design documents, architecture documents, infrastructure costing, business requirements, source-target mappings, test cases, and results.

Configuration:
  1. Define and govern the configuration management plan.
  2. Ensure compliance within the team.

Testing:
  1. Review and create unit test cases, scenarios, and execution plans.
  2. Review the test plan and test strategy developed by the testing team.
  3. Provide clarifications and support to the testing team as needed.

Domain Relevance:
  1. Advise data engineers on the design and development of features and components, demonstrating a deeper understanding of business needs.
  2. Learn about customer domains to identify opportunities for value addition.
  3. Complete relevant domain certifications to enhance expertise.

Project Management:
  1. Manage the delivery of modules effectively.

Defect Management:
  1. Perform root cause analysis (RCA) and mitigation of defects.
  2. Identify defect trends and take proactive measures to improve quality.

Estimation:
  1. Create and provide input for effort and size estimation for projects.

Knowledge Management:
  1. Consume and contribute to project-related documents, SharePoint, libraries, and client universities.
  2. Review reusable documents created by the team.

Release Management:
  1. Execute and monitor the release process to ensure smooth transitions.

Design Contribution:
  1. Contribute to the creation of high-level design (HLD), low-level design (LLD), and system architecture for applications, business components, and data models.

Customer Interface:
  1. Clarify requirements and provide guidance to the development team.
  2. Present design options to customers and conduct product demonstrations.

Team Management:
  1. Set FAST goals and provide constructive feedback.
  2. Understand team members' aspirations and provide guidance and opportunities for growth.
  3. Ensure team engagement in projects and initiatives.

Certifications:
  1. Obtain relevant domain and technology certifications to stay competitive and informed.
Skill Examples:
  1. Proficiency in SQL, Python, or other programming languages used for data manipulation.
  2. Experience with ETL tools such as Apache Airflow, Talend, Informatica, AWS Glue, Dataproc, and Azure ADF.
  3. Hands-on experience with cloud platforms like AWS, Azure, or Google Cloud, particularly with data-related services (e.g., AWS Glue, BigQuery).
  4. Conduct tests on data pipelines and evaluate results against data quality and performance specifications.
  5. Experience in performance tuning of data processes.
  6. Expertise in designing and optimizing data warehouses for cost efficiency.
  7. Ability to apply and optimize data models for efficient storage, retrieval, and processing of large datasets.
  8. Capacity to clearly explain and communicate design and development aspects to customers.
  9. Ability to estimate time and resource requirements for developing and debugging features or components.
Knowledge Examples:
  1. Knowledge of various ETL services offered by cloud providers, including Apache PySpark, AWS Glue, GCP DataProc/DataFlow, Azure ADF, and ADLF.
  2. Proficiency in SQL for analytics, including windowing functions.
  3. Understanding of data schemas and models relevant to various business contexts.
  4. Familiarity with domain-related data and its implications.
  5. Expertise in data warehousing optimization techniques.
  6. Knowledge of data security concepts and best practices.
  7. Familiarity with design patterns and frameworks in data engineering.
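The SQL windowing functions mentioned above partition and rank rows without collapsing them the way GROUP BY does. A minimal sketch using Python's built-in sqlite3 module (SQLite 3.25+ supports window functions); the table, names, and salary figures are invented:

```python
import sqlite3

# Rank employees by salary within each department using a window function.
# Table name, columns, and data are invented for illustration.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE salaries (dept TEXT, name TEXT, lpa REAL)")
con.executemany(
    "INSERT INTO salaries VALUES (?, ?, ?)",
    [("eng", "a", 11), ("eng", "b", 9), ("hr", "c", 7)],
)
rows = con.execute(
    "SELECT dept, name, "
    "RANK() OVER (PARTITION BY dept ORDER BY lpa DESC) AS rnk "
    "FROM salaries ORDER BY dept, rnk"
).fetchall()
# rows == [('eng', 'a', 1), ('eng', 'b', 2), ('hr', 'c', 1)]
```

Each department gets its own ranking, so every input row appears in the output alongside its within-partition rank.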
Seeking a Big Data Engineer with 4-6 years of experience to play a key role in building, managing, and evolving our big data pipelines. You'll leverage your data engineering expertise to not only onboard customers but also contribute to critical initiatives that advance our technology stack. This role offers the opportunity to directly improve product capabilities while expanding your skill set with cutting-edge technologies.

Responsibilities:
  1. Collaborate closely with technical leaders to devise and build the right solutions.
  2. Participate in design discussions and brainstorming sessions to select, integrate, and maintain tools and frameworks required to solve big data problems.
  3. Implement and maintain systems to cleanse, process, and analyze large data sets using distributed processing tools like Airflow and Spark.
  4. Contribute insights and suggestions as part of our continuous improvement.

Required Qualifications:
  1. 4 to 6 years of experience in data engineering.
  2. Strong computer science background and knowledge of software and product development methodologies.
  3. In-depth understanding of the big data ecosystem, including processing frameworks like Spark and Hadoop and the file types they deal with.
  4. Experience with ETL and data pipeline orchestration tools like Apache Airflow, dbt, etc.
  5. Excellent coding skills in Python, Java, or Scala, plus SQL.
  6. Experience with Git and build tools like Gradle/Maven/SBT.
  7. Experience with and understanding of data warehouse platforms.
  8. Experience working on cloud platforms (AWS, GCP, Azure).
  9. Strong understanding of object-oriented design, data structures, algorithms, profiling, and optimization.
  10. Ability to learn quickly and adapt to new technologies.
  11. Experience working with software and support teams in multiple regions.
  12. Ability to work effectively in a fast-paced, collaborative environment.
  13. Excellent communication and collaboration skills.
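Orchestration tools like Apache Airflow, mentioned above, model a pipeline as a DAG of dependent tasks and run each task only after its upstream dependencies succeed. As a toy stand-in (the task names are invented), Python's standard-library graphlib shows the same dependency-ordered execution idea:

```python
from graphlib import TopologicalSorter

# Toy stand-in for DAG orchestration: each task lists its upstream
# dependencies, and the sorter yields a valid execution order.
# Task names are invented for illustration.
pipeline = {
    "extract": set(),
    "cleanse": {"extract"},
    "transform": {"cleanse"},
    "load": {"transform"},
    "report": {"load"},
}

order = list(TopologicalSorter(pipeline).static_order())
# order == ['extract', 'cleanse', 'transform', 'load', 'report']
```

A real Airflow DAG adds scheduling, retries, and monitoring on top of this core dependency-ordering idea.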

Employment Type: Full Time, Permanent


What Piktorlabs employees are saying about work life (based on 12 employees):
Flexible timing: 73%
Monday to Friday: 100%
No travel: 90%
Day Shift: 100%

Piktorlabs Benefits

Work From Home
Team Outings
Health Insurance
Free Transport
Child care
Gymnasium +6 more

