Lead I - Data Engineering
Piktorlabs
Flexible timing
Key skills for the job
This role requires proficiency in data pipeline development, including coding and testing pipelines that ingest, wrangle, transform, and join data from various sources. Must be skilled in ETL tools such as Informatica, Glue, Databricks, and DataProc, with coding expertise in Python, PySpark, and SQL. Works independently and has a deep understanding of data warehousing solutions, including Snowflake, BigQuery, Lakehouse, and Delta Lake. Capable of estimating costs and understanding performance issues related to data solutions.
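A pipeline of this shape (ingest raw records, transform them, and join across sources) can be sketched with the Python standard library alone. The tables, columns, and values below are illustrative and not taken from the posting; production work here would use PySpark or one of the ETL tools listed:

```python
import sqlite3

# Illustrative source data: raw order events and a customer reference table.
orders = [(1, "alice", 120.0), (2, "bob", 80.0), (3, "alice", 45.5)]
customers = [("alice", "IN"), ("bob", "US")]

# Ingest: load both sources into a staging store (in-memory SQLite here).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, customer TEXT, amount REAL)")
conn.execute("CREATE TABLE customers (name TEXT, country TEXT)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", orders)
conn.executemany("INSERT INTO customers VALUES (?, ?)", customers)

# Transform + join: total spend per customer, enriched with country.
rows = conn.execute(
    """
    SELECT c.name, c.country, ROUND(SUM(o.amount), 2) AS total
    FROM orders o JOIN customers c ON o.customer = c.name
    GROUP BY c.name, c.country
    ORDER BY c.name
    """
).fetchall()
print(rows)  # [('alice', 'IN', 165.5), ('bob', 'US', 80.0)]
```

The same ingest/transform/join structure carries over to PySpark DataFrames or a Glue job; only the engine and scale change.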
Outcomes:
Measures of Outcomes:
Outputs Expected: Code Development, Documentation, Configuration, Testing, Domain Relevance, Project Management, Defect Management, Estimation, Knowledge Management, Release Management, Design Contribution, Customer Interface, Team Management, Certifications
Skill Examples:
Knowledge Examples:
Job Description: Senior Developer

Responsibilities:
- Identify, troubleshoot, and resolve issues in software applications using strong debugging skills.
- Design, deploy, and manage scalable, secure, and reliable cloud infrastructure on AWS.
- Implement and manage continuous integration and continuous deployment (CI/CD) pipelines.
- Collaborate with cross-functional teams to define, design, and ship new features.
- Conduct code reviews and provide constructive feedback to peers.
- Write clear and comprehensive technical documentation, user manuals, and other project-related documentation.
- Manage digital certificates for secure communication over networks, including SSL/TLS certificates and Public Key Infrastructure (PKI), and ensure the security of communication channels.
- Ensure compliance with security policies and best practices.
- Stay up to date with the latest industry trends and technologies.

Requirements:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Expertise in C programming and performance-critical applications.
- Proficiency in one or more programming languages (e.g., C++, Java, Python, Scala).
- Strong knowledge of AWS services (e.g., EC2, S3, RDS, Lambda, VPC).
- Highly skilled in scripting languages such as Python and Bash.
- Familiarity with CI/CD tools (e.g., Jenkins, GitLab CI).
- Experience developing large-scale database applications using MySQL, Oracle, or DB2.
- Strong problem-solving skills and the ability to think critically and creatively.

Preferred Qualifications:
- Experience with infrastructure as code (IaC) tools (e.g., Terraform, CloudFormation).
- Knowledge of containerization and orchestration tools (e.g., Docker, Kubernetes).
- Experience with monitoring and logging tools (e.g., CloudWatch, Dynatrace, New Relic, ELK Stack).
- Experience with Big Data solutions (e.g., Hadoop, Spark).
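As a rough illustration of the certificate-management duty above, the sketch below flags certificates approaching expiry using only the standard library. The helper name and warning threshold are hypothetical, not from the posting; `ssl.cert_time_to_seconds` parses the `notAfter` string format that `ssl.getpeercert()` returns:

```python
import ssl
import time

def cert_expiring_soon(not_after, warn_days=30, now=None):
    """Return True if a certificate's notAfter time falls within warn_days.

    `not_after` uses the string format returned by ssl.getpeercert(),
    e.g. "Jan 5 09:34:43 2030 GMT". Helper name and threshold are
    illustrative choices, not part of the job description.
    """
    expiry = ssl.cert_time_to_seconds(not_after)  # UTC epoch seconds
    now = time.time() if now is None else now
    return expiry - now < warn_days * 86400

# A certificate expiring in 2030 is not "soon" relative to late 2023.
print(cert_expiring_soon("Jan 5 09:34:43 2030 GMT", now=1700000000.0))  # False
```

In practice a check like this would feed a monitoring tool (e.g., a CloudWatch alarm) so renewals happen before communication channels break.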
Employment Type: Full Time, Permanent