Data Engineer - Python/SQL (1-2 yrs)
S P A Enterprise Info Services (India) Pvt. Ltd.
Flexible timing
Job Title : Data Engineer (Permanent).
Location : Remote.
Job Description :
Required Skills :
- Data Engineer with a minimum of 3+ years of data engineering experience.
- The role will require deep knowledge of data engineering techniques to create data pipelines and build data assets.
- At least 4+ years of strong hands-on programming experience with PySpark / Python / Boto3, including Python frameworks and libraries, following Python best practices.
- Strong experience in code optimisation using Spark SQL and PySpark.
- Understanding of code versioning, Git repositories, and JFrog Artifactory.
- AWS architecture knowledge, especially S3, EC2, Lambda, Redshift, CloudFormation, etc., and the ability to explain the benefits of each.
- Code refactoring of legacy codebases : Clean, modernize, and improve readability and maintainability.
- Unit tests / TDD : Write tests before code, ensure functionality, catch bugs early.
- Fixing difficult bugs : Debug complex code, isolate issues, and resolve performance, concurrency, or logic flaws.
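To illustrate the unit-tests/TDD expectation above, here is a minimal sketch in plain Python: the test is written first, then just enough code to make it pass. `normalize_event` and its field names are hypothetical, not part of any specific codebase.

```python
# Step 2 (written second): a hypothetical record-cleaning helper,
# implemented only far enough to satisfy the test below.
def normalize_event(raw: dict) -> dict:
    """Trim the string 'user_id' field and coerce 'amount' to float."""
    return {
        "user_id": raw["user_id"].strip(),
        "amount": float(raw["amount"]),
    }

# Step 1 (written first, TDD-style): the test pins down the expected
# behaviour before any implementation exists.
def test_normalize_event():
    raw = {"user_id": " u42 ", "amount": "19.99"}
    assert normalize_event(raw) == {"user_id": "u42", "amount": 19.99}

test_normalize_event()
```

In a real PySpark pipeline the same discipline applies: keep transformations in small pure functions so they can be tested without a Spark session.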
Responsibilities :
- Design, develop, and maintain data pipelines using PySpark, Python, and AWS services (S3, EC2, Lambda, Redshift, CloudFormation, etc.).
- Optimize data processing performance using Spark SQL and PySpark techniques.
- Implement and maintain code versioning and artifact management using Git and JFrog Artifactory.
- Refactor legacy codebases to improve code quality, readability, and maintainability.
- Write unit tests and conduct test-driven development (TDD) to ensure code quality and early bug detection.
- Debug and resolve complex technical issues related to data processing, performance, and concurrency.
- Collaborate effectively with cross-functional teams (data scientists, analysts, engineers) to understand business requirements and translate them into technical solutions.
- Stay abreast of the latest advancements in data engineering technologies and best practices.
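The legacy-refactoring responsibility above can be sketched in plain Python. Both functions below are hypothetical; the point is that the refactored version keeps identical behaviour while making the intent readable.

```python
def calc(d, t=None):
    # Legacy style: cryptic names, `== None`, nested conditionals.
    if t == None:
        t = 0.1
    r = []
    for x in d:
        if x != None:
            if x > 0:
                r.append(x * (1 + t))
    return r

def apply_tax(amounts, tax_rate=0.1):
    """Refactored: return positive amounts with tax applied, skipping missing values."""
    return [a * (1 + tax_rate) for a in amounts if a is not None and a > 0]

# A refactor must preserve behaviour: both versions agree on the same input.
sample = [100.0, None, -5.0, 20.0]
assert calc(sample) == apply_tax(sample)
```

Pinning old and new implementations against each other with tests like this is a common safety net when cleaning up a legacy codebase.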
Qualifications :
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 3+ years of experience as a Data Engineer.
- 4+ years of strong hands-on programming experience with PySpark, Python, and Boto3.
- Proven experience in optimizing data processing performance using Spark SQL and PySpark.
- Strong understanding of AWS architecture and services, including S3, EC2, Lambda, Redshift, CloudFormation, etc.
- Experience with code versioning tools (Git) and artifact repositories (JFrog Artifactory).
- Experience with code refactoring and improving code quality.
- Strong understanding of unit testing methodologies and TDD practices.
- Excellent problem-solving, debugging, and analytical skills.
- Strong communication and collaboration skills.
Bonus Points :
- Experience with data warehousing and data lake technologies.
- Experience with containerization technologies (Docker, Kubernetes).
- Experience with data quality and data lineage tools.
- Experience with Agile development methodologies.
Functional Areas: Software/Testing/Networking