QA Engineer
Proeffective IT Services
posted 19d ago
Job Title: QA Engineer - Data Engineering
Location: Hyderabad
Department: Data Engineering
Reports To: QA Lead / Data Engineering Lead
Type: Full-time
Job Summary
We are looking for a Data Engineering QA Engineer who will be responsible for testing,
validating, and ensuring the quality of our data pipelines, data transformations, and analytics
platforms. The role involves creating test strategies, designing test cases, and working closely
with Data Engineers to ensure the accuracy, integrity, and performance of our data solutions.
Key Responsibilities:
Data Pipeline Testing: Test and validate data pipelines (ETL/ELT processes) to
ensure accurate data movement, transformation, and integration across different
platforms.
Data Quality Assurance: Define and implement data quality checks, perform
exploratory data testing, and monitor data for accuracy and consistency.
Test Automation: Design and implement automated testing strategies for data
validation using frameworks/tools like PyTest, SQL queries, or custom scripts.
Collaboration: Work closely with Data Engineers, Data Analysts, and Product
Managers to understand requirements and deliver test plans and strategies aligned
with data engineering processes.
Performance Testing: Analyze and test the performance and scalability of large-scale data solutions to ensure they meet business requirements.
Defect Management: Identify, track, and resolve data quality issues and bugs,
working with teams to ensure timely resolution.
Compliance: Ensure that data engineering solutions comply with data governance,
privacy, and security standards.
Reporting: Generate testing reports and provide insights into data quality and system
performance.
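To illustrate the test-automation responsibility above, here is a minimal sketch of a PyTest-style data-quality check. It uses an in-memory SQLite database as a stand-in for a real warehouse, and all table and column names are hypothetical:

```python
import sqlite3


def count_failing_rows(conn, check_sql):
    """Run a data-quality check written as SQL that selects violating rows;
    return how many rows fail the check (0 means the check passes)."""
    return conn.execute(f"SELECT COUNT(*) FROM ({check_sql})").fetchone()[0]


def build_sample_db():
    # In-memory stand-in for a warehouse table (hypothetical schema).
    conn = sqlite3.connect(":memory:")
    conn.execute(
        "CREATE TABLE orders (order_id INTEGER PRIMARY KEY, amount REAL, status TEXT)"
    )
    conn.executemany(
        "INSERT INTO orders VALUES (?, ?, ?)",
        [(1, 19.99, "shipped"), (2, 5.00, "pending"), (3, 42.50, "shipped")],
    )
    return conn


def test_no_null_amounts():
    # Data quality check: no order should be missing an amount.
    conn = build_sample_db()
    assert count_failing_rows(conn, "SELECT * FROM orders WHERE amount IS NULL") == 0


def test_amounts_positive():
    # Data quality check: amounts must be strictly positive.
    conn = build_sample_db()
    assert count_failing_rows(conn, "SELECT * FROM orders WHERE amount <= 0") == 0
```

In practice the same pattern runs against the production warehouse connection, with each quality rule expressed as a SQL query that returns violating rows.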
Required Skills & Experience:
Proven Experience: 3-5 years of experience as a QA Engineer, Data Engineer, or
similar role in data-focused environments.
Strong SQL Skills: Proficiency in writing complex SQL queries to validate and test
data.
ETL/ELT Experience: Familiarity with ETL/ELT tools and processes like DBT,
Apache Airflow, Talend, Informatica, etc.
Automation Frameworks: Experience with test automation frameworks and tools
such as PyTest, Robot Framework, or similar.
Cloud Platforms: Knowledge of cloud services (AWS, GCP, Azure) and tools like
Redshift, BigQuery, Snowflake, or Databricks.
Programming: Strong scripting and programming skills in Python, Java, or a similar
language.
Data Warehousing: Understanding of data warehousing concepts and best practices
for data validation.
Version Control: Experience using version control tools (e.g., Git) for code and
testing artifacts.
Agile Environment: Experience working in Agile/Scrum teams and knowledge of
CI/CD pipelines.
Attention to Detail: Meticulous when it comes to data validation, ensuring data
accuracy and quality at every step.
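As a concrete example of the SQL-based validation skills listed above, the sketch below reconciles row counts between a source table and its loaded target, a common post-load ETL check. The table names and the in-memory SQLite database are hypothetical stand-ins:

```python
import sqlite3


def reconcile_counts(conn, source_table, target_table):
    # Compare row counts between a source table and its transformed target.
    # A mismatch usually signals dropped or duplicated rows during the load.
    src = conn.execute(f"SELECT COUNT(*) FROM {source_table}").fetchone()[0]
    tgt = conn.execute(f"SELECT COUNT(*) FROM {target_table}").fetchone()[0]
    return src, tgt, src == tgt


# Set up a toy source/target pair and simulate the load step.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_events (id INTEGER, payload TEXT)")
conn.execute("CREATE TABLE fct_events (id INTEGER, payload TEXT)")
conn.executemany("INSERT INTO raw_events VALUES (?, ?)", [(1, "a"), (2, "b")])
conn.execute("INSERT INTO fct_events SELECT * FROM raw_events")

src, tgt, ok = reconcile_counts(conn, "raw_events", "fct_events")
assert ok, f"Row count mismatch: source={src}, target={tgt}"
```

The same reconciliation query works unchanged on warehouse engines such as Redshift, BigQuery, or Snowflake, since it uses only standard SQL.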
Nice to Have:
Big Data Experience: Exposure to big data tools such as Hadoop, Spark, or Kafka.
Data Governance & Compliance: Familiarity with GDPR, CCPA, or other data
privacy regulations.
BI Tools: Experience working with BI tools like Tableau, Power BI, or Looker.
Certification: AWS/GCP Data Engineering or QA certifications.
Education:
Bachelor's degree in Computer Science, Information Systems, Data Science, or a
related field.
Employment Type: Full Time, Permanent