Axtria
Building a data pipeline involves extracting, transforming, and loading data from various sources into a destination for analysis; a minimal code sketch follows the steps below.
Identify data sources and determine the data to be collected
Extract data from sources using tools like Apache NiFi or Apache Kafka
Transform data using tools like Apache Spark or Python scripts
Load data into a destination such as a data warehouse or database
Schedule and automate the pipeline fo...
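A minimal sketch of these steps, assuming a CSV file as the source, pandas for the transform, and a SQLite file standing in for the warehouse; the file, table, and column names are hypothetical:

```python
# Minimal ETL sketch: extract from a CSV source, transform with pandas,
# load into a SQLite table. All names below are placeholders.
import sqlite3
import pandas as pd

def extract(path: str) -> pd.DataFrame:
    # Extract: read raw records from the source file
    return pd.read_csv(path)

def transform(df: pd.DataFrame) -> pd.DataFrame:
    # Transform: drop incomplete rows and standardise a date column
    df = df.dropna(subset=["order_id"])
    df["order_date"] = pd.to_datetime(df["order_date"])
    return df

def load(df: pd.DataFrame, db_path: str, table: str) -> None:
    # Load: write the cleaned data into the destination table
    with sqlite3.connect(db_path) as conn:
        df.to_sql(table, conn, if_exists="replace", index=False)

if __name__ == "__main__":
    load(transform(extract("orders.csv")), "warehouse.db", "orders")
```

In a production pipeline the same three functions would typically be wrapped in a scheduler (e.g. Airflow) rather than run as a script.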
I applied via Naukri.com and was interviewed in Jun 2021. There were 4 interview rounds.
Object-oriented programming (OOP) knowledge is an advantage but not necessary for a data engineer.
OOP concepts like inheritance, encapsulation, and polymorphism can be useful in designing data models (see the sketch after this list).
OOP languages like Java and Python are commonly used in data engineering.
Understanding OOP can help with debugging and maintaining code.
However, OOP is not a requirement for data engineering and other programming paradigms...
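A hypothetical Python sketch of how those OOP concepts can shape a small ingestion layer; the class and method names are invented for illustration, not taken from the interview:

```python
# Encapsulation: each source hides its internal state behind read().
# Inheritance/polymorphism: subclasses share one interface, so the
# ingest() function does not care which concrete source it gets.
import csv
from abc import ABC, abstractmethod
from typing import Iterator

class Source(ABC):
    """Common interface every data source must satisfy."""

    @abstractmethod
    def read(self) -> Iterator[dict]:
        ...

class CsvSource(Source):
    def __init__(self, path: str) -> None:
        self._path = path  # internal state stays encapsulated

    def read(self) -> Iterator[dict]:
        with open(self._path, newline="") as f:
            yield from csv.DictReader(f)

class ApiSource(Source):
    def __init__(self, url: str) -> None:
        self._url = url

    def read(self) -> Iterator[dict]:
        # Placeholder: a real implementation would call the API here
        return iter([])

def ingest(sources: list[Source]) -> list[dict]:
    # Polymorphism: each source knows how to read itself
    return [row for src in sources for row in src.read()]
```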
I applied via Naukri.com and was interviewed in Dec 2024. There was 1 interview round.
I applied via Naukri.com and was interviewed before Dec 2023. There were 2 interview rounds.
As a Data Analyst, my roles and responsibilities include analyzing data, creating reports, identifying trends, and providing insights to support decision-making.
Analyzing large datasets to extract meaningful insights
Creating reports and visualizations to communicate findings
Identifying trends and patterns in data
Collaborating with stakeholders to understand business needs
Providing recommendations based on data analysis
...
In the next 4 years, I see myself advancing my skills in data analysis, taking on more responsibilities, and potentially moving into a leadership role.
Continuing to improve my data analysis skills through training and hands-on experience
Taking on more complex projects and responsibilities within the data analysis field
Possibly transitioning into a leadership role where I can mentor and guide junior analysts
Exploring op...
I am a data analyst with a background in statistics and programming, passionate about uncovering insights from data.
Background in statistics and programming
Experience in data analysis and visualization tools like Python, R, and Tableau
Strong problem-solving skills
Ability to communicate complex data findings to non-technical stakeholders
I have 5 years of experience working as a Data Analyst in various industries.
5 years of experience as a Data Analyst
Worked in multiple industries such as finance, healthcare, and technology
Proficient in data analysis tools and techniques
Experience in handling large datasets and generating insights
Strong problem-solving and communication skills
Highly interactive and technical
1 hr test with focus on math and analytical thinking
I appeared for an interview before May 2024, where I was asked the following questions.
GCP enhances Apache Airflow for job scheduling with scalability, integration, and managed services for data workflows (a DAG sketch follows this list).
Scalability: GCP's infrastructure allows Airflow to scale dynamically based on workload, ensuring efficient resource utilization.
Integration with GCP Services: Airflow can easily integrate with GCP services like BigQuery, Cloud Storage, and Dataflow for seamless data processing.
Managed Airflow: Google C...
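A minimal DAG sketch illustrating the Airflow-on-GCP integration, assuming Airflow 2.x with the Google provider package installed; the project, dataset, and table names are placeholders:

```python
# Daily Airflow DAG that runs a BigQuery job, showing the GCP integration
# point. Project/dataset/table identifiers are hypothetical.
from datetime import datetime
from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="daily_sales_to_bq",          # hypothetical DAG name
    schedule_interval="@daily",
    start_date=datetime(2024, 1, 1),
    catchup=False,
) as dag:
    load_to_bq = BigQueryInsertJobOperator(
        task_id="load_to_bq",
        configuration={
            "query": {
                "query": "SELECT * FROM `my-project.staging.sales`",
                "destinationTable": {
                    "projectId": "my-project",
                    "datasetId": "analytics",
                    "tableId": "sales_daily",
                },
                "writeDisposition": "WRITE_TRUNCATE",
                "useLegacySql": False,
            }
        },
    )
```

On Cloud Composer the same DAG file is simply dropped into the environment's DAGs bucket; the scheduling infrastructure itself is managed by GCP.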
Teradata is a data warehousing solution, while stored procedures are precompiled SQL code for efficient database operations (a client-side sketch follows this list).
Data Warehousing: Teradata is designed for large-scale data warehousing, enabling organizations to analyze vast amounts of data efficiently.
Scalability: Teradata can handle massive datasets and supports parallel processing, making it suitable for enterprise-level applications.
Stored Procedures: Th...
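A hedged sketch of calling a Teradata stored procedure from a Python client, assuming the teradatasql driver is available; the host, credentials, database, and procedure name are all placeholders:

```python
# Hypothetical client-side call to a Teradata stored procedure using the
# teradatasql driver (pip install teradatasql). None of the identifiers
# below are real objects from the interview.
import teradatasql

with teradatasql.connect(host="td.example.com", user="analyst", password="secret") as conn:
    cur = conn.cursor()
    # The procedure's SQL is precompiled and stored on the server;
    # the client only sends the CALL plus any parameters.
    cur.execute("CALL sales_db.refresh_daily_summary(?)", ["2024-01-01"])
```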
Airflow allows job restarts through task retries, backfilling, and manual triggers, ensuring robust workflow management (a DAG sketch follows this list).
Task Retries: Airflow can automatically retry failed tasks based on defined parameters, e.g., setting 'retries' to 3 for a task.
Backfilling: Airflow supports backfilling, allowing users to run tasks for past dates if they were missed, useful for data pipelines.
Manual Triggers: Users can manually trigg...
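A minimal sketch tying these three mechanisms together, assuming Airflow 2.x; the DAG and task names are hypothetical:

```python
# Retries are configured per task; backfills re-run past logical dates;
# manual triggers start an ad-hoc run from the CLI or UI.
from datetime import datetime, timedelta
from airflow import DAG
from airflow.operators.python import PythonOperator

def load_partition(**context):
    # Re-running this task for a past date picks up that date's partition
    print("loading data for", context["ds"])

with DAG(
    dag_id="nightly_load",
    schedule_interval="@daily",
    start_date=datetime(2024, 1, 1),
    catchup=True,                        # enables backfilling of missed runs
) as dag:
    PythonOperator(
        task_id="load_partition",
        python_callable=load_partition,
        retries=3,                       # automatic task retries on failure
        retry_delay=timedelta(minutes=5),
    )

# Backfill a missed date range from the CLI:
#   airflow dags backfill nightly_load -s 2024-01-01 -e 2024-01-07
# Trigger an ad-hoc run manually:
#   airflow dags trigger nightly_load
```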
I appeared for an interview before Mar 2024.
The group discussion (GD) round lasts about 20 minutes. The topics were straightforward and easy to follow. The main thing to focus on in the GD is English fluency: it matters less how content-rich or intellectually impressive your points are and more how fluently you communicate.
I applied via Campus Placement and was interviewed in Dec 2021. There were 4 interview rounds.
I appeared for an interview in May 2025, where I was asked the following questions.
Role | Salaries reported | Salary range
Senior Associate | 1.2k | ₹13.9 L/yr - ₹23.5 L/yr
Project Lead | 758 | ₹17 L/yr - ₹29.4 L/yr
Associate | 747 | ₹11.9 L/yr - ₹19 L/yr
Analyst | 408 | ₹9.9 L/yr - ₹18 L/yr
Data Analyst | 256 | ₹9.8 L/yr - ₹18.3 L/yr
Thomson Reuters
HighRadius
Chetu
EbixCash Limited