Various optimization techniques used in the project include indexing, query optimization, caching, and parallel processing.
Indexing: Creating indexes on frequently queried columns to improve search performance.
Query optimization: Rewriting queries to make them more efficient and reduce execution time.
Caching: Storing frequently accessed data in memory to reduce the need for repeated database queries.
Parallel processing...
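A minimal sketch of how these four techniques can look in practice, assuming a SQLite database; the "orders" table, its columns, and the data are hypothetical placeholders.

```python
# Hedged sketch of the optimization ideas above; the "orders" table and its
# columns are hypothetical, and the data is made up for illustration.
import sqlite3
from concurrent.futures import ProcessPoolExecutor
from functools import lru_cache

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer_id INTEGER, order_date TEXT, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                 [(1, "2024-02-01", 250.0), (2, "2024-03-05", 400.0), (1, "2023-12-20", 90.0)])

# Indexing: an index on a frequently filtered column speeds up lookups.
conn.execute("CREATE INDEX idx_orders_customer ON orders(customer_id)")

# Query optimization: filter and aggregate in SQL instead of pulling the whole table.
rows = conn.execute(
    "SELECT customer_id, SUM(amount) FROM orders "
    "WHERE order_date >= ? GROUP BY customer_id",
    ("2024-01-01",),
).fetchall()

# Caching: memoize repeated lookups so the same query is not re-run.
@lru_cache(maxsize=1024)
def customer_total(customer_id: int) -> float:
    cur = conn.execute("SELECT COALESCE(SUM(amount), 0) FROM orders WHERE customer_id = ?",
                       (customer_id,))
    return cur.fetchone()[0]

# Parallel processing: fan independent chunks of work out to worker processes.
def chunk_total(chunk):
    return sum(amount for _, amount in chunk)

if __name__ == "__main__":
    chunks = [rows[i:i + 100] for i in range(0, len(rows), 100)]
    with ProcessPoolExecutor() as pool:
        totals = list(pool.map(chunk_total, chunks))
    print(customer_total(1), totals)
```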
I applied via Recruitment Consultant and was interviewed before Jul 2021. There were 2 interview rounds.
I appeared for an interview in Apr 2025, where I was asked the following questions.
Data analytics involves examining datasets to draw conclusions, identify trends, and support decision-making.
Data Collection: Gathering data from various sources, e.g., surveys, databases.
Data Cleaning: Removing inaccuracies and inconsistencies, e.g., correcting typos in a dataset.
Data Analysis: Applying statistical methods to interpret data, e.g., using regression analysis to predict sales.
Data Visualization: Creating...
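As a small illustration of the analysis step above, a hedged sketch of "regression analysis to predict sales" using scikit-learn; the ad-spend and sales numbers are made up.

```python
# Minimal sketch of the regression-to-predict-sales idea above.
# The ad-spend/sales figures are invented for illustration.
import numpy as np
from sklearn.linear_model import LinearRegression

ad_spend = np.array([[10], [20], [30], [40], [50]])   # feature: monthly ad spend
sales = np.array([120, 190, 310, 400, 480])           # target: monthly sales

model = LinearRegression().fit(ad_spend, sales)
forecast = model.predict(np.array([[60]]))            # forecast sales for a new spend level
print(f"slope={model.coef_[0]:.1f}, intercept={model.intercept_:.1f}, forecast={forecast[0]:.0f}")
```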
A program for data analysis involves collecting, processing, and interpreting data to derive insights.
Define the objective: Understand what questions you want to answer with the data.
Collect data: Gather data from various sources like databases, APIs, or surveys.
Clean the data: Remove duplicates, handle missing values, and correct inconsistencies.
Analyze the data: Use statistical methods or data visualization tools to ...
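A hedged end-to-end sketch of such a program using pandas; "sales.csv" and its column names are hypothetical placeholders.

```python
# Hedged sketch of the collect -> clean -> analyze steps above with pandas;
# "sales.csv" and its columns are hypothetical.
import pandas as pd

# Collect: load raw data (a file here, but a database or API works the same way).
df = pd.read_csv("sales.csv")

# Clean: drop duplicates, fill missing values, normalise a text column.
df = df.drop_duplicates()
df["amount"] = df["amount"].fillna(0)
df["region"] = df["region"].str.strip().str.title()

# Analyze: summary statistics plus a grouped aggregation to answer the objective.
summary = df["amount"].describe()
revenue_by_region = df.groupby("region")["amount"].sum().sort_values(ascending=False)
print(summary, revenue_by_region.head(), sep="\n")
```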
I appeared for an interview in Mar 2025, where I was asked the following questions.
I have strong proficiency in Python, utilizing it for data analysis, visualization, and automation tasks.
Experienced in using libraries like Pandas for data manipulation and analysis.
Proficient in NumPy for numerical computations and handling large datasets.
Skilled in Matplotlib and Seaborn for data visualization to present insights effectively.
Utilized Python for automating data collection processes using web scraping...
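A short sketch of the visualization and web-scraping tasks mentioned above; the URL, CSS selector, and revenue numbers are illustrative assumptions, not taken from the project.

```python
# Hedged sketch of a simple visualization plus a scraping step;
# the URL, ".price" selector, and figures are placeholders.
import matplotlib.pyplot as plt
import pandas as pd
import requests
from bs4 import BeautifulSoup

# Visualization: a quick bar chart of monthly revenue (made-up numbers).
monthly = pd.Series([12, 15, 14, 20, 22], index=["Jan", "Feb", "Mar", "Apr", "May"])
monthly.plot(kind="bar", title="Monthly revenue (₹ lakh)")
plt.tight_layout()
plt.savefig("monthly_revenue.png")

# Automation: scrape a page and collect the text of matching elements.
resp = requests.get("https://example.com/prices", timeout=10)
soup = BeautifulSoup(resp.text, "html.parser")
prices = [tag.get_text(strip=True) for tag in soup.select(".price")]
print(prices[:5])
```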
I have extensive experience with Tableau, creating interactive dashboards and visualizations to drive data-driven decisions.
Proficient in creating complex dashboards that visualize key performance indicators (KPIs) for business insights.
Experienced in data blending and joining multiple data sources to create comprehensive reports.
Skilled in using Tableau's calculated fields to derive new metrics, such as customer lifet...
I applied via AmbitionBox and was interviewed in Nov 2022. There were 2 interview rounds.
Writing SQL queries in Python allows for seamless integration of data analysis and manipulation.
Use libraries like pandas or SQLAlchemy to execute SQL queries in Python
Remember to establish a connection to the database before executing queries
Utilize Python's flexibility to manipulate and analyze data retrieved from SQL queries
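A minimal sketch of this workflow with SQLAlchemy and pandas; the in-memory SQLite database and the "orders" table are assumptions for the example.

```python
# Hedged sketch of running SQL from Python with SQLAlchemy + pandas;
# the connection string and "orders" table are hypothetical.
import pandas as pd
from sqlalchemy import create_engine, text

# Establish the connection first (here an in-memory SQLite database).
engine = create_engine("sqlite:///:memory:")

with engine.connect() as conn:
    conn.execute(text("CREATE TABLE orders (id INTEGER, amount REAL)"))
    conn.execute(text("INSERT INTO orders VALUES (1, 250.0), (2, 400.0)"))
    conn.commit()

    # Pull the query result straight into a DataFrame for further analysis.
    df = pd.read_sql(text("SELECT id, amount FROM orders WHERE amount > :min_amt"),
                     conn, params={"min_amt": 300})

print(df)
```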
I applied via Campus Placement and was interviewed before Oct 2023. There were 2 interview rounds.
Easy Aptitude and SQL questions
I appeared for an interview in Jan 2025.
In my previous role, I was responsible for analyzing data, creating reports, and providing insights to support decision-making.
Performed data cleaning, transformation, and analysis using tools like Excel, SQL, and Python.
Generated reports and dashboards to visualize data trends and patterns.
Collaborated with cross-functional teams to identify key metrics and KPIs for business performance.
Presented findings and recommen...
I was responsible for data cleaning, transformation, and visualization to create interactive dashboards for stakeholders.
Performed data cleaning and transformation to ensure accuracy and consistency of data
Utilized tools like SQL, Python, and Tableau to manipulate and visualize data
Collaborated with stakeholders to understand their requirements and design interactive dashboards
Generated insights from data analysis to d...
I optimize and manage large datasets by using tools like SQL, Python, and Excel. I utilize various data sources such as databases, APIs, and web scraping.
Utilize SQL queries to efficiently extract and manipulate data from databases
Use Python for data cleaning, analysis, and visualization tasks
Leverage Excel for organizing and summarizing data in a user-friendly format
Obtain data from sources like internal databases, ex...
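A hedged sketch of one way to handle a dataset too large to load at once, reading it in chunks with pandas; "events.csv" and its columns are hypothetical.

```python
# Chunked processing of a large file with pandas; file and columns are placeholders.
import pandas as pd

totals = {}
# Read the file in manageable chunks instead of loading it all into memory.
for chunk in pd.read_csv("events.csv", chunksize=100_000, usecols=["user_id", "amount"]):
    grouped = chunk.groupby("user_id")["amount"].sum()
    for user_id, amount in grouped.items():
        totals[user_id] = totals.get(user_id, 0) + amount

result = pd.Series(totals, name="total_amount").sort_values(ascending=False)
print(result.head(10))
```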
There are approximately 500-600 traffic signals in Noida.
Estimate based on the size and population of Noida
Consider the number of major intersections and roads in the city
Take into account the traffic density and need for traffic management
Consult local traffic authorities for more accurate data
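The estimate can be sanity-checked with a quick back-of-envelope calculation; every figure below is a rough assumption, not official data.

```python
# Back-of-envelope check of the estimate above; all inputs are assumptions.
sectors = 80                      # assumed number of developed sectors in Noida
signals_per_sector = 5            # assumed signalised intersections per sector
major_corridor_signals = 150      # assumed extra signals on expressways and arterial roads

estimate = sectors * signals_per_sector + major_corridor_signals
print(f"Rough estimate: {estimate} traffic signals")   # ~550, within the 500-600 range
```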
I applied via Approached by Company and was interviewed before May 2023. There was 1 interview round.
SQL query to find the top 5 employees with the highest salaries
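One possible version of the query, executed here from Python against SQLite; the "employees" table and its data are hypothetical.

```python
# One way to answer the question; the "employees" table is a placeholder.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE employees (name TEXT, salary REAL)")
conn.executemany("INSERT INTO employees VALUES (?, ?)",
                 [("A", 90000), ("B", 85000), ("C", 80000),
                  ("D", 75000), ("E", 70000), ("F", 65000)])

# Order by salary descending and keep the top 5 rows.
top5 = conn.execute(
    "SELECT name, salary FROM employees ORDER BY salary DESC LIMIT 5"
).fetchall()
print(top5)
```

If ties in salary must all be returned, a window-function variant such as ranking with DENSE_RANK() over salary is the usual alternative to a plain LIMIT.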
| Role | Salaries reported | Salary range |
|---|---|---|
| Network Engineer | 1.2k | ₹2 L/yr - ₹6 L/yr |
| Senior Network Engineer | 572 | ₹2.5 L/yr - ₹6 L/yr |
| Team Lead | 321 | ₹3.5 L/yr - ₹8 L/yr |
| Senior Executive | 297 | ₹2.1 L/yr - ₹6 L/yr |
| Sales Executive | 226 | ₹2 L/yr - ₹4.7 L/yr |