I applied via Naukri.com and was interviewed in May 2020. There were 3 interview rounds.
PCA is a statistical technique used to reduce the dimensionality of a dataset while retaining most of the original information.
PCA is used to identify patterns in data and reduce the number of variables in a dataset.
It transforms the original variables into a new set of variables called principal components.
The first principal component explains the maximum variance in the data, followed by the second and so on.
PCA is ...
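The answer above can be sketched in code. This is a minimal illustration using scikit-learn's `PCA` on a toy dataset (the data and parameters are assumptions for the example, not from the interview):

```python
import numpy as np
from sklearn.decomposition import PCA

# Toy dataset: 100 samples, 5 features, but only 2 independent directions
rng = np.random.default_rng(0)
base = rng.normal(size=(100, 2))
X = np.hstack([base, base @ rng.normal(size=(2, 3))])  # 5 columns, rank 2

# Reduce to 2 principal components
pca = PCA(n_components=2)
X_reduced = pca.fit_transform(X)

print(X_reduced.shape)                       # (100, 2)
# First component explains the most variance, the second the next most;
# here the two together capture essentially all of it.
print(pca.explained_variance_ratio_.sum())
```

Because the toy data is exactly rank 2, two components retain effectively all of the original information, which is the point of the technique.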
Weibull distribution is used to model the failure rate of machines and equipment over time.
Used in reliability engineering to predict the failure rate of machines and equipment over time
Commonly used in the automotive industry to predict the failure rate of car parts
Also used in the pharmaceutical industry to model the time it takes for a drug to degrade
Can be used to model the time it takes for a customer to complete
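As a sketch of the reliability-engineering use case above, here is a hypothetical failure model using `scipy.stats.weibull_min` (the shape and scale parameters are invented for illustration):

```python
from scipy.stats import weibull_min

# Hypothetical machine-failure model: shape k > 1 means wear-out failures
shape, scale = 1.5, 1000.0  # assumed parameters; scale in hours

# Probability a part fails within its first 500 hours
p_fail_500h = weibull_min.cdf(500, c=shape, scale=scale)

# Mean time to failure
mttf = weibull_min.mean(c=shape, scale=scale)

print(round(p_fail_500h, 3), round(mttf, 1))
```

With a shape parameter above 1, the hazard rate increases with age, which matches the wear-out behaviour of mechanical parts described above.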
Outliers can be identified using statistical methods such as box plots, z-scores, and scatter plots.
Use box plots to visualize the distribution of data and identify any points that fall outside the whiskers.
Calculate z-scores to identify data points that are more than 3 standard deviations away from the mean.
Use scatter plots to identify any data points that are far away from the general trend of the data.
Consider the ...
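The z-score and box-plot methods mentioned above can be sketched with NumPy (the dataset and planted outlier are assumptions for the example):

```python
import numpy as np

rng = np.random.default_rng(42)
data = np.concatenate([rng.normal(10, 0.5, 200), [25.0]])  # one planted outlier

# z-score method: flag points more than 3 standard deviations from the mean
z = (data - data.mean()) / data.std()
z_outliers = data[np.abs(z) > 3]

# IQR rule (what a box plot's whiskers use):
# flag points outside [Q1 - 1.5*IQR, Q3 + 1.5*IQR]
q1, q3 = np.percentile(data, [25, 75])
iqr = q3 - q1
iqr_outliers = data[(data < q1 - 1.5 * iqr) | (data > q3 + 1.5 * iqr)]

print(z_outliers, iqr_outliers)  # both methods catch the planted 25.0
```

Note that the z-score cut-off is unreliable on very small samples, since one extreme point inflates the standard deviation it is measured against; the IQR rule is more robust there.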
I appeared for an interview before Mar 2024.
I am proficient in various data analysis tools, including Excel, SQL, Python, and visualization software like Tableau.
Excel: Advanced functions, pivot tables, and data visualization.
SQL: Writing complex queries for data extraction and manipulation.
Python: Utilizing libraries like Pandas and NumPy for data analysis.
Tableau: Creating interactive dashboards for data visualization.
R: Statistical analysis and data visualization.
A pivot table in Excel is a data summarization tool that allows you to reorganize and summarize selected columns and rows of data.
Allows users to summarize and analyze large datasets
Can easily reorganize data by dragging and dropping fields
Provides options to calculate sums, averages, counts, etc. for data
Helps in creating interactive reports and charts
Useful for identifying trends and patterns in data
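The same summarization that an Excel pivot table performs can be sketched with pandas' `pivot_table`, which mirrors the rows/columns/values model (the sales data here is invented for illustration):

```python
import pandas as pd

# Hypothetical sales records
df = pd.DataFrame({
    "region":  ["North", "North", "South", "South", "South"],
    "product": ["A", "B", "A", "A", "B"],
    "sales":   [100, 150, 200, 120, 80],
})

# Rows = region, columns = product, values = sum of sales,
# just like dragging fields into an Excel pivot table
pivot = pd.pivot_table(df, index="region", columns="product",
                       values="sales", aggfunc="sum", fill_value=0)
print(pivot)
```

Swapping `aggfunc` to `"mean"` or `"count"` reproduces the averages and counts option mentioned above.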
I applied via LinkedIn and was interviewed in Dec 2024. There were 2 interview rounds.
Based on my CV, they assigned me a task related to data migration.
posted on 20 Sep 2024
I applied via Naukri.com and was interviewed in Aug 2024. There was 1 interview round.
I am a dedicated and detail-oriented business analyst with a strong background in data analysis and problem-solving.
I have a Bachelor's degree in Business Administration
I have experience in conducting market research and analyzing data to identify trends
I am proficient in using tools like Excel, SQL, and Tableau for data analysis
I have excellent communication skills and can effectively collaborate with cross-functional teams.
posted on 23 Sep 2023
I applied via Naukri.com and was interviewed in Aug 2023. There were 3 interview rounds.
They will give you Adobe Magento to study.
They will ask you to create a presentation and present it.
Even after receiving the job offer and joining the company, don't assume your position is confirmed. Higher management can ask you to take a test at any time, and if you don't perform well, they may ask you to leave.
I prioritize stakeholder requirements through effective communication, negotiation, and alignment with project goals.
Identify and document all stakeholder requirements to understand the full scope.
Facilitate meetings to discuss conflicting requirements and encourage open dialogue.
Use prioritization techniques, such as MoSCoW (Must have, Should have, Could have, Won't have), to assess the importance of each requirement.
...
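The MoSCoW technique mentioned above amounts to ranking requirements by category. A minimal sketch, with hypothetical requirements invented for the example:

```python
# Hypothetical stakeholder requirements tagged with MoSCoW categories
requirements = [
    ("Export to PDF", "Could"),
    ("User login", "Must"),
    ("Dark mode", "Wont"),
    ("Audit log", "Should"),
]

# Lower rank = higher priority
MOSCOW_RANK = {"Must": 0, "Should": 1, "Could": 2, "Wont": 3}

prioritized = sorted(requirements, key=lambda r: MOSCOW_RANK[r[1]])
print([name for name, _ in prioritized])
# ['User login', 'Audit log', 'Export to PDF', 'Dark mode']
```

In practice the categories come out of the stakeholder meetings described above; the sorting step only makes the agreed priorities explicit.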
posted on 29 Feb 2024
I was approached by the company and was interviewed before Feb 2023. There were 3 interview rounds.
ETL stands for Extract, Transform, Load. It is a process used in data warehousing to extract data from various sources, transform it into a consistent format, and load it into a target database.
ETL stands for Extract, Transform, Load
Extract: Involves extracting data from various sources such as databases, applications, and files
Transform: Involves cleaning, filtering, and transforming the extracted data into a consistent format
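The three ETL stages described above can be sketched end to end with the standard library, using an in-memory CSV as the source and SQLite as a stand-in target warehouse (both are assumptions for the example):

```python
import csv
import io
import sqlite3

# Extract: read rows from a CSV source (an in-memory string for this sketch)
raw = io.StringIO("name,revenue\n alice ,1200\nBOB,900\n")
rows = list(csv.DictReader(raw))

# Transform: clean whitespace, normalize case, cast types
cleaned = [(r["name"].strip().title(), int(r["revenue"])) for r in rows]

# Load: insert into the target database
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (name TEXT, revenue INTEGER)")
conn.executemany("INSERT INTO customers VALUES (?, ?)", cleaned)

total = conn.execute("SELECT SUM(revenue) FROM customers").fetchone()[0]
print(total)  # 2100
```

Real pipelines add error handling, incremental loads, and scheduling, but the extract/transform/load split stays the same.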
Overfitting occurs when a machine learning model learns the training data too well, including noise and outliers, leading to poor generalization on new data.
Overfitting happens when a model is too complex and captures noise in the training data.
It leads to poor performance on unseen data as the model fails to generalize well.
Techniques to prevent overfitting include cross-validation, regularization, and early stopping.
...
Overfitting occurs when a model learns the details and noise in the training data to the extent that it negatively impacts the model's performance on new data.
Overfitting happens when a model is too complex and captures noise in the training data.
It leads to poor generalization and high accuracy on training data but low accuracy on new data.
Techniques to prevent overfitting include cross-validation, regularization, and...
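The overfitting behaviour described in both answers above can be demonstrated with a small sketch: an unconstrained decision tree memorizes the training data, while limiting its depth (a simple regularizer) trades training fit for better generalization. The data and model choices are assumptions for the example:

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, (200, 1))
y = np.sin(X).ravel() + rng.normal(0, 0.3, 200)  # signal + noise
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# An unconstrained tree fits the training set, noise included
deep = DecisionTreeRegressor(random_state=0).fit(X_tr, y_tr)

# Limiting depth regularizes the model
pruned = DecisionTreeRegressor(max_depth=3, random_state=0).fit(X_tr, y_tr)

print(deep.score(X_tr, y_tr), deep.score(X_te, y_te))      # near 1.0, then lower
print(pruned.score(X_tr, y_tr), pruned.score(X_te, y_te))  # closer together
```

The large train/test gap for the deep tree is the signature of overfitting; cross-validation would expose the same gap without holding out a fixed test set.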
| Designation | Salaries reported | Salary range |
| BI Developer | 33 | ₹2 L/yr - ₹8.5 L/yr |
| QA Engineer | 12 | ₹2 L/yr - ₹6.5 L/yr |
| Business Intelligence Developer | 10 | ₹1.6 L/yr - ₹6.5 L/yr |
| DOT NET Developer | 7 | ₹2 L/yr - ₹5.8 L/yr |
| Reliability Engineer | 6 | ₹3 L/yr - ₹6 L/yr |
Accel Frontline
Northcorp Software
Elentec Power India (EPI) Pvt. Ltd.
HyScaler