I applied via campus placement at Indian Institute of Technology (IIT), Chennai and was interviewed in Dec 2016. There were 6 interview rounds.
Hypothesis testing is a statistical method to determine if there is enough evidence to support or reject a claim.
Hypothesis testing involves formulating a null hypothesis and an alternative hypothesis.
The null hypothesis assumes that there is no significant difference or relationship between variables.
The alternative hypothesis suggests that there is a significant difference or relationship between variables.
The test statistic is compared against a reference distribution (for example the normal or t-distribution) to compute a p-value, which drives the decision to reject or retain the null hypothesis.
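For illustration, here is a minimal two-sample t-test sketch using SciPy; the sample data is randomly generated and the 0.05 significance level is just a common convention, not a rule.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
group_a = rng.normal(loc=50.0, scale=5.0, size=100)   # e.g. control group
group_b = rng.normal(loc=52.0, scale=5.0, size=100)   # e.g. treatment group

# Null hypothesis: the two group means are equal.
# Alternative hypothesis: the means differ.
t_stat, p_value = stats.ttest_ind(group_a, group_b)

alpha = 0.05
if p_value < alpha:
    print(f"p = {p_value:.4f} < {alpha}: reject the null hypothesis")
else:
    print(f"p = {p_value:.4f} >= {alpha}: fail to reject the null hypothesis")
```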
A string can be reversed in place, without using extra memory, by swapping characters from both ends.
Iterate through half of the string length
Swap the characters at the corresponding positions from both ends
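A short sketch of the two-pointer swap in Python; since Python strings are immutable, the swap is shown on a list of characters (in a language like C the same idea works directly on the character buffer).

```python
def reverse_in_place(chars: list[str]) -> None:
    """Reverse a sequence of characters by swapping from both ends (O(1) extra space)."""
    left, right = 0, len(chars) - 1
    while left < right:                      # iterate through half the length
        chars[left], chars[right] = chars[right], chars[left]
        left += 1
        right -= 1

s = list("interview")        # work on a mutable list of characters
reverse_in_place(s)
print("".join(s))            # -> "weivretni"
```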
Gradient boosting is a machine learning technique that combines multiple weak models to create a strong predictive model.
Gradient boosting is an ensemble method that iteratively adds new models to correct the errors made by previous models.
It is a type of boosting algorithm that focuses on reducing the residual errors in predictions.
Gradient boosting uses a loss function and gradient descent to optimize the model's performance.
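A from-scratch sketch of the idea for squared-error regression: each new shallow tree is fit to the residuals (the negative gradient) of the current ensemble. The function names, data, and hyperparameters below are illustrative, not tuned.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def gradient_boost_fit(X, y, n_estimators=100, learning_rate=0.1, max_depth=2):
    """Fit an ensemble of shallow trees, each trained on the current residuals."""
    base = y.mean()
    prediction = np.full(len(y), base)       # start from the mean prediction
    trees = []
    for _ in range(n_estimators):
        residuals = y - prediction           # negative gradient of squared-error loss
        tree = DecisionTreeRegressor(max_depth=max_depth)
        tree.fit(X, residuals)
        prediction += learning_rate * tree.predict(X)
        trees.append(tree)
    return base, trees

def gradient_boost_predict(base, trees, X, learning_rate=0.1):
    pred = np.full(X.shape[0], base)
    for tree in trees:
        pred += learning_rate * tree.predict(X)
    return pred

# Tiny illustration on synthetic data
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(300, 1))
y = np.sin(X[:, 0]) + rng.normal(scale=0.1, size=300)

base, trees = gradient_boost_fit(X, y)
print("train MSE:", np.mean((gradient_boost_predict(base, trees, X) - y) ** 2))
```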
XGBoost and AdaBoost are both boosting algorithms, but they build their ensembles differently.
AdaBoost combines weak learners into a strong learner by re-weighting training samples so that later learners focus on previously misclassified examples.
XGBoost is an optimized implementation of gradient boosting: each new tree is fit to the gradient of the loss of the current ensemble.
XGBoost adds explicit L1/L2 regularization on tree complexity and leaf weights, which AdaBoost lacks.
XGBoost is known for its speed and performance in large-scale machine learning tasks.
Both algorithms build a strong learner from many weak learners, typically shallow decision trees.
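A rough comparison sketch, assuming scikit-learn and the xgboost package are available; the dataset is synthetic and the hyperparameters are illustrative only.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

# AdaBoost: re-weights samples so later weak learners focus on mistakes
ada = AdaBoostClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)

# XGBoost: regularized gradient boosting over shallow trees
xgb = XGBClassifier(
    n_estimators=200,
    learning_rate=0.1,
    reg_lambda=1.0,        # L2 regularization on leaf weights
    random_state=0,
).fit(X_tr, y_tr)

print("AdaBoost accuracy:", accuracy_score(y_te, ada.predict(X_te)))
print("XGBoost accuracy:", accuracy_score(y_te, xgb.predict(X_te)))
```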
Addressing skewed training data in data science
Analyze the extent of skewness in the data
Consider resampling techniques like oversampling or undersampling
Apply appropriate evaluation metrics that are robust to class imbalance
Explore ensemble methods like bagging or boosting
Use synthetic data generation techniques like SMOTE
Consider feature engineering to improve model performance
Regularize the model to avoid overfitting to the majority class.
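A minimal sketch of one of these options: SMOTE oversampling followed by an imbalance-aware metric. It assumes the imbalanced-learn package is installed and uses a synthetic 95/5 class split purely for illustration.

```python
from collections import Counter
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import f1_score
from sklearn.model_selection import train_test_split
from imblearn.over_sampling import SMOTE

# 95% / 5% class split to mimic skewed training data
X, y = make_classification(n_samples=5000, weights=[0.95, 0.05], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)
print("before resampling:", Counter(y_tr))

# Generate synthetic minority-class samples
X_res, y_res = SMOTE(random_state=0).fit_resample(X_tr, y_tr)
print("after SMOTE:", Counter(y_res))

model = LogisticRegression(max_iter=1000).fit(X_res, y_res)
# F1 on the minority class is more informative than raw accuracy here
print("minority-class F1:", f1_score(y_te, model.predict(X_te)))
```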
Principal Component Analysis (PCA) is a dimensionality reduction technique used to transform high-dimensional data into a lower-dimensional space.
PCA is used to identify patterns and relationships in data by reducing the number of variables.
It helps in visualizing and interpreting complex data by representing it in a simpler form.
PCA is commonly used in fields like image processing, genetics, finance, and the social sciences.
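A minimal PCA sketch with scikit-learn, projecting the 4-dimensional Iris data onto its first two principal components; the data is standardized first because PCA is sensitive to feature scale.

```python
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

X = load_iris().data                           # 150 samples x 4 features
X_scaled = StandardScaler().fit_transform(X)   # put features on a common scale

pca = PCA(n_components=2)
X_2d = pca.fit_transform(X_scaled)

print(X_2d.shape)                              # (150, 2)
print("explained variance ratio:", pca.explained_variance_ratio_)
```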
The cost function for linear regression is mean squared error (MSE) and for logistic regression is log loss.
The cost function for linear regression is calculated by taking the average of the squared differences between the predicted and actual values.
The cost function for logistic regression is calculated using the logarithm of the predicted probabilities.
The goal of the cost function is to minimize the error between the predicted and actual values.
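A small NumPy sketch of both cost functions on made-up numbers, just to make the formulas concrete.

```python
import numpy as np

def mse(y_true, y_pred):
    """Average of squared differences between predictions and actual values."""
    return np.mean((y_true - y_pred) ** 2)

def log_loss(y_true, p_pred, eps=1e-15):
    """Negative average log-likelihood of predicted probabilities for binary labels."""
    p = np.clip(p_pred, eps, 1 - eps)          # avoid log(0)
    return -np.mean(y_true * np.log(p) + (1 - y_true) * np.log(1 - p))

# Linear regression cost: MSE
y_reg_true = np.array([3.0, 5.0, 7.0])
y_reg_pred = np.array([2.5, 5.5, 6.0])
print("MSE:", mse(y_reg_true, y_reg_pred))

# Logistic regression cost: log loss
y_clf_true = np.array([1, 0, 1])
p_clf_pred = np.array([0.9, 0.2, 0.6])
print("log loss:", log_loss(y_clf_true, p_clf_pred))
```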
Regularization is a technique used in machine learning to prevent overfitting by adding a penalty term to the loss function.
Regularization helps to reduce the complexity of a model by discouraging large parameter values.
It prevents overfitting by adding a penalty for complex models, encouraging simpler and more generalizable models.
Common regularization techniques include L1 regularization (Lasso), L2 regularization (Ridge), and elastic net, which combines both.
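A brief sketch contrasting ordinary least squares with Ridge (L2) and Lasso (L1) on synthetic data where only two features actually matter; the alpha values are illustrative, not tuned.

```python
import numpy as np
from sklearn.linear_model import LinearRegression, Ridge, Lasso

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 10))
# Only the first two features carry signal; the rest are noise
y = 3 * X[:, 0] - 2 * X[:, 1] + rng.normal(scale=0.5, size=100)

ols = LinearRegression().fit(X, y)
ridge = Ridge(alpha=1.0).fit(X, y)     # L2: shrinks all coefficients toward zero
lasso = Lasso(alpha=0.1).fit(X, y)     # L1: drives some coefficients exactly to zero

print("OLS coefficients:  ", np.round(ols.coef_, 2))
print("Ridge coefficients:", np.round(ridge.coef_, 2))
print("Lasso coefficients:", np.round(lasso.coef_, 2))
```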
The objective of predictive modeling is to minimize the cost function as it helps in optimizing the model's performance.
Predictive modeling aims to make accurate predictions by minimizing the cost function.
The cost function quantifies the discrepancy between predicted and actual values.
By minimizing the cost function, the model can improve its ability to make accurate predictions.
The cost function can be defined differently depending on the problem, for example mean squared error for regression and log loss for classification.
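A minimal gradient-descent sketch that minimizes the MSE cost of a one-feature linear model; the learning rate, iteration count, and synthetic data are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(0, 10, size=200)
y = 4.0 * x + 1.5 + rng.normal(scale=1.0, size=200)   # true w = 4.0, b = 1.5

w, b = 0.0, 0.0
lr = 0.01
for _ in range(2000):
    y_pred = w * x + b
    error = y_pred - y
    # Gradients of the MSE cost with respect to w and b
    grad_w = 2 * np.mean(error * x)
    grad_b = 2 * np.mean(error)
    w -= lr * grad_w
    b -= lr * grad_b

print(f"learned w={w:.2f}, b={b:.2f}, MSE={np.mean((w * x + b - y) ** 2):.3f}")
```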
I chose your company because of its strong reputation and the opportunity to work on diverse projects.
Your company has a strong reputation in the industry.
I am impressed by the diverse range of projects your company is involved in.
Your company offers a collaborative and innovative work environment.
I believe working at your company will provide me with valuable hands-on experience.
Your company's commitment to professional growth and development is appealing to me.
posted on 4 Dec 2016
I applied via campus placement at Indian Institute of Technology (IIT), Chennai and was interviewed in Jan 2016. There were 5 interview rounds.
posted on 7 Sep 2023
List Comprehension, Python Programs on Palindromes, Fibonacci, factorials, reversing integers etc.
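A few of these small programs sketched briefly in Python; these are one reasonable way to write them, not the only one.

```python
# List comprehension: squares of even numbers
even_squares = [n * n for n in range(10) if n % 2 == 0]

def is_palindrome(s: str) -> bool:
    """A string is a palindrome if it reads the same forwards and backwards."""
    return s == s[::-1]

def fibonacci(n: int) -> list[int]:
    """First n Fibonacci numbers, built iteratively."""
    seq = []
    a, b = 0, 1
    for _ in range(n):
        seq.append(a)
        a, b = b, a + b
    return seq

def factorial(n: int) -> int:
    """n! computed iteratively."""
    result = 1
    for i in range(2, n + 1):
        result *= i
    return result

def reverse_integer(n: int) -> int:
    """Reverse the digits of an integer, preserving its sign."""
    sign = -1 if n < 0 else 1
    return sign * int(str(abs(n))[::-1])

print(even_squares)                 # [0, 4, 16, 36, 64]
print(is_palindrome("level"))       # True
print(fibonacci(8))                 # [0, 1, 1, 2, 3, 5, 8, 13]
print(factorial(5))                 # 120
print(reverse_integer(-1230))       # -321
```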
Data Scientist | 10 salaries | ₹10 L/yr - ₹21 L/yr
Data Engineer | 7 salaries | ₹8 L/yr - ₹17.6 L/yr
Cloud Engineer | 6 salaries | ₹9.5 L/yr - ₹14 L/yr
Senior Data Scientist | 6 salaries | ₹18 L/yr - ₹28 L/yr
Software Engineer | 5 salaries | ₹9.6 L/yr - ₹25 L/yr