Imarticus Learning
I was interviewed in Sep 2024.
Find the greatest number from an array of strings.
Convert the array of strings to an array of integers.
Use a sorting algorithm to sort the array in descending order.
Return the first element of the sorted array as the greatest number.
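The steps above can be sketched in Python (function name is mine, not from the question):

```python
# Sketch of the described approach: convert the strings to integers,
# sort in descending order, and return the first element.
def greatest_number(strings):
    nums = sorted((int(s) for s in strings), reverse=True)
    return nums[0]

print(greatest_number(["42", "7", "199", "56"]))  # 199
```

Note that `max(int(s) for s in strings)` would achieve the same result in linear time; the sort is shown only because the answer describes it.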
I applied via Monster and was interviewed in Oct 2023. There were 5 interview rounds.
Python coding test covering general programming knowledge.
Bias-variance tradeoff is the balance between model complexity and generalization error.
Bias refers to error from erroneous assumptions in the learning algorithm, leading to underfitting.
Variance refers to error from sensitivity to fluctuations in the training data, leading to overfitting.
Increasing model complexity reduces bias but increases variance, while decreasing complexity increases bias but reduces variance.
The...
I applied via Referral and was interviewed before Mar 2023. There were 2 interview rounds.
Classification metrics like accuracy, precision, and recall are used to evaluate the performance of a classification model.
Accuracy measures the overall correctness of the model's predictions.
Precision measures the proportion of true positive predictions out of all positive predictions.
Recall measures the proportion of true positive predictions out of all actual positive instances.
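These three metrics can be computed directly from true and predicted binary labels; a minimal sketch (1 = positive class):

```python
# Accuracy, precision, and recall for a binary classifier, computed
# from counts of true positives, false positives, and false negatives.
def classification_metrics(y_true, y_pred):
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    correct = sum(1 for t, p in zip(y_true, y_pred) if t == p)
    accuracy = correct / len(y_true)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return accuracy, precision, recall

acc, prec, rec = classification_metrics([1, 0, 1, 1, 0], [1, 0, 0, 1, 1])
print(acc, prec, rec)  # 0.6, 2/3, 2/3
```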
Bagging and boosting are ensemble learning techniques used to improve the performance of machine learning models by combining multiple weak learners.
Bagging (Bootstrap Aggregating) involves training multiple models independently on different subsets of the training data and then combining their predictions through averaging or voting.
Boosting involves training multiple models sequentially, where each subsequent model corrects the errors made by the previous ones.
R-squared measures the proportion of variance explained by the model, while Adjusted R-squared adjusts for the number of predictors in the model.
R-squared is the proportion of variance in the dependent variable that is predictable from the independent variables. It ranges from 0 to 1, with 1 indicating a perfect fit.
Adjusted R-squared penalizes the addition of unnecessary predictors to the model, providing a more accurate measure of fit when comparing models with different numbers of predictors.
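Both quantities follow directly from the definitions above; a short sketch, where `p` (the number of predictors) is supplied by the caller:

```python
# R-squared: 1 - SS_res / SS_tot, the fraction of variance explained.
def r_squared(y, y_hat):
    mean_y = sum(y) / len(y)
    ss_res = sum((yi - fi) ** 2 for yi, fi in zip(y, y_hat))
    ss_tot = sum((yi - mean_y) ** 2 for yi in y)
    return 1 - ss_res / ss_tot

# Adjusted R-squared penalizes extra predictors via (n - 1) / (n - p - 1).
def adjusted_r_squared(y, y_hat, p):
    n = len(y)
    r2 = r_squared(y, y_hat)
    return 1 - (1 - r2) * (n - 1) / (n - p - 1)

y = [3.0, 5.0, 7.0, 9.0]
y_hat = [2.8, 5.1, 7.2, 8.9]
print(r_squared(y, y_hat))                  # 0.995
print(adjusted_r_squared(y, y_hat, p=1))    # 0.9925
```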
Feature selection techniques help in selecting the most relevant features for building predictive models.
Filter methods: Select features based on statistical measures like correlation, chi-squared test, etc.
Wrapper methods: Use a specific model to evaluate the importance of features by adding or removing them iteratively.
Embedded methods: Feature selection is integrated into the model training process, like LASSO regression.
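A filter method can be as simple as ranking features by absolute Pearson correlation with the target; a sketch with made-up feature names:

```python
# Pearson correlation coefficient between two equal-length sequences.
def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Filter method: keep the k features most correlated with the target.
def select_top_k(features, target, k):
    ranked = sorted(features,
                    key=lambda f: abs(pearson(features[f], target)),
                    reverse=True)
    return ranked[:k]

features = {
    "age":   [25, 32, 47, 51, 62],   # strongly related to the target
    "noise": [3, 1, 4, 1, 5],        # unrelated values
}
target = [100, 130, 190, 205, 250]
print(select_top_k(features, target, k=1))  # ['age']
```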
Various types of joins in SQL include inner join, outer join, left join, right join, and full join.
Inner join: Returns rows when there is a match in both tables.
Outer join: Returns all rows from one or both tables, including rows without a match (left, right, or full outer).
Left join: Returns all rows from the left table and the matched rows from the right table.
Right join: Returns all rows from the right table and the matched rows from the left table.
Full join: Returns all rows from both tables, with NULLs where no match exists.
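The difference between inner and left joins is easy to see with sqlite3 (table and column names here are invented for the example; SQLite only added RIGHT and FULL joins in version 3.39, so those are omitted):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE employees (id INTEGER, name TEXT, dept_id INTEGER)")
cur.execute("CREATE TABLE depts (id INTEGER, dept TEXT)")
cur.executemany("INSERT INTO employees VALUES (?, ?, ?)",
                [(1, "Asha", 10), (2, "Ravi", 20), (3, "Meera", None)])
cur.executemany("INSERT INTO depts VALUES (?, ?)",
                [(10, "Data"), (20, "Sales")])

# Inner join: only employees with a matching department.
inner = cur.execute(
    "SELECT e.name, d.dept FROM employees e "
    "JOIN depts d ON e.dept_id = d.id").fetchall()

# Left join: every employee; missing departments come back as None.
left = cur.execute(
    "SELECT e.name, d.dept FROM employees e "
    "LEFT JOIN depts d ON e.dept_id = d.id").fetchall()

print(inner)  # 2 matched rows
print(left)   # 3 rows; Meera's dept is None
```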
I applied via Company Website and was interviewed before Sep 2021. There were 6 interview rounds.
The coding test was a HackerRank test with 3 Python and 2 SQL questions.
The Central Limit Theorem states that the sampling distribution of the sample mean of independent, identically distributed random variables approaches a normal distribution as the sample size increases.
The theorem applies to large sample sizes.
It is a fundamental concept in statistics.
It is used to estimate population parameters from sample statistics.
It is important in hypothesis testing and confidence intervals.
Example: If we take a large number of samples of the same size from any population, the distribution of the sample means will be approximately normal, centered on the population mean.
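A quick simulation makes this concrete: even though the underlying population is uniform (not normal), the means of many samples cluster tightly around the population mean of 0.5.

```python
import random

random.seed(0)

# Draw 2000 samples of size 50 from a uniform(0, 1) population and
# record each sample's mean.
sample_means = [
    sum(random.random() for _ in range(50)) / 50
    for _ in range(2000)
]

grand_mean = sum(sample_means) / len(sample_means)
print(round(grand_mean, 2))  # close to 0.5
```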
Gradient descent is an iterative optimization algorithm used to minimize a cost function by adjusting model parameters.
Gradient descent is used in machine learning to optimize models.
It works by iteratively adjusting model parameters to minimize a cost function.
The algorithm calculates the gradient of the cost function and moves in the direction of steepest descent.
There are different variants of gradient descent, such as batch, stochastic, and mini-batch gradient descent.
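The update rule is easiest to see on a one-dimensional cost function; a minimal sketch minimizing f(x) = (x - 3)^2, whose gradient is 2(x - 3) (learning rate and starting point are arbitrary choices):

```python
def gradient_descent(lr=0.1, steps=100, x=0.0):
    for _ in range(steps):
        grad = 2 * (x - 3)   # gradient of the cost f(x) = (x - 3)^2
        x -= lr * grad       # step in the direction of steepest descent
    return x

print(gradient_descent())  # converges to the minimum at x = 3
```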
Image segmentation is the process of dividing an image into multiple segments or regions.
It is used in computer vision to identify and separate objects or regions of interest in an image.
It can be done using various techniques such as thresholding, clustering, edge detection, and region growing.
Applications include object recognition, medical imaging, and autonomous vehicles.
Examples include separating the foreground and background of an image.
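Thresholding, the simplest of the techniques listed, can be sketched on a tiny grayscale "image" represented as a 2D list of intensities:

```python
# Pixels brighter than the threshold are labeled foreground (1),
# everything else background (0).
def threshold_segment(image, threshold):
    return [[1 if px > threshold else 0 for px in row] for row in image]

image = [
    [12, 200, 210],
    [15, 190, 220],
    [10, 11, 230],
]
mask = threshold_segment(image, threshold=128)
print(mask)  # [[0, 1, 1], [0, 1, 1], [0, 0, 1]]
```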
Object detection using CNN involves training a neural network to identify and locate objects within an image.
CNNs use convolutional layers to extract features from images.
These features are then passed through fully connected layers to classify and locate objects.
Common architectures for object detection include YOLO, SSD, and Faster R-CNN.
Case study: analyze a scenario explaining why sales of a product decline at the end of the month.
I applied via Referral and was interviewed in Oct 2021. There were 5 interview rounds.
Ensemble techniques combine multiple models to improve prediction accuracy.
Ensemble techniques can be used with various types of models, such as decision trees, neural networks, and support vector machines.
Common ensemble techniques include bagging, boosting, and stacking.
Bagging involves training multiple models on different subsets of the data and combining their predictions through averaging or voting.
Boosting involves training models sequentially, where each model focuses on correcting the errors of the previous ones.
Ensemble techniques combine multiple models to improve prediction accuracy.
Bagging: Bootstrap Aggregating
Boosting: AdaBoost, Gradient Boosting
Stacking: Meta-model combines predictions of base models
Voting: Combining predictions of multiple models by majority voting
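Hard majority voting, the last technique in the list, is a one-liner once the votes are collected; a sketch where the "models" are stand-in lambdas rather than trained learners:

```python
from collections import Counter

# Each model casts one vote; the most common class wins.
def majority_vote(models, x):
    votes = [m(x) for m in models]
    return Counter(votes).most_common(1)[0][0]

models = [
    lambda x: 1 if x > 0 else 0,
    lambda x: 1 if x > 2 else 0,
    lambda x: 1 if x > -1 else 0,
]
print(majority_vote(models, 1))  # two of three models vote 1 -> 1
```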
Bagging is a technique used in machine learning to improve the stability and accuracy of a model by combining multiple models.
Bagging stands for Bootstrap Aggregating.
It involves creating multiple subsets of the original dataset by randomly sampling with replacement.
Each subset is used to train a separate model, and the final prediction is the average of all the predictions made by each model.
Bagging reduces overfitting and variance, improving model stability.
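The bootstrap-then-average loop can be sketched in a few lines; here the "model" trained on each bootstrap sample is simply the sample mean, standing in for a real learner:

```python
import random

# Sample the data with replacement, same size as the original dataset.
def bootstrap_sample(data, rng):
    return [rng.choice(data) for _ in data]

# Train one "model" (the sample mean) per bootstrap sample, then average.
def bagged_mean(data, n_models=100, seed=0):
    rng = random.Random(seed)
    models = [sum(s) / len(s)
              for s in (bootstrap_sample(data, rng) for _ in range(n_models))]
    return sum(models) / len(models)

data = [2.0, 4.0, 6.0, 8.0]
print(bagged_mean(data))  # close to the plain mean, 5.0
```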
Boosting is an ensemble learning technique that combines multiple weak models to create a strong model.
Boosting iteratively trains weak models on different subsets of data
Each subsequent model focuses on the misclassified data points of the previous model
Final prediction is made by weighted combination of all models
Examples include AdaBoost, Gradient Boosting, XGBoost
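The fit-the-residuals loop described above can be sketched as a toy gradient-boosting regressor, where each round fits a depth-1 stump (a simplified stand-in for a real weak learner) to the current residuals:

```python
# Fit a one-split regression stump minimizing squared error on residuals.
def fit_stump(x, residuals):
    best = None
    for t in x:  # candidate split thresholds
        left = [r for xi, r in zip(x, residuals) if xi <= t]
        right = [r for xi, r in zip(x, residuals) if xi > t]
        if not left or not right:
            continue
        lmean, rmean = sum(left) / len(left), sum(right) / len(right)
        sse = (sum((r - lmean) ** 2 for r in left)
               + sum((r - rmean) ** 2 for r in right))
        if best is None or sse < best[0]:
            best = (sse, t, lmean, rmean)
    _, t, lmean, rmean = best
    return lambda xi: lmean if xi <= t else rmean

# Boost: start from the mean, repeatedly add a scaled stump fit to residuals.
def boost(x, y, rounds=20, lr=0.5):
    base = sum(y) / len(y)
    stumps = []
    pred = [base] * len(y)
    for _ in range(rounds):
        residuals = [yi - pi for yi, pi in zip(y, pred)]
        stump = fit_stump(x, residuals)
        stumps.append(stump)
        pred = [pi + lr * stump(xi) for xi, pi in zip(x, pred)]
    return lambda xi: base + lr * sum(s(xi) for s in stumps)

x = [1.0, 2.0, 3.0, 4.0]
y = [1.0, 1.0, 3.0, 3.0]
model = boost(x, y)
print([round(model(xi), 2) for xi in x])  # near [1.0, 1.0, 3.0, 3.0]
```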
Bias is error due to erroneous assumptions in the learning algorithm. Variance is error due to sensitivity to small fluctuations in the training set.
Bias is the difference between the expected prediction of the model and the correct value that we are trying to predict.
Variance is the variability of model prediction for a given data point or a value which tells us spread of our data.
High bias can cause an algorithm to miss relevant relations between features and target outputs (underfitting), while high variance causes it to model noise in the training data (overfitting).
Classification techniques are used to categorize data into different classes or groups based on certain features or attributes.
Common classification techniques include decision trees, logistic regression, k-nearest neighbors, and support vector machines.
Classification can be binary (two classes) or multi-class (more than two classes).
Evaluation metrics for classification include accuracy, precision, recall, and F1 score.
Random forest is an ensemble learning method for classification, regression and other tasks.
Random forest builds multiple decision trees and combines their predictions to improve accuracy.
It uses bagging technique to create multiple subsets of data and features for each tree.
Random forest reduces overfitting and is robust to outliers and missing values.
It can handle high-dimensional data and makes it easy to interpret feature importance.
| Role | Salaries reported | Range |
| --- | --- | --- |
| Senior Manager | 18 | ₹10 L/yr - ₹16.5 L/yr |
| Program Manager | 17 | ₹6.5 L/yr - ₹10 L/yr |
| Assistant Manager | 16 | ₹4.8 L/yr - ₹8.3 L/yr |
| Senior Analyst | 16 | ₹4.2 L/yr - ₹7.5 L/yr |
| Career Advisor | 14 | ₹2.3 L/yr - ₹5 L/yr |
Simplilearn
upGrad
Great Learning
Jigsaw Academy