Great Learning
I applied via Recruitment Consultant and was interviewed before Feb 2023. There were 3 interview rounds.
A situation will be provided with details about a learner's work and education profile; the expected output is a recommended resolution.
I applied via Recruitment Consultant and was interviewed before Mar 2023. There were 3 interview rounds.
Marketing topics: create and submit a PPT.
I applied via Company Website and was interviewed before Sep 2022. There were 4 interview rounds.
You will be given a set of questions related to profile sourcing and onboarding. You will have to list the platforms you use, the types of profiles, the reasons for selecting those profiles, etc.
I was interviewed before Jun 2023.
I applied via Approached by Company and was interviewed before Nov 2022. There were 4 interview rounds.
Designing a low-level architecture for a quiz portal application
Use a microservices architecture for scalability and flexibility
Implement a database schema to store quiz questions, answers, and user responses
Utilize caching mechanisms to improve performance
Design an authentication system to ensure secure access to quizzes
Include features for creating, editing, and taking quizzes (a minimal data-model sketch follows this list)
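To make the points above concrete, here is a minimal, hypothetical data-model sketch in Python. The entity names (Quiz, Question, Attempt) and their fields are illustrative assumptions, not the expected answer; a real design would add persistence, authentication, and an API layer as noted above.

```python
# Hypothetical core entities for a quiz portal; names and fields are illustrative assumptions.
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class Question:
    question_id: int
    text: str
    options: List[str]   # answer choices shown to the learner
    correct_index: int   # index into options of the correct choice


@dataclass
class Quiz:
    quiz_id: int
    title: str
    questions: List[Question] = field(default_factory=list)


@dataclass
class Attempt:
    """One user's responses to one quiz."""
    attempt_id: int
    quiz_id: int
    user_id: int
    responses: Dict[int, int] = field(default_factory=dict)  # question_id -> chosen option index

    def score(self, quiz: Quiz) -> int:
        # Count responses that match the correct option for each question.
        by_id = {q.question_id: q for q in quiz.questions}
        return sum(
            1
            for qid, choice in self.responses.items()
            if qid in by_id and by_id[qid].correct_index == choice
        )
```

In a microservices layout, these entities would typically live behind a quiz service with its own schema, with computed scores cached to keep result pages fast.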
I applied via Referral and was interviewed before Oct 2022. There were 3 interview rounds.
I applied via Referral and was interviewed in Oct 2021. There were 5 interview rounds.
Ensemble techniques combine multiple models to improve prediction accuracy.
Ensemble techniques can be used with various types of models, such as decision trees, neural networks, and support vector machines.
Common ensemble techniques include bagging, boosting, and stacking.
Bagging involves training multiple models on different subsets of the data and combining their predictions through averaging or voting.
Boosting involves training models sequentially, with each new model focusing on the examples misclassified by the previous ones.
Ensemble techniques combine multiple models to improve prediction accuracy.
Bagging: Bootstrap Aggregating
Boosting: AdaBoost, Gradient Boosting
Stacking: Meta-model combines predictions of base models
Voting: Combining the predictions of multiple models by majority vote (a runnable sketch follows below)
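As a minimal sketch of the voting idea (assuming scikit-learn is available; the dataset and base models are arbitrary illustrative choices):

```python
# Hard-voting ensemble: three different base models vote on the predicted class.
from sklearn.datasets import load_iris
from sklearn.ensemble import VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

ensemble = VotingClassifier(
    estimators=[
        ("lr", LogisticRegression(max_iter=1000)),
        ("dt", DecisionTreeClassifier(random_state=42)),
        ("svm", SVC()),
    ],
    voting="hard",  # majority vote over the base models' predicted classes
)
ensemble.fit(X_train, y_train)
print("Voting ensemble accuracy:", ensemble.score(X_test, y_test))
```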
Bagging is a technique used in machine learning to improve the stability and accuracy of a model by combining multiple models.
Bagging stands for Bootstrap Aggregating.
It involves creating multiple subsets of the original dataset by randomly sampling with replacement.
Each subset is used to train a separate model, and the final prediction is the average of all the predictions made by each model.
Bagging reduces overfitting and variance, making the final model more stable (a short sketch follows below).
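A short bagging sketch, assuming a recent scikit-learn (the base-learner argument is called `estimator` from version 1.2 onward, `base_estimator` in older releases); the dataset and hyperparameters are illustrative:

```python
# Bagging: train many trees on bootstrap samples and aggregate their predictions by voting.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

bagging = BaggingClassifier(
    estimator=DecisionTreeClassifier(random_state=0),  # base learner
    n_estimators=50,   # number of bootstrap-trained models
    bootstrap=True,    # sample the training set with replacement
    random_state=0,
)
bagging.fit(X_train, y_train)
print("Bagging accuracy:", bagging.score(X_test, y_test))
```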
Boosting is an ensemble learning technique that combines multiple weak models to create a strong model.
Boosting iteratively trains weak models on different subsets of data
Each subsequent model focuses on the misclassified data points of the previous model
Final prediction is made by weighted combination of all models
Examples include AdaBoost, Gradient Boosting, and XGBoost (a short sketch follows below)
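A minimal boosting sketch using scikit-learn's AdaBoost and gradient boosting implementations (XGBoost is a separate library and omitted here); the dataset and settings are illustrative:

```python
# Boosting: models are added sequentially, each one correcting the ensemble's remaining errors.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import AdaBoostClassifier, GradientBoostingClassifier
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

models = [
    ("AdaBoost", AdaBoostClassifier(n_estimators=100, random_state=0)),
    ("Gradient Boosting", GradientBoostingClassifier(n_estimators=100, learning_rate=0.1, random_state=0)),
]
for name, model in models:
    model.fit(X_train, y_train)
    print(name, "accuracy:", model.score(X_test, y_test))
```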
Bias is error due to erroneous assumptions in the learning algorithm. Variance is error due to sensitivity to small fluctuations in the training set.
Bias is the difference between the expected prediction of the model and the correct value that we are trying to predict.
Variance is the variability of model prediction for a given data point or a value which tells us spread of our data.
High bias can cause an algorithm to miss relevant relations between features and outputs (underfitting), while high variance can cause it to model the noise in the training data (overfitting); the decomposition below makes the trade-off precise.
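For reference, the standard bias-variance decomposition of expected squared error (a textbook identity added here for completeness, not part of the original answer):

```latex
\[
\mathbb{E}\!\left[(y - \hat{f}(x))^2\right]
  = \underbrace{\bigl(\mathbb{E}[\hat{f}(x)] - f(x)\bigr)^2}_{\text{Bias}^2}
  + \underbrace{\mathbb{E}\!\left[\bigl(\hat{f}(x) - \mathbb{E}[\hat{f}(x)]\bigr)^2\right]}_{\text{Variance}}
  + \underbrace{\sigma^2}_{\text{irreducible error}}
\]
```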
Classification techniques are used to categorize data into different classes or groups based on certain features or attributes.
Common classification techniques include decision trees, logistic regression, k-nearest neighbors, and support vector machines.
Classification can be binary (two classes) or multi-class (more than two classes).
Evaluation metrics for classification include accuracy, precision, recall, and F1 score (all computed in the sketch below).
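A short sketch that fits two of the classifiers mentioned above and reports those metrics (assuming scikit-learn; the dataset is an arbitrary binary-classification example):

```python
# Fit two common classifiers and report accuracy, precision, recall, and F1 score.
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score, f1_score, precision_score, recall_score
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

classifiers = [
    ("Logistic regression", make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))),
    ("k-nearest neighbors", make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=5))),
]
for name, clf in classifiers:
    y_pred = clf.fit(X_train, y_train).predict(X_test)
    print(
        name,
        "accuracy:", round(accuracy_score(y_test, y_pred), 3),
        "precision:", round(precision_score(y_test, y_pred), 3),
        "recall:", round(recall_score(y_test, y_pred), 3),
        "F1:", round(f1_score(y_test, y_pred), 3),
    )
```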
Random forest is an ensemble learning method for classification, regression and other tasks.
Random forest builds multiple decision trees and combines their predictions to improve accuracy.
It uses bagging technique to create multiple subsets of data and features for each tree.
Random forest reduces overfitting and is robust to outliers and missing values.
It can handle high-dimensional data and exposes feature importances that are easy to interpret (see the sketch below).
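A minimal random forest sketch with scikit-learn, including a look at feature importances; the dataset and hyperparameters are illustrative:

```python
# Random forest: bagged decision trees with random feature subsets at each split.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

data = load_breast_cancer()
X_train, X_test, y_train, y_test = train_test_split(data.data, data.target, random_state=0)

rf = RandomForestClassifier(n_estimators=200, max_features="sqrt", random_state=0)
rf.fit(X_train, y_train)
print("Random forest accuracy:", rf.score(X_test, y_test))

# Five most important features according to impurity-based importance.
ranked = sorted(zip(data.feature_names, rf.feature_importances_), key=lambda t: -t[1])
for name, importance in ranked[:5]:
    print(f"{name}: {importance:.3f}")
```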
I applied via Company Website and was interviewed before Aug 2022. There were 3 interview rounds.
Quantitative questions & English verbal ability
I applied via Campus Placement and was interviewed in Jan 2022. There were 3 interview rounds.
Be confident and take the lead
I applied via Referral and was interviewed in May 2021. There were 4 interview rounds.
The duration of the Great Learning interview process can vary, but based on 16 interviews it typically takes less than 2 weeks to complete.