What is regularisation?

AnswerBot
1y

Regularisation is a technique used in machine learning to prevent overfitting by adding a penalty term to the loss function.

  • Regularisation helps to control the complexity of a model and reduce the imp…

Aryan
1y

Regularization refers to techniques we apply during training to reduce the chances of overfitting. Overfitting is observed when the model performs well on training data but fails to generalize to previously unseen data.

Some common methods include:
  • L1 regularization
  • L2 regularization
  • Early stopping
  • Dropout

L1 and L2 regularization penalize large weights. They are most familiar from regression (lasso and ridge, respectively), but apply to any model trained by minimizing a loss. A penalty term is added to the loss function, scaled by a hyperparameter that controls how much importance we give to the regularizer. L1 adds the term Σ|w| to the loss, whereas L2 adds the term Σw².
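To make the penalty term concrete, here is a minimal NumPy sketch of a regularized mean-squared-error loss. The names `regularized_loss`, `lam` (the penalty-strength hyperparameter), and `kind` are illustrative, not from any particular library.

```python
import numpy as np

def regularized_loss(w, X, y, lam=0.1, kind="l2"):
    """Mean squared error plus an L1 or L2 penalty on the weights.

    lam is the hyperparameter controlling how strongly we penalize
    large weights; kind selects the penalty term.
    """
    mse = np.mean((X @ w - y) ** 2)
    if kind == "l1":
        penalty = lam * np.sum(np.abs(w))  # L1: sum of |w|
    else:
        penalty = lam * np.sum(w ** 2)     # L2: sum of w^2
    return mse + penalty

w = np.array([1.0, -2.0])
X = np.array([[1.0, 0.0], [0.0, 1.0]])
y = np.array([1.0, -2.0])
# Predictions match y exactly, so MSE = 0 and only the penalty remains:
print(regularized_loss(w, X, y, lam=0.1, kind="l1"))  # → 0.3  (0.1 * (1 + 2))
print(regularized_loss(w, X, y, lam=0.1, kind="l2"))  # → 0.5  (0.1 * (1 + 4))
```

Because the L1 penalty grows linearly in |w|, it tends to push small weights exactly to zero (sparsity), while the quadratic L2 penalty shrinks all weights smoothly.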

Early stopping refers to halting training when performance on the validation set stops improving. The validation set is a portion of the dataset held out from training and used to monitor how well the model generalizes as training proceeds.
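A common way to implement this is a "patience" counter: stop once the validation loss has failed to improve for a fixed number of epochs. This is a generic sketch, assuming hypothetical `train_step` and `val_loss` callables supplied by the caller.

```python
def train_with_early_stopping(train_step, val_loss, max_epochs=100, patience=3):
    """Run training until validation loss stops improving.

    Stops after `patience` consecutive epochs without a new best
    validation loss. Returns the best loss and the stopping epoch.
    """
    best = float("inf")
    bad_epochs = 0
    for epoch in range(max_epochs):
        train_step()
        loss = val_loss()
        if loss < best:
            best, bad_epochs = loss, 0
        else:
            bad_epochs += 1
            if bad_epochs >= patience:
                break  # no improvement for `patience` epochs
    return best, epoch

# Simulated run: validation loss bottoms out at 3.0, then rises.
losses = iter([5.0, 4.0, 3.0, 3.1, 3.2, 3.3])
best, stopped = train_with_early_stopping(lambda: None, lambda: next(losses))
print(best, stopped)  # → 3.0 5
```

In practice you would also checkpoint the model weights at each new best, and restore that checkpoint after stopping.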

Dropout is mainly used in neural networks: during training it stochastically mutes (zeroes) a certain percentage of neurons on each forward pass, effectively training a simpler subnetwork each time and preventing neurons from co-adapting, i.e., relying too heavily on one another.
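The mechanics can be sketched in a few lines of NumPy. This is the common "inverted dropout" formulation (an assumption here, not stated in the answers above): surviving activations are rescaled by 1/(1-p) so the expected activation is unchanged and no rescaling is needed at inference time.

```python
import numpy as np

def dropout(a, p=0.5, rng=None, training=True):
    """Inverted dropout: zero each unit with probability p during
    training and scale survivors by 1/(1-p); identity at inference."""
    if not training or p == 0.0:
        return a
    rng = rng if rng is not None else np.random.default_rng()
    mask = rng.random(a.shape) >= p  # keep each unit with probability 1-p
    return a * mask / (1.0 - p)

a = np.ones(10_000)
out = dropout(a, p=0.5, rng=np.random.default_rng(0))
# Roughly half the units become zero and the survivors are scaled to 2.0,
# so the mean activation stays close to 1.0.
```

Frameworks such as PyTorch (`torch.nn.Dropout`) implement the same idea; the layer is active only in training mode.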

PayU Payments Data Scientist Interview Questions
Made with ❤️ in India. Trademarks belong to their respective owners. All rights reserved © 2024 Info Edge (India) Ltd.
