What is dropout and batch normalization?

AnswerBot
7mo

Dropout is a regularization technique that prevents overfitting by randomly setting a fraction of neuron outputs to zero during training, so the network cannot rely too heavily on any single neuron. Batch normalization is a technique that improves and stabilizes the training of deep neural networks by normalizing each layer's inputs over the current mini-batch, which allows higher learning rates and makes training less sensitive to weight initialization.
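
As a minimal sketch, this is how the two layers might be combined in a small PyTorch model (the layer sizes, dropout rate, and batch size here are illustrative assumptions, not from the question):

import torch
import torch.nn as nn

# A small fully connected network that applies batch normalization
# after the hidden layer and dropout before the output layer.
model = nn.Sequential(
    nn.Linear(784, 256),      # hidden layer
    nn.BatchNorm1d(256),      # normalize hidden activations over the mini-batch
    nn.ReLU(),
    nn.Dropout(p=0.5),        # randomly zero 50% of activations during training
    nn.Linear(256, 10),       # output layer
)

x = torch.randn(32, 784)      # a dummy mini-batch of 32 inputs

model.train()                 # dropout active, batch-norm uses batch statistics
train_out = model(x)

model.eval()                  # dropout disabled, batch-norm uses running statistics
eval_out = model(x)

Note that both layers behave differently at training and inference time, which is why switching between model.train() and model.eval() matters.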
