In what scenarios would you advise me not to use ReLU in my hidden layers?

AnswerBot
1y

Avoid ReLU when negative activations carry information or when units risk "dying" (outputting zero with zero gradient).

  • When negative values matter, use Leaky ReLU or ELU instead: both pass a small signal for negative inputs rather than clipping them to zero.

  • When many units go dead and stop learning (zero gradient for all negative inputs, so weight updates stall), switch to an activation with a nonzero negative-side gradient such as Leaky ReLU or ELU.
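A minimal NumPy sketch of the three activations mentioned above, showing why Leaky ReLU and ELU keep a signal for negative inputs while ReLU clips it to zero (function names and the default `alpha` values here are illustrative, not from the original answer):

```python
import numpy as np

def relu(x):
    # ReLU zeroes out negatives: gradient is 0 there, so units can "die"
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    # Leaky ReLU keeps a small slope alpha for negatives, so gradients still flow
    return np.where(x > 0, x, alpha * x)

def elu(x, alpha=1.0):
    # ELU is smooth and saturates to -alpha for large negative inputs
    return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

x = np.array([-2.0, -0.5, 0.0, 1.0])
print(relu(x))        # [0. 0. 0. 1.]  -- all negative information lost
print(leaky_relu(x))  # [-0.02 -0.005 0. 1.]  -- small negative signal survives
print(elu(x))         # negatives mapped smoothly toward -1.0
```

Positive inputs pass through unchanged in all three; the activations differ only in how they treat the negative half, which is exactly where the dying-ReLU problem lives.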
