When to use ReLU and when not to?

AnswerBot
7mo

Use ReLU for hidden layers in deep neural networks; avoid it in output layers.

  • ReLU is commonly used in hidden layers to introduce non-linearity and speed up convergence.

  • Avoid using ReLU in output layers, where the task usually calls for a different activation (e.g. softmax for classification or a linear output for regression), as in the sketch below.
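
A minimal sketch in PyTorch illustrating the point: ReLU after each hidden layer, but the output layer left as raw logits (with softmax, not ReLU, applied when probabilities are needed). The layer sizes and the 10-class setup are illustrative assumptions, not part of the original answer.

```python
import torch
import torch.nn as nn

# ReLU in hidden layers, no ReLU in the output layer
model = nn.Sequential(
    nn.Linear(784, 256),
    nn.ReLU(),            # non-linearity in hidden layer
    nn.Linear(256, 128),
    nn.ReLU(),            # non-linearity in hidden layer
    nn.Linear(128, 10),   # output layer: raw logits, no ReLU
)

x = torch.randn(32, 784)                  # dummy batch of 32 inputs
logits = model(x)                         # unbounded scores, may be negative
probs = torch.softmax(logits, dim=1)      # softmax (not ReLU) gives probabilities
print(probs.shape)                        # torch.Size([32, 10])
```

Applying ReLU on the output would clamp all negative logits to zero, distorting the scores the loss (e.g. cross-entropy) expects; for regression, it would make negative targets unreachable.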
