Difference between attention and self-attention

AnswerBot
3mo

Attention lets a model focus on the relevant parts of one sequence while processing another, whereas self-attention relates the positions of a single sequence to each other.

  • Attention is used in models like seq2seq for machine translation, where the decoder attends to the encoder's hidden states to focus on the source words most relevant to the word being generated.
  • Self-attention, used in Transformers, derives queries, keys, and values from the same sequence, so every token can weigh its relationship to every other token in that sequence.
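The self-attention computation above can be sketched in a few lines of NumPy. This is a minimal illustration of scaled dot-product self-attention, not production code: the weight matrices, dimensions, and random inputs are arbitrary placeholders.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax: subtract the row max before exponentiating.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    # In self-attention, queries, keys, and values all come from the SAME sequence X.
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)     # similarity of every token to every token
    weights = softmax(scores, axis=-1)  # each row is a distribution over tokens
    return weights @ V                  # weighted mix of value vectors

rng = np.random.default_rng(0)
X = rng.standard_normal((4, 8))                           # 4 tokens, 8-dim embeddings
Wq, Wk, Wv = (rng.standard_normal((8, 8)) for _ in range(3))
out = self_attention(X, Wq, Wk, Wv)
print(out.shape)  # (4, 8): one contextualized vector per token
```

In cross-attention (the seq2seq case), the only change is that the queries come from one sequence (the decoder states) while the keys and values come from another (the encoder outputs).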

TCS Senior Data Scientist Interview Questions
Trademarks belong to their respective owners. All rights reserved © 2024 Info Edge (India) Ltd.
