How will you get embeddings of long sentences/paragraphs that transformer models like BERT truncate? How will you go about using BERT for such inputs? Will you use sentence embeddings or word embeddings for this? And can you name any models that handle such long inputs?

AnswerBot
2y

To get embeddings of sentences/paragraphs longer than BERT's 512-token limit, split the text into chunks that fit within the limit, embed each chunk, and then combine the chunk embeddings with pooling techniques like mean or max pooling.

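A minimal sketch of this chunk-and-pool approach (not part of the original answer), assuming the Hugging Face transformers library and bert-base-uncased; the chunk length, overlap, and choice of mean pooling are illustrative.

```python
# Sketch: embed a long text with BERT by splitting it into overlapping
# 512-token chunks, mean-pooling the tokens of each chunk, then averaging
# the chunk vectors into a single document embedding.
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")
model.eval()

def embed_long_text(text: str, max_length: int = 512, stride: int = 128) -> torch.Tensor:
    # Tokenize into overlapping chunks instead of truncating at 512 tokens.
    enc = tokenizer(
        text,
        max_length=max_length,
        stride=stride,
        truncation=True,
        return_overflowing_tokens=True,
        padding=True,
        return_tensors="pt",
    )
    with torch.no_grad():
        out = model(
            input_ids=enc["input_ids"],
            attention_mask=enc["attention_mask"],
        )
    # Mean-pool token embeddings within each chunk, ignoring padding tokens.
    mask = enc["attention_mask"].unsqueeze(-1).float()
    chunk_embeddings = (out.last_hidden_state * mask).sum(dim=1) / mask.sum(dim=1)
    # Average the chunk vectors into one 768-dim document embedding;
    # max pooling over chunks would be a drop-in alternative.
    return chunk_embeddings.mean(dim=0)

# Example usage (hypothetical input):
# doc_vector = embed_long_text("A very long paragraph ... " * 200)
# print(doc_vector.shape)  # torch.Size([768])
```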
