
Asked in Chubb
How would you get embeddings of long sentences/paragraphs that transformer models like BERT truncate? How would you go about using BERT for such sentences? Would you use sentence embeddings or word embeddings for this? And can you name any models that can handle such long inputs?

AnswerBot
2y
To embed text longer than BERT's 512-token limit, split it into overlapping chunks, encode each chunk with BERT, and combine the chunk vectors with mean or max pooling into a single embedding. Sentence embeddings (e.g. from Sentence-BERT) are generally the better choice here, since the goal is one vector per passage rather than per-token representations. Models built for long inputs include Longformer and BigBird, which use sparse attention to handle sequences of 4,096 tokens or more.
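The chunk-and-pool strategy described above can be sketched as follows. This is a minimal illustration, not a full pipeline: `encode_fn` is a hypothetical stand-in for any encoder that maps a token chunk to a fixed-size vector (e.g. BERT with mean pooling over its last hidden states), and the window/stride sizes are assumptions modeled on BERT's 512-token limit.

```python
import numpy as np

MAX_LEN = 512  # BERT's maximum input length in tokens (assumption: standard BERT)
STRIDE = 128   # overlap between windows so context is not cut off abruptly

def chunk_tokens(tokens, max_len=MAX_LEN, stride=STRIDE):
    """Split a long token sequence into overlapping windows of at most max_len."""
    if len(tokens) <= max_len:
        return [tokens]
    chunks = []
    step = max_len - stride  # advance by this much so consecutive windows overlap
    for start in range(0, len(tokens), step):
        chunks.append(tokens[start:start + max_len])
        if start + max_len >= len(tokens):
            break  # last window already reaches the end of the sequence
    return chunks

def embed_long_text(tokens, encode_fn):
    """Encode each chunk separately, then mean-pool the chunk vectors
    into one embedding for the whole passage.

    encode_fn: hypothetical callable mapping a token chunk -> 1-D numpy vector.
    """
    chunks = chunk_tokens(tokens)
    chunk_vecs = np.stack([encode_fn(c) for c in chunks])
    return chunk_vecs.mean(axis=0)
```

Max pooling (`chunk_vecs.max(axis=0)`) is a drop-in alternative when the most salient chunk should dominate the final embedding.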
Top Data Scientist Interview Questions Asked at Chubb
Q. In what scenarios would you advise me not to use ReLU in my hidden layers?
Q. What is the universal approximation theorem?
Q. How does backpropagation in neural networks work?