How will you get embeddings of long sentences/paragraphs that transformer models like BERT truncate? How would you go about using BERT for such sentences? Would you use sentence embeddings or word embeddings for this? And can you name any models that can handle such long inputs?
AnswerBot
1y
To get embeddings of long sentences/paragraphs that exceed BERT's 512-token limit, split the text into overlapping chunks that each fit within the limit, embed each chunk with BERT, and combine the chunk embeddings with a pooling technique such as mean or max pooling. For representing whole sentences or paragraphs, sentence embeddings (e.g., from Sentence-BERT) are generally preferable to raw word embeddings, since the goal is a single vector for the full span. Models designed for longer inputs include Longformer, BigBird, and Transformer-XL, which use sparse or segment-level attention to handle sequences well beyond 512 tokens.
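A minimal sketch of the chunk-then-pool approach, assuming the Hugging Face transformers library and the bert-base-uncased checkpoint; the chunk size and stride values here are illustrative choices, not fixed requirements:

```python
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")
model.eval()

def embed_long_text(text, max_length=512, stride=128):
    # Split the text into overlapping windows instead of truncating it.
    enc = tokenizer(
        text,
        max_length=max_length,
        truncation=True,
        stride=stride,
        return_overflowing_tokens=True,
        padding="max_length",
        return_tensors="pt",
    )
    enc.pop("overflow_to_sample_mapping", None)
    with torch.no_grad():
        out = model(**enc)                      # (num_chunks, max_length, hidden)
    # Mask-aware mean pooling over tokens, then average the chunk vectors.
    mask = enc["attention_mask"].unsqueeze(-1)  # (num_chunks, max_length, 1)
    summed = (out.last_hidden_state * mask).sum(dim=1)
    counts = mask.sum(dim=1).clamp(min=1)
    chunk_embeddings = summed / counts          # (num_chunks, hidden)
    return chunk_embeddings.mean(dim=0)         # one vector for the whole text

# Example usage:
# vec = embed_long_text("a very long paragraph ...")
# print(vec.shape)  # torch.Size([768]) for bert-base-uncased
```

Mean pooling across chunks is the simplest aggregation; max pooling or weighting chunks by length are equally valid choices depending on the downstream task.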