facebookresearch / InferSent

InferSent sentence embeddings

Time Consuming #57

Closed rashikaanand closed 6 years ago

rashikaanand commented 6 years ago

Hi, so I was testing the InferSent model with 20k sentences and it took around 2 hours to produce results. Can you tell me why it is so time-consuming?

aconneau commented 6 years ago

Are you using the CPU or the GPU? The model uses a BiLSTM with a large number of hidden units and thus involves heavy matrix computations, which are much faster on a GPU. On a GPU (e.g. a K40), encoding 20k sentences should take around 25 seconds.
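
For reference, a minimal sketch of GPU encoding along the lines of the repo's README (the model/word-vector paths below are placeholders, and the exact loading code may differ between repo versions):

```python
import torch
from models import InferSent  # InferSent encoder class shipped with this repo

# Placeholder paths -- point these at your local pretrained encoder
# and word vectors (GloVe or fastText).
MODEL_PATH = 'encoder/infersent1.pkl'
W2V_PATH = 'GloVe/glove.840B.300d.txt'

params_model = {'bsize': 64, 'word_emb_dim': 300, 'enc_lstm_dim': 2048,
                'pool_type': 'max', 'dpout_model': 0.0, 'version': 1}
model = InferSent(params_model)
model.load_state_dict(torch.load(MODEL_PATH))
model.set_w2v_path(W2V_PATH)

# Move the BiLSTM encoder to the GPU if one is available; this is where
# the large speed-up over CPU encoding comes from.
if torch.cuda.is_available():
    model = model.cuda()

# Stand-in for the 20k sentences mentioned above.
sentences = ['This is an example sentence.'] * 20000

model.build_vocab(sentences, tokenize=True)

# Larger batch sizes keep the GPU busy; verbose=True prints the encoding
# speed (sentences/s) so CPU and GPU throughput can be compared directly.
embeddings = model.encode(sentences, bsize=128, tokenize=True, verbose=True)
print(embeddings.shape)  # e.g. (20000, 4096)
```

The batch size passed to `encode` is the main knob: on a GPU, larger batches amortize the per-call overhead, whereas on a CPU the encoder remains bound by the BiLSTM's matrix multiplications regardless of batch size.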

aconneau commented 6 years ago

Please re-open if you still have the issue.