Closed rashikaanand closed 6 years ago
Are you using the CPU or the GPU? The model uses a BiLSTM with a large number of hidden units, so encoding involves heavy matrix computation that is much faster on a GPU. On a GPU (e.g., a K40), encoding 20k sentences should take around 25 seconds.
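For reference, here is a minimal sketch of making sure the encoder runs on the GPU before calling `encode`. It assumes the PyTorch-based setup from the repo README; the constructor parameters, file paths, and `encode` arguments below are assumptions shown only as comments, not a verified recipe:

```python
import torch

# Pick the GPU when one is available; the BiLSTM's matrix
# multiplications are what dominate encoding time, and they are
# far faster on CUDA than on the CPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Hypothetical loading steps, loosely following the repo README
# (names and paths are assumptions):
# from models import InferSent
# model = InferSent({'bsize': 64, 'word_emb_dim': 300, 'enc_lstm_dim': 2048,
#                    'pool_type': 'max', 'dpout_model': 0.0, 'version': 1})
# model.load_state_dict(torch.load('encoder/infersent1.pkl'))
# model = model.to(device)  # this step is what avoids the multi-hour CPU run
# embeddings = model.encode(sentences, bsize=128, tokenize=True)

print(device.type)
```

If `device.type` prints `cpu`, the slow timings you observed are expected; encoding on CUDA should bring 20k sentences down to tens of seconds.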
Please re-open if you still have the issue.
Hi, I was testing the InferSent model with 20k sentences and it took around 2 hours to produce results. Can you tell me why it is so time-consuming?