dmmiller612 / bert-extractive-summarizer

Easy to use extractive text summarization with BERT
MIT License

Run on TPU? #93

Open agenius5 opened 3 years ago

agenius5 commented 3 years ago

Is it possible to run this model on Google Colab TPUs? It's quite slow even on Colab's P100 GPU when I summarize lengthy pieces of text.

I tried changing the device to `self.device = xm.xla_device()` inside bert_parent.py; importing the torch_xla library was successful. But when I ran the model, there was no speedup. It only slowed down.
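
For reference, this is roughly the edit I made (a sketch, not the exact file contents; I'm assuming the Colab runtime already has torch_xla installed, and the surrounding class details here are only illustrative):

```python
# bert_parent.py (sketch of my edit, assuming torch_xla is installed)
import torch
import torch_xla.core.xla_model as xm


class BertParent:
    def __init__(self, model: str):
        # The original device selection picked CUDA when available:
        # self.device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
        # My change routes tensors to the Colab TPU instead:
        self.device = xm.xla_device()
        # ... the model/tokenizer are loaded as in the repo and then moved over:
        # self.model.to(self.device)
```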

So, is it possible to run it on a TPU? If yes, how?

It would be a great help if this works.

Thanks.