Closed sorinsfirlogea closed 10 months ago
Hey @sorinsfirlogea ,
Thank you for bringing this to our attention. I just checked everything end-to-end and I don't see any issues with the servers. Can you try deleting the English fastText model under ~/.fasttext? There might have been an issue with the download. Also, can you check that you have sufficient system memory? The fastText object uses about 6 GB of RAM, and NLP-Cube uses another 4-6 GB. Let me know if this helps.
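A minimal sketch of both troubleshooting steps, assuming a Linux box and the ~/.fasttext cache path mentioned above (the helper names here are illustrative, not part of the NLP-Cube API):

```python
import shutil
from pathlib import Path


def clear_fasttext_cache(cache_dir=None):
    """Delete the cached fastText model so it is re-downloaded on the next load.

    Returns True if a cache directory was found and removed, False otherwise.
    """
    cache_dir = Path(cache_dir) if cache_dir else Path.home() / ".fasttext"
    if cache_dir.exists():
        shutil.rmtree(cache_dir)
        return True
    return False


def available_ram_gb():
    """Rough available-RAM check (Linux-only: parses /proc/meminfo)."""
    with open("/proc/meminfo") as f:
        meminfo = dict(line.split(":", 1) for line in f)
    kb = int(meminfo["MemAvailable"].split()[0])
    return kb / 1024 ** 2


if __name__ == "__main__":
    clear_fasttext_cache()
    # fastText (~6 GB) plus NLP-Cube (4-6 GB) can need 10-12 GB together.
    print(f"Available RAM: {available_ram_gb():.1f} GB")
```

If the available figure is well under 10 GB, the crash is most likely memory-related rather than a corrupted download.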
There is, however, a small issue with the script itself. We need to push a change into cube, because the Document object is currently not iterable. You can use this simplified version instead:
```python
from cube.api import Cube

cube = Cube(verbose=True)
cube.load("en")
text = "All the faith he had had, had had no effect on the outcome of his life."
sentences = cube(text)
print(sentences)
```
It may be a memory problem, then. I don't have that much RAM on my box, which might be the explanation. Maybe you would consider adding a warning notice to the NLP-Cube manifest about the hardware requirements; it would be helpful for those curious to try it. Thanks for the quick reply.
Yes, you are right about this. You can give it a shot in Google Colab; it should have enough RAM for this.
I have successfully installed NLP-Cube on an Ubuntu 20.04 box running Python 3.9. I copied the example from the manifest file:
On the first call it loaded the English model (~/.nlpcube/3.0/en). When I run the script, I consistently encounter the following error:
I have checked and the model is in place. The version of fasttext is 0.9.2.
I would appreciate some help getting past this error. Thank you in advance.