Open jeromemassot opened 4 years ago
@jeromemassot I don't think you can run it on Colab (as it is).
The library starts a big Java server (the CoreNLP server) locally and then sends requests to it. I don't know how Colab works, but chances are you can't do this because the ports are closed...
One way to make it work would be to host the Java server on a remote machine (for example on AWS or GCP) and pass its IP:port to CoreNLPClient via the endpoint argument, with start_server=False. If you have time to try it, it would be great to see whether that works.
Right now the wrapper is responsible both for starting the Java server and for converting requests from Python to Java. If we can decouple the two and run only the "converter" on Colab, the chances that it works are fairly high imo.
Related issue: https://github.com/philipperemy/Stanford-OpenIE-Python/issues/21.
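To sketch the decoupling idea: once a CoreNLP server is running somewhere reachable, the Python side only needs to speak the server's REST protocol (POST the raw text, pass annotation properties in the query string). A minimal stdlib-only "converter" could look like the following; the endpoint URL and the helper names (build_request_url, extract_triples) are my own for illustration, not part of the wrapper:

```python
import json
import urllib.parse
import urllib.request

def build_request_url(endpoint, annotators="openie"):
    """Build a CoreNLP server URL carrying the annotation properties."""
    props = json.dumps({"annotators": annotators, "outputFormat": "json"})
    return endpoint.rstrip("/") + "/?properties=" + urllib.parse.quote(props)

def extract_triples(text, endpoint, timeout=30):
    """POST raw text to a running CoreNLP server; return OpenIE triples."""
    req = urllib.request.Request(build_request_url(endpoint),
                                 data=text.encode("utf-8"))
    with urllib.request.urlopen(req, timeout=timeout) as resp:
        ann = json.loads(resp.read().decode("utf-8"))
    return [(t["subject"], t["relation"], t["object"])
            for sent in ann["sentences"] for t in sent["openie"]]

# Hypothetical remote host -- replace with your own AWS/GCP server:
# triples = extract_triples("Obama was born in Hawaii.",
#                           "http://203.0.113.10:9000")
```

Since this uses only the standard library, it should run on Colab as long as the remote server's port is reachable from the notebook.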
Hi Philippe! I tried to run the code given in the README in a Jupyter notebook after installing the openie package, and I got this error: FileNotFoundError: [WinError 2] The system cannot find the file specified. Can you please help me out? I am a newbie to Python, so please excuse me if I have made a very silly error. Thanks!
Hi @philipperemy, I have been running OpenIE from my local Jupyter notebook for the past month and it worked fine. I'm getting timeout errors today and was wondering whether the server is down or something. If so, can you please provide a fix/workaround? Thanks.
PermanentlyFailedException: Timed out waiting for service to come alive.
Hi Philippe,
I am trying to use the wrapper from Colab, but I consistently hit an error when it tries to start the server: PermanentlyFailedException: Timed out waiting for service to come alive.
I have installed the CoreNLP library with the English .jar and updated the CLASSPATH, so if I can point the wrapper to my local Java module, it may work more easily than pointing to a remote server.
Is it possible to customize the Stanford-OpenIE-Python wrapper to use a local install?
Thanks
Best regards
Jerome
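One possible workaround for the question above: start the CoreNLP server manually from the local install, then have the wrapper's client connect to it instead of launching its own (which is the step that is timing out). A sketch, assuming CoreNLP is unzipped at ~/corenlp and java is on the PATH; adjust the path, memory, and port to your setup:

```shell
# Launch the CoreNLP server from a local install (path is an example).
cd ~/corenlp
java -mx4g -cp "*" edu.stanford.nlp.pipeline.StanfordCoreNLPServer \
    -port 9000 -timeout 30000
```

With the server alive, constructing the client with endpoint="http://localhost:9000" and start_server=False should skip the launch step entirely.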