gsi-upm / senpy

A sentiment and emotion analysis server in Python
http://senpy.gsi.upm.es
Apache License 2.0

Response is so slow #41

Closed by wiedersehne 6 years ago

wiedersehne commented 6 years ago

I run my analyses through c = Client('http://senpy.cluster.gsi.dit.upm.es/api'), but the response is very slow. Are there any tricks to speed it up, or should I fill in the parameter with localhost:xxxx instead? If so, please show me how. Thank you!

balkian commented 6 years ago

The public endpoint you are using is limited, and should only be used for testing and demonstration purposes. For more intensive applications, you are encouraged to run your own local instance (most of the code is open source).
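For context, switching from the public endpoint to a local instance only changes the base URL you pass in; the request itself is the same. The sketch below builds a senpy API URL by hand using only the standard library, so it runs without senpy installed. The parameter names ('input', 'algo') and the localhost port are assumptions based on senpy's HTTP API and should be checked against the project docs.

```python
from urllib.parse import urlencode

def build_senpy_request(endpoint, text, algorithm=None):
    """Build the GET URL for a senpy /api call.

    `endpoint` would be e.g. 'http://localhost:5000/api' for a
    local instance instead of the shared public one. The parameter
    names 'input' and 'algo' are assumed from the senpy HTTP API.
    """
    params = {'input': text}
    if algorithm:
        params['algo'] = algorithm
    return endpoint + '?' + urlencode(params)

# Point at a hypothetical local instance rather than the public endpoint.
url = build_senpy_request('http://localhost:5000/api',
                          'I love this!', 'sentiment140')
print(url)
```

The same idea applies to the Python client the reporter is using: constructing it with the local URL (e.g. Client('http://localhost:5000/api')) sends all traffic to your own instance.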

If you are a company and need any of our proprietary plugins, let us know so we can discuss alternatives.

I'm closing this issue. Feel free to open a new one if you run into performance issues with your own instance.

wiedersehne commented 6 years ago

I am a master's student, and I need to do emotion analysis for my research. Can you please help me get this working? Thank you!

balkian commented 6 years ago

I have just read your reply to this issue, my apologies.

The public endpoints can be used for moderate loads. We have students use them all the time. If you need to classify many texts, you can deploy a local instance. I suggest you take a look at some of our open source plugins: https://github.com/gsi-upm/senpy-plugins-community
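A deployment along those lines might look like the sketch below. The package name, repository URL, and CLI flags are assumptions based on the project README; check the current senpy documentation for the exact options before relying on them.

```shell
# Hypothetical local deployment sketch (flags assumed, verify against docs):
# install senpy from PyPI, fetch the open source community plugins,
# and start a local instance serving them.
pip install senpy
git clone https://github.com/gsi-upm/senpy-plugins-community.git
senpy --plugins-folder senpy-plugins-community --port 5000
```

Once the server is up, the client from the original report can be pointed at http://localhost:5000/api instead of the public endpoint.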

Unfortunately, the process is not always as seamless as we would like it to be. Although our code is open source, we cannot redistribute some of the resources we are using because they are licensed. We've tried to add pointers to the original resources, so you should be able to download them and run the service yourself.