ProjetPP / Documentation

Documentation and protocol specification of the Projet Pensées Profondes
Creative Commons Zero v1.0 Universal

AssertionError in libmodule-python when decoding the request. #70

Open kaushal2161 opened 7 years ago

kaushal2161 commented 7 years ago

Hi, I am writing a module that returns the mathematical formula for a given question, but I am having trouble with the JSON decoder. I get the following error when I POST a request, together with a 500 Internal Server Error. My module runs fine when I send requests through the classes directly, but POSTing to it fails. Command:

    python3 -m ppp_cli --api "http://localhost:8080/" --parse "(ENS de Lyon, location, ?)"

Traceback (most recent call last):
  File "/usr/lib/python3.5/runpy.py", line 184, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.5/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "/home/kaushal/workspace1/askplatyp_formula_retrieval/ppp_cli/__main__.py", line 29, in <module>
    main.main(args.api, args.id, args.language, args.parse, args.request, args.dot)
  File "/home/kaushal/workspace1/askplatyp_formula_retrieval/ppp_cli/main.py", line 59, in main
    data = requests.post(api, data=request.as_json()).json()
  File "/home/kaushal/.local/lib/python3.5/site-packages/requests/models.py", line 826, in json
    return complexjson.loads(self.text, **kwargs)
  File "/usr/lib/python3.5/json/__init__.py", line 319, in loads
    return _default_decoder.decode(s)
  File "/usr/lib/python3.5/json/decoder.py", line 339, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
  File "/usr/lib/python3.5/json/decoder.py", line 357, in raw_decode
    raise JSONDecodeError("Expecting value", s, err.value) from None
json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)

I have searched for the internal server error but found nothing that solves it. Any suggestions on how to deal with this problem?

progval commented 7 years ago

Hi,

The traceback you give is on the client side and is merely a consequence of the 500 error, so it does not help by itself.
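For what it's worth, the client-side JSONDecodeError only means the 500 response body was not JSON (it was a plain-text error page), so `.json()` has nothing to parse. A minimal sketch of that failure mode, using just the stdlib `json` module:

```python
import json

# The server answered 500 with a plain-text body, not JSON,
# so decoding it fails exactly like in the client traceback.
body = "Internal server error. Sorry :/"
try:
    json.loads(body)
except json.JSONDecodeError as e:
    print(e)  # Expecting value: line 1 column 1 (char 0)
```

Checking `response.status_code` (or calling `response.raise_for_status()`) before `.json()` would surface the raw 500 body instead of this secondary error.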

However, I see you originally submitted the server-side traceback, which is the cause of the 500 error:

File "/home/kaushal/workspace1/askplatyp_formula_retrieval/ppp_libmodule/http.py", line 124, in on_post
    return self.process_request(request)
  File "/home/kaushal/workspace1/askplatyp_formula_retrieval/ppp_libmodule/http.py", line 102, in process_request
    request = Request.from_json(request.read().decode())
  File "/home/kaushal/workspace1/askplatyp_formula_retrieval/ppp_datamodel/utils/serializableattributesholder.py", line 34, in from_json
    assert isinstance(data, str)
AssertionError

It means that the construction of the Request object from the data sent by the client failed (Request.from_json(request.read().decode())) because it expected a string but got something else (assert isinstance(data, str)).

This is likely an error in our code, not yours. I will investigate that.
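To illustrate (a simplified stand-in, not the actual ppp_datamodel code): `from_json` insists on being given the raw JSON string, so handing it an already-decoded object trips the assertion:

```python
import json

def from_json(data):
    # stand-in for SerializableAttributesHolder.from_json:
    # the input must be the raw JSON string, not a decoded object
    assert isinstance(data, str)
    return json.loads(data)

from_json('{"language": "en"}')    # fine: a JSON string
try:
    from_json({"language": "en"})  # already a dict -> AssertionError
except AssertionError:
    print("AssertionError, as in the server-side traceback")
```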

progval commented 7 years ago

I cannot reproduce the error. Can you give me the output of pip3 freeze?

kaushal2161 commented 7 years ago

apparmor==2.10.95 apparmor.click==0.3.13.1 apt-xapian-index==0.47 apturl==0.5.2 aspell-python-py3==1.15 beautifulsoup4==4.4.1 blinker==1.3 Brlapi==0.6.4 chardet==2.3.0 checkbox-ng==0.23 checkbox-support==0.22 click==0.4.43+16.4.20160203.0ubuntu2 colorama==0.3.7 command-not-found==0.3 cryptography==1.2.3 defer==1.0.6 distlib==0.2.2 example-ppp-module==0.3.1 feedparser==5.1.3 friends==0.1 guacamole==0.9.2 html5lib==0.999 httplib2==0.9.1 idna==2.0 isodate==0.5.4 Jinja2==2.8 jsonrpclib-pelix==0.2.8 language-selector==0.1 LibAppArmor==2.10.95 louis==2.6.4 lxml==3.5.0 Mako==1.0.3 MarkupSafe==0.23 mpmath==0.19 nltk==3.2.1 numpy==1.11.1 oauthlib==1.0.3 onboard==1.2.0 oneconf==0.3.9 padme==1.1.1 pexpect==4.0.1 Pillow==3.1.2 piston-mini-client==0.7.5 plainbox==0.25 ply==3.9 ppp-cas==0.8 ppp-french-parser==0.1.4 ppp-libmodule==0.7.7 ppp-logger==0.2.2 ppp-natural-math==0.3 ppp-questionparsing-ml-standalone==0.4 ppp-spell-checker==0.2.3 ptyprocess==0.5 pyasn1==0.1.9 pycrypto==2.6.1 pycups==1.9.73 pycurl==7.43.0 pygobject==3.20.0 PyJWT==1.3.0 pyparsing==2.0.3 python-apt==1.1.0b1 python-debian==0.1.27 python-systemd==231 python3-memcached==1.51 pyxdg==0.25 rdflib==4.1.2 recordtype==1.1 reportlab==3.3.0 repoze.lru==0.6 requests==2.11.1 sessioninstaller==0.0.0 six==1.10.0 software-center-aptd-plugins==0.0.0 SPARQLWrapper==1.6.4 SQLAlchemy==1.0.14 sympy==1.0 system-service==0.3 ubuntu-drivers-common==0.0.0 ufw==0.35 unattended-upgrades==0.1 unity-scope-audacious==0.1 unity-scope-calculator==0.1 unity-scope-chromiumbookmarks==0.1 unity-scope-clementine==0.1 unity-scope-colourlovers==0.1 unity-scope-devhelp==0.1 unity-scope-firefoxbookmarks==0.1 unity-scope-gdrive==0.7 unity-scope-gmusicbrowser==0.1 unity-scope-gourmet==0.1 unity-scope-guayadeque==0.1 unity-scope-manpages==0.1 unity-scope-musique==0.1 unity-scope-openclipart==0.1 unity-scope-texdoc==0.1 unity-scope-tomboy==0.1 unity-scope-virtualbox==0.1 unity-scope-yelp==0.1 unity-scope-zotero==0.1 urllib3==1.13.1 usb-creator==0.3.0 
xdiagnose==3.8.4 xkit==0.0.0 XlsxWriter==0.7.3

kaushal2161 commented 7 years ago

I am getting JSON for another question, as follows:

    {
      "language": "en",
      "id": "1473034964275-275-84-webui",
      "tree": {"type": "sentence", "value": "what is area of circle"},
      "measures": {"relevance": 0, "accuracy": 1},
      "trace": [
        {
          "module": "input",
          "tree": {"type": "sentence", "value": "what is area of circle"},
          "measures": {"relevance": 0, "accuracy": 1}
        }
      ]
    }
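That payload is well-formed JSON; for reference, a sketch of what a module sees once it is decoded (the string below is exactly the JSON above):

```python
import json

raw = ('{"language":"en","id":"1473034964275-275-84-webui",'
       '"tree":{"type":"sentence","value":"what is area of circle"},'
       '"measures":{"relevance":0,"accuracy":1},'
       '"trace":[{"module":"input","tree":{"type":"sentence",'
       '"value":"what is area of circle"},'
       '"measures":{"relevance":0,"accuracy":1}}]}')

request = json.loads(raw)
print(request["tree"]["value"])       # what is area of circle
print(request["trace"][0]["module"])  # input
```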

progval commented 7 years ago

Could you try with release versions of ppp_libmodule and ppp_datamodel, instead of git master?

kaushal2161 commented 7 years ago

I will try with the release versions of both modules and let you know whether the error is still there. Thanks.

kaushal2161 commented 7 years ago

As I am getting a dict instead of a str in the previous error, I changed from_json as follows:

    def from_json(cls, data):
        assert isinstance(str(data), str)
        data = json.loads(data)
        return cls.from_dict(data)

Then I am getting the following errors:

    WARNING:router:Module _Module(name=u'nlp_classical', url=u'http://localhost:9000/', coefficient=1, filters={u'whitelist': [u'sentence']}, method='http') returned 500: java.lang.RuntimeException: java.lang.RuntimeException: Error initializing coref system
    WARNING:router:Module _Module(name=u'ask_formularetrieval', url=u'http://localhost:9001/', coefficient=1, filters={}, method='http') returned 500: Internal server error. Sorry :/
    WARNING:router:Module _Module(name=u'platypus_core', url=u'http://core.frontend.askplatyp.us/', coefficient=1, filters={}, method='http') returned 405: Bad method, only POST is supported. See: https://github.com/ProjetPP/Documentation/blob/master/module-communication.md#frontend

Am I doing something wrong? I have tried many releases of both modules but get the same error.
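Note that `assert isinstance(str(data), str)` is always true, so the patch above disables the check rather than fixing it; the remaining 500s come from the modules themselves. A hypothetical sketch (not the actual library code) of what a tolerant `from_json` would look like, accepting either input:

```python
import json

def from_json(data):
    # hypothetical tolerant variant: accept either the raw JSON string
    # or an already-decoded dict (upstream asserts isinstance(data, str))
    if isinstance(data, str):
        data = json.loads(data)
    assert isinstance(data, dict)
    return data

assert from_json('{"language": "en"}') == {"language": "en"}
assert from_json({"language": "en"}) == {"language": "en"}
```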

progval commented 7 years ago

The first two errors are caused by the NLP. Ping @Ezibenroc. The last one was caused by an outdated part of the doc: you need HTTPS to connect to Platypus' core (fixed in 01d9c640cd90be58aafbd766d824cebaea601a6a)

kaushal2161 commented 7 years ago

Any update regarding this error? I am getting the same error again.

    Traceback (most recent call last):
      File "/home/kaushal/workspace1/askplatyp_formula_retrieval/ppp_libmodule/http.py", line 98, in on_post
        def process_request(self, request):
      File "/home/kaushal/workspace1/askplatyp_formula_retrieval/ppp_libmodule/http.py", line 80, in process_request
        get_process_time = getattr(time, 'process_time', None)
      File "/home/kaushal/workspace1/askplatyp_formula_retrieval/ppp_datamodel/communication/request.py", line 38, in from_json
        for x in attributes.get('trace', [])]
    AssertionError
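(Incidentally, the source lines shown in that traceback do not match their frames, which usually means the files on disk were edited after the server started.) The failing line iterates over the request's trace list, building an item from each entry; a sketch of how a malformed entry, e.g. a string instead of an object, trips the same assertion (illustrative names, not the actual ppp_datamodel code):

```python
import json

def trace_item_from_dict(item):
    # stand-in for TraceItem parsing: each trace entry must be an object
    assert isinstance(item, dict)
    return item

attributes = json.loads('{"trace": [{"module": "input"}]}')
items = [trace_item_from_dict(x) for x in attributes.get('trace', [])]

try:
    bad = json.loads('{"trace": ["input"]}')  # entry is a str, not a dict
    [trace_item_from_dict(x) for x in bad.get('trace', [])]
except AssertionError:
    print("AssertionError, as in the traceback above")
```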

kaushal2161 commented 7 years ago

Found the root of this error but I don't know how to fix it. Maybe you can give it a try. It is related to the CoreNLP server; I am getting the following error:

    Adding annotator mention
    java.io.IOException: Unable to open "edu/stanford/nlp/models/coref/hybrid/md-model-dep.ser.gz" as class path, filename or URL
      edu.stanford.nlp.io.IOUtils.getInputStreamFromURLOrClasspathOrFileSystem(IOUtils.java:478)
      edu.stanford.nlp.io.IOUtils.readObjectFromURLOrClasspathOrFileSystem(IOUtils.java:317)
      edu.stanford.nlp.coref.md.DependencyCorefMentionFinder.<init>(DependencyCorefMentionFinder.java:40)
      edu.stanford.nlp.pipeline.MentionAnnotator.getMentionFinder(MentionAnnotator.java:131)
      edu.stanford.nlp.pipeline.MentionAnnotator.<init>(MentionAnnotator.java:64)
      edu.stanford.nlp.pipeline.AnnotatorImplementations.mention(AnnotatorImplementations.java:217)
      edu.stanford.nlp.pipeline.AnnotatorFactories$12.create(AnnotatorFactories.java:380)
      edu.stanford.nlp.pipeline.AnnotatorPool.get(AnnotatorPool.java:152)
      edu.stanford.nlp.pipeline.StanfordCoreNLP.construct(StanfordCoreNLP.java:452)
      edu.stanford.nlp.pipeline.StanfordCoreNLP.<init>(StanfordCoreNLP.java:155)
      edu.stanford.nlp.pipeline.StanfordCoreNLP.<init>(StanfordCoreNLP.java:146)
      edu.stanford.nlp.pipeline.StanfordCoreNLPServer.mkStanfordCoreNLP(StanfordCoreNLPServer.java:231)
      edu.stanford.nlp.pipeline.StanfordCoreNLPServer.access$500(StanfordCoreNLPServer.java:41)
      edu.stanford.nlp.pipeline.StanfordCoreNLPServer$CoreNLPHandler.handle(StanfordCoreNLPServer.java:477)
      com.sun.net.httpserver.Filter$Chain.doFilter(Filter.java:79)
      sun.net.httpserver.AuthFilter.doFilter(AuthFilter.java:83)
      com.sun.net.httpserver.Filter$Chain.doFilter(Filter.java:82)
      sun.net.httpserver.ServerImpl$Exchange$LinkHandler.handle(ServerImpl.java:675)
      com.sun.net.httpserver.Filter$Chain.doFilter(Filter.java:79)
      sun.net.httpserver.ServerImpl$Exchange.run(ServerImpl.java:647)
      java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
      java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
      java.lang.Thread.run(Thread.java:745)
    Adding annotator natlog
    Adding annotator coref
    cannot create CorefAnnotator!
    java.lang.RuntimeException: Error initializing coref system
      at edu.stanford.nlp.coref.CorefSystem.<init>(CorefSystem.java:36)
      at edu.stanford.nlp.pipeline.CorefAnnotator.<init>(CorefAnnotator.java:55)
      at edu.stanford.nlp.pipeline.AnnotatorImplementations.coref(AnnotatorImplementations.java:227)
      at edu.stanford.nlp.pipeline.AnnotatorFactories$13.create(AnnotatorFactories.java:400)
      at edu.stanford.nlp.pipeline.AnnotatorPool.get(AnnotatorPool.java:152)
      at edu.stanford.nlp.pipeline.StanfordCoreNLP.construct(StanfordCoreNLP.java:452)
      at edu.stanford.nlp.pipeline.StanfordCoreNLP.<init>(StanfordCoreNLP.java:155)
      at edu.stanford.nlp.pipeline.StanfordCoreNLP.<init>(StanfordCoreNLP.java:146)
      at edu.stanford.nlp.pipeline.StanfordCoreNLPServer.mkStanfordCoreNLP(StanfordCoreNLPServer.java:231)
      at edu.stanford.nlp.pipeline.StanfordCoreNLPServer.access$500(StanfordCoreNLPServer.java:41)
      at edu.stanford.nlp.pipeline.StanfordCoreNLPServer$CoreNLPHandler.handle(StanfordCoreNLPServer.java:477)
      at com.sun.net.httpserver.Filter$Chain.doFilter(Filter.java:79)
      at sun.net.httpserver.AuthFilter.doFilter(AuthFilter.java:83)
      at com.sun.net.httpserver.Filter$Chain.doFilter(Filter.java:82)
      at sun.net.httpserver.ServerImpl$Exchange$LinkHandler.handle(ServerImpl.java:675)
      at com.sun.net.httpserver.Filter$Chain.doFilter(Filter.java:79)
      at sun.net.httpserver.ServerImpl$Exchange.run(ServerImpl.java:647)
      at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
      at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
      at java.lang.Thread.run(Thread.java:745)
    Caused by: java.lang.RuntimeException: Error loading word counts
      at edu.stanford.nlp.coref.statistical.FeatureExtractor.loadVocabulary(FeatureExtractor.java:95)
      at edu.stanford.nlp.coref.statistical.FeatureExtractor.<init>(FeatureExtractor.java:73)
      at edu.stanford.nlp.coref.statistical.StatisticalCorefAlgorithm.<init>(StatisticalCorefAlgorithm.java:59)
      at edu.stanford.nlp.coref.statistical.StatisticalCorefAlgorithm.<init>(StatisticalCorefAlgorithm.java:40)
      at edu.stanford.nlp.coref.CorefAlgorithm.fromProps(CorefAlgorithm.java:26)
      at edu.stanford.nlp.coref.CorefSystem.<init>(CorefSystem.java:33)
      ... 19 more
    Caused by: java.io.IOException: Unable to open "edu/stanford/nlp/models/coref/statistical/word_counts.ser.gz" as class path, filename or URL
      at edu.stanford.nlp.io.IOUtils.getInputStreamFromURLOrClasspathOrFileSystem(IOUtils.java:478)
      at edu.stanford.nlp.io.IOUtils.readObjectFromURLOrClasspathOrFileSystem(IOUtils.java:317)
      at edu.stanford.nlp.coref.statistical.FeatureExtractor.loadVocabulary(FeatureExtractor.java:88)
      ... 24 more
    java.lang.RuntimeException: java.lang.RuntimeException: Error initializing coref system
      at edu.stanford.nlp.pipeline.CorefAnnotator.<init>(CorefAnnotator.java:60)
      at edu.stanford.nlp.pipeline.AnnotatorImplementations.coref(AnnotatorImplementations.java:227)
      at edu.stanford.nlp.pipeline.AnnotatorFactories$13.create(AnnotatorFactories.java:400)
      at edu.stanford.nlp.pipeline.AnnotatorPool.get(AnnotatorPool.java:152)
      at edu.stanford.nlp.pipeline.StanfordCoreNLP.construct(StanfordCoreNLP.java:452)
      at edu.stanford.nlp.pipeline.StanfordCoreNLP.<init>(StanfordCoreNLP.java:155)
      at edu.stanford.nlp.pipeline.StanfordCoreNLP.<init>(StanfordCoreNLP.java:146)
      at edu.stanford.nlp.pipeline.StanfordCoreNLPServer.mkStanfordCoreNLP(StanfordCoreNLPServer.java:231)
      at edu.stanford.nlp.pipeline.StanfordCoreNLPServer.access$500(StanfordCoreNLPServer.java:41)
      at edu.stanford.nlp.pipeline.StanfordCoreNLPServer$CoreNLPHandler.handle(StanfordCoreNLPServer.java:477)
      at com.sun.net.httpserver.Filter$Chain.doFilter(Filter.java:79)
      at sun.net.httpserver.AuthFilter.doFilter(AuthFilter.java:83)
      at com.sun.net.httpserver.Filter$Chain.doFilter(Filter.java:82)
      at sun.net.httpserver.ServerImpl$Exchange$LinkHandler.handle(ServerImpl.java:675)
      at com.sun.net.httpserver.Filter$Chain.doFilter(Filter.java:79)
      at sun.net.httpserver.ServerImpl$Exchange.run(ServerImpl.java:647)
      at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
      at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
      at java.lang.Thread.run(Thread.java:745)
    Caused by: java.lang.RuntimeException: Error initializing coref system
      at edu.stanford.nlp.coref.CorefSystem.<init>(CorefSystem.java:36)
      at edu.stanford.nlp.pipeline.CorefAnnotator.<init>(CorefAnnotator.java:55)
      ... 18 more
    Caused by: java.lang.RuntimeException: Error loading word counts
      at edu.stanford.nlp.coref.statistical.FeatureExtractor.loadVocabulary(FeatureExtractor.java:95)
      at edu.stanford.nlp.coref.statistical.FeatureExtractor.<init>(FeatureExtractor.java:73)
      at edu.stanford.nlp.coref.statistical.StatisticalCorefAlgorithm.<init>(StatisticalCorefAlgorithm.java:59)
      at edu.stanford.nlp.coref.statistical.StatisticalCorefAlgorithm.<init>(StatisticalCorefAlgorithm.java:40)
      at edu.stanford.nlp.coref.CorefAlgorithm.fromProps(CorefAlgorithm.java:26)
      at edu.stanford.nlp.coref.CorefSystem.<init>(CorefSystem.java:33)
      ... 19 more
    Caused by: java.io.IOException: Unable to open "edu/stanford/nlp/models/coref/statistical/word_counts.ser.gz" as class path, filename or URL
      at edu.stanford.nlp.io.IOUtils.getInputStreamFromURLOrClasspathOrFileSystem(IOUtils.java:478)
      at edu.stanford.nlp.io.IOUtils.readObjectFromURLOrClasspathOrFileSystem(IOUtils.java:317)
      at edu.stanford.nlp.coref.statistical.FeatureExtractor.loadVocabulary(FeatureExtractor.java:88)
      ... 24 more

I have tried to use the CoreNLP server with the ppp_cli module for the same question used in the web UI. There it works fine and gives output. Input:

    python3 -m ppp_cli --api "http://localhost:9001/" --parse "what is volume of sphere?"

Output:

    [<Response {'measures': {'relevance': 1, 'accuracy': 1}, 'tree': <PPP node "resource" {'value': 'V = \\frac{4}{3} \\pi \\cdot r^3'}>, 'trace': [<TraceItem {'measures': {'relevance': 1, 'accuracy': 1}, 'tree': <PPP node "resource" {'value': 'V = \\frac{4}{3} \\pi \\cdot r^3'}>, 'times': {}, 'module': 'ask_formularetrieval'}>], 'language': 'en'}>]

Hope this helps to solve the issue. Ping @Ezibenroc. Looking forward to your response.