Closed atotalnoob closed 5 years ago
It seems to work just fine using GET instead of POST.
Is this an error in the documentation or what?
@atotalnoob I've added some debug output to the /parse endpoint that gets called when the JSON deserialisation fails. Do you mind trying your example with the latest master version and sharing the console log? (Please copy not only the exception but also the logger lines before it.)
Hey Tom, I'm having similar issue. Have used the latest master version.
Rasa Core version: 0.8.2, Rasa NLU version: 0.11.3
Python version: 3.6.3, Anaconda version: 5.1.0
TensorFlow version: 1.4.0 (I think this needs correcting in requirements.txt for Rasa Core, as there is no pip release of TensorFlow 1.4.1)
Operating system: Windows 7
The /parse and /continue endpoints are not working with curl -XPOST.
The /parse endpoint does respond to GET: payload = {"query": "Hi"}; response = requests.get('http://localhost:5005/conversations/default/parse', params=payload).json(); print(response)
The /continue endpoint is not working with either GET or POST. Since /continue is not working, I am unable to manage slot states.
Can we use requests.post() instead of curl?
Curl command executed: curl -XPOST localhost:5005/conversations/default/parse -d '{"query":"hi"}'
HTTP Server Log:
================
(py36) D:\BOT\dbbot>python -m rasa_core.server -d models/dialogue -u models/nlu/default/current -o out.log
2018-03-03 23:49:38 WARNING py.warnings - C:\Users\Sibaprasad\Anaconda2\envs\py36\lib\site-packages\h5py\__init__.py:34: FutureWarning: Conversion of the second argument of issubdtype from float to np.floating is deprecated. In future, it will be treated as np.float64 == np.dtype(float).type.
from ._conv import register_converters as _register_converters
Using TensorFlow backend.
2018-03-03 23:50:18+0530 [-] Log opened.
2018-03-03 23:50:18+0530 [-] Site starting on 5005
2018-03-03 23:50:18+0530 [-] Starting factory <twisted.web.server.Site object at 0x00000000384086D8>
2018-03-03 23:50:37+0530 [-] 2018-03-03 23:50:37 ERROR main - Failed to decode json during parse request. Error: Expecting value: line 1 column 1 (char 0). Request content: 'b"'{query:hi}'"'
2018-03-03 23:50:37+0530 [_GenericHTTPChannelProtocol,0,127.0.0.1] Unhandled Error
Traceback (most recent call last):
File "C:\Users\Sibaprasad\Anaconda2\envs\py36\lib\site-packages\twisted\web\server.py", line 255, in render
body = resrc.render(self)
File "C:\Users\Sibaprasad\Anaconda2\envs\py36\lib\site-packages\klein\resource.py", line 210, in render
d = defer.maybeDeferred(_execute)
File "C:\Users\Sibaprasad\Anaconda2\envs\py36\lib\site-packages\twisted\internet\defer.py", line 150, in maybeDeferred
result = f(*args, **kw)
File "C:\Users\Sibaprasad\Anaconda2\envs\py36\lib\site-packages\klein\resource.py", line 204, in _execute
**kwargs)
--- <exception caught here> ---
2018-03-03 23:50:37+0530 [-] "127.0.0.1" - - [03/Mar/2018:18:20:37 +0000] "POST /conversations/default/parse HTTP/1.1" 500 10107 "-" "curl/7.58.0"
Curl log attached for your reference: Curl.txt
=================================================
curl -XPOST localhost:5005/conversations/default/parse -d '{"query":"hello there"}'
Server Error Log: 2018-03-04 09:44:35+0530 [-] 2018-03-04 09:44:35 ERROR main - Failed to decode json during parse request. Error: Expecting value: line 1 column 1 (char 0). Request content: 'b"'{query:hello there}'"'
Hi all,
Putting the request in the format below works for me:
curl -XPOST localhost:5005/conversations/default/parse -d "{\"query\":\"hello there\"}"
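The escaping matters because the Windows shell passes single quotes through literally, so with the original command the server received the literal string '{query:hi}' (exactly what the error log above shows), which is not valid JSON. A small sketch, using only the standard library and no running server, of why one body parses and the other does not:

```python
import json

# What the server actually received per the error log above: the single
# quotes were passed through literally and the inner double quotes were lost.
broken_body = "'{query:hi}'"

# What the server needs: a valid JSON document, which the backslash-escaped
# curl command above produces.
valid_body = '{"query": "hi"}'

try:
    json.loads(broken_body)
except json.JSONDecodeError as e:
    print("broken body rejected:", e)  # Expecting value: line 1 column 1 (char 0)

print(json.loads(valid_body))  # {'query': 'hi'}
```

This reproduces the "Expecting value: line 1 column 1 (char 0)" message from the server log, which is json.loads rejecting the first character of the mangled body.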
@SibaprasadM Did you find a solution for http://localhost:5005/conversations/default/continue? My http://localhost:5005/conversations/default/parse works with the GET method (POST is not working), but http://localhost:5005/conversations/default/continue does not work with either GET or POST. Below is my sample Django code calling the API:
user_message = request.POST["text"]
response = requests.get("http://localhost:5005/conversations/default/parse", params={"query": user_message})
response = response.json()
next_action = response.get("next_action")
if next_action == "show_concert_reviews":
    response_text = "show_concert_reviews"  # "Sorry will get answer soon" #get_event(entities["day"],entities["time"],entities["place"])
elif next_action == "show_concert_reviews":
    response_text = "show_concert_reviews1"
else:
    response_text = "show_concert_reviews2"
print({"executed_action": next_action, "events": []})
response1 = requests.post("http://localhost:5005/conversations/default/continue", data={"executed_action": next_action, "events": []})
response1 = response1.json()
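A likely cause of the /continue failure in the snippet above: passing a dict via requests' data= parameter sends a form-encoded body, while the traceback in this thread shows the server calling json.loads on the raw request body. Sending the dict as a JSON string instead (requests.post(url, json=payload), or data=json.dumps(payload)) should satisfy the server. A sketch of the difference using only the standard library (the payload values are the hypothetical ones from the snippet above):

```python
import json
from urllib.parse import urlencode

payload = {"executed_action": "show_concert_reviews", "events": []}

# requests.post(url, data=payload) form-encodes the dict, roughly like this:
form_body = urlencode(payload)
print(form_body)  # e.g. executed_action=show_concert_reviews&events=%5B%5D

# The server runs json.loads() on the raw body, so a form body fails to parse:
try:
    json.loads(form_body)
except json.JSONDecodeError:
    print("form-encoded body is not JSON")

# requests.post(url, json=payload) serialises the dict to JSON instead,
# which round-trips cleanly through json.loads():
json_body = json.dumps(payload)
print(json.loads(json_body))
```

This also explains why the same request works from curl: curl's -d flag sends the string exactly as typed, so a hand-written JSON string reaches the server intact.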
This gives the error below for http://localhost:5005/conversations/default/continue:
Traceback (most recent call last):
File "/usr/local/lib/python3.5/site-packages/Twisted-16.2.0-py3.5.egg/twisted/web/server.py", line 234, in render
body = resrc.render(self)
File "/usr/local/lib/python3.5/site-packages/klein/resource.py", line 210, in render
d = defer.maybeDeferred(_execute)
File "/usr/local/lib/python3.5/site-packages/Twisted-16.2.0-py3.5.egg/twisted/internet/defer.py", line 150, in maybeDeferred
result = f(*args, **kw)
File "/usr/local/lib/python3.5/site-packages/klein/resource.py", line 204, in _execute
**kwargs)
--- <exception caught here> ---
File "/usr/local/lib/python3.5/site-packages/Twisted-16.2.0-py3.5.egg/twisted/internet/defer.py", line 150, in maybeDeferred
result = f(*args, **kw)
File "/usr/local/lib/python3.5/site-packages/klein/app.py", line 128, in execute_endpoint
return endpoint_f(self._instance, *args, **kwargs)
File "/usr/local/lib/python3.5/site-packages/klein/app.py", line 227, in _f
return _call(instance, f, request, *a, **kw)
File "/usr/local/lib/python3.5/site-packages/klein/app.py", line 50, in _call
result = f(*args, **kwargs)
File "/usr/local/lib/python3.5/site-packages/rasa_nlu/server.py", line 81, in decorated
return f(*args, **kwargs)
File "/usr/local/lib/python3.5/site-packages/rasa_core/server.py", line 126, in parse
request.content.read().decode('utf-8', 'strict'))
File "/usr/local/lib/python3.5/json/__init__.py", line 319, in loads
return _default_decoder.decode(s)
File "/usr/local/lib/python3.5/json/decoder.py", line 339, in decode
obj, end = self.raw_decode(s, idx=_w(s, 0).end())
File "/usr/local/lib/python3.5/json/decoder.py", line 357, in raw_decode
raise JSONDecodeError("Expecting value", s, err.value) from None
json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)
And if I call it with the GET method, it gives this 405 error: 2018-03-08 23:44:11+0530 [-] "127.0.0.1" - - [08/Mar/2018:18:14:11 +0000] "GET /conversations/default/continue?executed_action=show_concert_reviews HTTP/1.1" 405 178 "-" "python-requests/2.18.4"
But this request works with curl.
Could you please help me out with this?
@artiberde27
My server runs on Python 3. For the client I am using Python 2.7, and the code snippet below works for me (for both the parse and continue endpoints):
s_url = 'http://localhost:5005/conversations/default/parse'
payload = '{"query": "hi" }'
response = urllib2.Request(s_url, payload, {'Content-Type': 'application/json'})
f = urllib2.urlopen(response)
for x in f:
    resj = json.loads(x)
    print(resj)
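For Python 3 clients, the standard library's urllib.request offers an equivalent of the urllib2 snippet above. This is a sketch assuming the same local endpoint from this thread; building the request does not touch the network, so the headers and method can be inspected before a server is even running:

```python
import json
import urllib.request

# Endpoint from this thread; adjust to your setup.
s_url = "http://localhost:5005/conversations/default/parse"
payload = json.dumps({"query": "hi"}).encode("utf-8")

req = urllib.request.Request(s_url, data=payload,
                             headers={"Content-Type": "application/json"})

# A Request with a data body is sent as POST, with the header we set.
# (urllib normalises stored header names to capitalised form.)
print(req.get_method())                # POST
print(req.get_header("Content-type"))  # application/json

# With the server running, send it like this:
# with urllib.request.urlopen(req) as f:
#     print(json.loads(f.read().decode("utf-8")))
```

Note that the body is serialised with json.dumps rather than hand-written, which sidesteps the shell-quoting problems discussed earlier in the thread.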
Hi all, same here with Rasa Core 0.9.0 :(
Both rasa_nlu and rasa_core are trained.
serve.py
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
from __future__ import unicode_literals

import logging

from bot_server_channel import BotServerInputChannel
from rasa_core.channels import HttpInputChannel
from rasa_core import utils
from rasa_core.agent import Agent
from rasa_core.interpreter import RasaNLUInterpreter
from rasa_core.channels.channel import UserMessage
from rasa_core.channels.direct import CollectingOutputChannel
from rasa_core.channels.rest import HttpInputComponent
from flask import Blueprint, request, jsonify

logger = logging.getLogger(__name__)

def preprocessor(message_text):
    text = message_text.strip()
    return text

def run():
    interpreter = RasaNLUInterpreter("models/current/nlu_model", "config.yml", lazy_init=False)
    # path to your dialogue models
    agent = Agent.load("models/current/dialogue", interpreter=interpreter)
    channel = BotServerInputChannel(agent)
    agent.handle_channel(channel, message_preprocessor=preprocessor)
    return agent

if __name__ == '__main__':
    utils.configure_colored_logging(loglevel="INFO")
    run()
[root@ai nluEZ]# python serve.py
/usr/lib64/python2.7/site-packages/h5py/__init__.py:36: FutureWarning: Conversion of the second argument of issubdtype from float to np.floating is deprecated. In future, it will be treated as np.float64 == np.dtype(float).type.
from ._conv import register_converters as _register_converters
2018-07-03 08:10:48.338250: I tensorflow/core/platform/cpu_feature_guard.cc:140] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2 FMA
2018-07-03 08:10:48 INFO tensorflow - Restoring parameters from models/current/nlu_model/intent_classifier_tensorflow_embedding.ckpt
Using TensorFlow backend.
2018-07-03 08:10:48 INFO root - Started http server on port 5005
2018-07-03 08:10:48+0700 [-] Log opened.
2018-07-03 08:10:48+0700 [-] Site starting on 5005
2018-07-03 08:10:48+0700 [-] Starting factory <twisted.web.server.Site instance at 0x7f36a24835a8>
======
[root@ai nluEZ]# curl -XPOST localhost:5005/conversations/default/parse -d "{"query":"hey"}" <!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 3.2 Final//EN">
The method is not allowed for the requested URL.
curl: (3) [globbing] unmatched close brace/bracket at pos 6 [root@ai nluEZ]#
Please help me out with this problem.
Thanks a lot in advance.
Regards, Win
@Primtek Try using this: curl -XPOST localhost:5005/conversations/default/parse -d "{\"query\":\"hey\"}"
Use backslashes to escape the quotes around query and hey.
@amanvars I tried this. I am getting this error:
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 3.2 Final//EN">
<title>404 Not Found</title>
<h1>Not Found</h1>
<p>The requested URL was not found on the server. If you entered the URL manually please check your spelling and try again.</p>
Please create a new issue
I think it has something to do with Windows. In Insomnia, send the POST as a JSON body with content:
{
"query": "Hi"
}
In curl, use: curl -X POST localhost:5005/conversations/default/respond -d "{\"query\":\"hi\"}" | python -mjson.tool
Hi, how can we integrate Cortana and Siri with RASA?
Rasa Core version: 0.8.2, Rasa NLU 0.11.3
Python version: 3.6, Anaconda version 5.0.1
Operating system (windows, osx, ...): Windows Server 2012 R2 Standard
Issue:
Hi all!
I followed the instructions in the documentation to the letter, but I get an error when trying to run the Rasa Core HTTP server. I know the server is working: when I go to localhost:5005, I get a message that says "hello from Rasa Core: 0.8.2". I can also see that the requests are being received by the server.
Steps I took (all using example data):
Whenever I try to test it I get this error. I also tried from another computer, and the same issue presented itself. I ran the Anaconda prompt as admin. The server logs are empty, or I would post them.
Server output:
Curl:
Content of domain file (if used & relevant):