Ko-Ko-Kirk opened this issue 1 week ago
I fixed it and created a PR here: https://github.com/langchain-ai/langchain/pull/26683
Any updates on this? I'm having the same issue.
You can use my PR in the meantime. For example, in pyproject.toml, point langchain-community at my branch: langchain-community = { git = "https://github.com/Ko-Ko-Kirk/langchain.git", subdirectory = "libs/community", branch = "fix/serverless-api-400-bug" }
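For context, with Poetry that override lives in the dependencies table of pyproject.toml. A sketch, assuming a Poetry-managed project:

```toml
[tool.poetry.dependencies]
python = "^3.11"
# Temporarily install langchain-community from the fix branch
# instead of PyPI, until the PR is merged and released.
langchain-community = { git = "https://github.com/Ko-Ko-Kirk/langchain.git", subdirectory = "libs/community", branch = "fix/serverless-api-400-bug" }
```

Run `poetry lock` and `poetry install` afterward so the git dependency is resolved.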
Checked other resources
Example Code
Error Message and Stack Trace (if applicable)
Traceback (most recent call last):
  File "/Users/koko/Desktop/programming/xx/aml/aml/day10.py", line 33, in <module>
    response = llm.invoke("Hello")
               ^^^^^^^^^^^^^^^^^^^
  File "/Users/koko/Desktop/programming/xx/aml/.venv/lib/python3.11/site-packages/langchain_core/language_models/llms.py", line 391, in invoke
    self.generate_prompt(
  File "/Users/koko/Desktop/programming/xx/aml/.venv/lib/python3.11/site-packages/langchain_core/language_models/llms.py", line 756, in generate_prompt
    return self.generate(prompt_strings, stop=stop, callbacks=callbacks, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/koko/Desktop/programming/xx/aml/.venv/lib/python3.11/site-packages/langchain_core/language_models/llms.py", line 950, in generate
    output = self._generate_helper(
             ^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/koko/Desktop/programming/xx/aml/.venv/lib/python3.11/site-packages/langchain_core/language_models/llms.py", line 793, in _generate_helper
    raise e
  File "/Users/koko/Desktop/programming/xx/aml/.venv/lib/python3.11/site-packages/langchain_core/language_models/llms.py", line 780, in _generate_helper
    self._generate(
  File "/Users/koko/Desktop/programming/xx/aml/.venv/lib/python3.11/site-packages/langchain_community/llms/azureml_endpoint.py", line 544, in _generate
    response_payload = self.http_client.call(
                       ^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/koko/Desktop/programming/xx/aml/.venv/lib/python3.11/site-packages/langchain_community/llms/azureml_endpoint.py", line 57, in call
    response = urllib.request.urlopen(
               ^^^^^^^^^^^^^^^^^^^^^^^
  File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/urllib/request.py", line 216, in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/urllib/request.py", line 525, in open
    response = meth(req, response)
               ^^^^^^^^^^^^^^^^^^^
  File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/urllib/request.py", line 634, in http_response
    response = self.parent.error(
               ^^^^^^^^^^^^^^^^^^
  File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/urllib/request.py", line 563, in error
    return self._call_chain(*args)
           ^^^^^^^^^^^^^^^^^^^^^^^
  File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/urllib/request.py", line 496, in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
  File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/urllib/request.py", line 643, in http_error_default
    raise HTTPError(req.full_url, code, msg, hdrs, fp)
urllib.error.HTTPError: HTTP Error 400: Bad Request
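When debugging a 400 like this, it helps to read the error body the endpoint sends back, since it usually names the offending field; by default urllib raises and the body is lost. A small helper for that (the function name is mine, not part of langchain):

```python
import urllib.error


def describe_http_error(err: urllib.error.HTTPError) -> str:
    """Return the status code plus the server's error body.

    HTTPError is also a file-like response object, so the body
    the endpoint returned (which typically explains which field
    was rejected) can be read from it directly.
    """
    body = err.read().decode("utf-8", errors="replace")
    return f"HTTP {err.code}: {body}"
```

Wrapping the `llm.invoke("Hello")` call in `try/except urllib.error.HTTPError` and printing `describe_http_error(e)` shows the server's explanation instead of the bare "Bad Request".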
Description
I am trying to use AzureMLOnlineEndpoint with a serverless deployment (Llama 3.1 8B Instruct). I expected it to run successfully, but I got a 400 Bad Request. When I send the request with cURL instead, I get the answer successfully. Here is my cURL:
I traced the code in azureml_endpoint.py and found that the request body is wrong for the serverless API. It sends a request like
{"prompt": "Hello"}
, which does not match the format above. You can check the code in azureml_endpoint.py around lines 294-295. By the way,
format_response_payload
is wrong as well. The response via cURL is: there is no "text" field in the response JSON, but azureml_endpoint.py, around lines 326-327, parses a "text" field.
System Info
System Information
Package Information
Optional packages not installed
Other Dependencies