[Open] adamsah opened this issue 7 months ago
Code to reproduce the issue:
import openai
openai.api_key = "EMPTY"  # Not supported yet
openai.api_base = "https://<url>:8443/v1"
model = "Baichuan2-13B-Chat"
prompt = "hello"
# create a completion
completion = openai.Completion.create(model=model, prompt=prompt, max_tokens=64)
# print the completion
print(prompt + completion.choices[0].text)
# Test Embedding
embedding = openai.Embedding.create(model=model, input="Hello world!")
print(f"embedding len: {len(embedding['data'][0]['embedding'])}")
print(f"embedding value[:5]: {embedding['data'][0]['embedding'][:5]}")
Have you solved this yet? I ran into the same problem with chatglm3-6b.
I have the same issue occasionally, any solutions? Here is my exception traceback:
ERROR: Exception in ASGI application
Traceback (most recent call last):
  File "/home/ma-user/anaconda3/envs/fschat/lib/python3.9/site-packages/uvicorn/protocols/http/h11_impl.py", line 408, in run_asgi
    result = await app(  # type: ignore[func-returns-value]
  File "/home/ma-user/anaconda3/envs/fschat/lib/python3.9/site-packages/uvicorn/middleware/proxy_headers.py", line 84, in __call__
    return await self.app(scope, receive, send)
  File "/home/ma-user/anaconda3/envs/fschat/lib/python3.9/site-packages/fastapi/applications.py", line 1106, in __call__
    await super().__call__(scope, receive, send)
  File "/home/ma-user/anaconda3/envs/fschat/lib/python3.9/site-packages/starlette/applications.py", line 122, in __call__
    await self.middleware_stack(scope, receive, send)
  File "/home/ma-user/anaconda3/envs/fschat/lib/python3.9/site-packages/starlette/middleware/errors.py", line 184, in __call__
    raise exc
  File "/home/ma-user/anaconda3/envs/fschat/lib/python3.9/site-packages/starlette/middleware/errors.py", line 162, in __call__
    await self.app(scope, receive, _send)
  File "/home/ma-user/anaconda3/envs/fschat/lib/python3.9/site-packages/starlette/middleware/cors.py", line 83, in __call__
    await self.app(scope, receive, send)
  File "/home/ma-user/anaconda3/envs/fschat/lib/python3.9/site-packages/starlette/middleware/exceptions.py", line 79, in __call__
    raise exc
  File "/home/ma-user/anaconda3/envs/fschat/lib/python3.9/site-packages/starlette/middleware/exceptions.py", line 68, in __call__
    await self.app(scope, receive, sender)
  File "/home/ma-user/anaconda3/envs/fschat/lib/python3.9/site-packages/fastapi/middleware/asyncexitstack.py", line 20, in __call__
    raise e
  File "/home/ma-user/anaconda3/envs/fschat/lib/python3.9/site-packages/fastapi/middleware/asyncexitstack.py", line 17, in __call__
    await self.app(scope, receive, send)
  File "/home/ma-user/anaconda3/envs/fschat/lib/python3.9/site-packages/starlette/routing.py", line 718, in __call__
    await route.handle(scope, receive, send)
  File "/home/ma-user/anaconda3/envs/fschat/lib/python3.9/site-packages/starlette/routing.py", line 276, in handle
    await self.app(scope, receive, send)
  File "/home/ma-user/anaconda3/envs/fschat/lib/python3.9/site-packages/starlette/routing.py", line 66, in app
    response = await func(request)
  File "/home/ma-user/anaconda3/envs/fschat/lib/python3.9/site-packages/fastapi/routing.py", line 274, in app
    raw_response = await run_endpoint_function(
  File "/home/ma-user/anaconda3/envs/fschat/lib/python3.9/site-packages/fastapi/routing.py", line 191, in run_endpoint_function
    return await dependant.call(**values)
  File "/home/ma-user/anaconda3/envs/fschat/lib/python3.9/site-packages/fastchat/serve/openai_api_server.py", line 547, in create_completion
    if content["error_code"] != 0:
TypeError: string indices must be integers
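For context: the TypeError fires because the model worker returned a bare error string instead of a JSON dict, so `content["error_code"]` ends up indexing a `str` with a string key. A minimal standalone reproduction (the `content` value here is illustrative, not the actual worker payload):

```python
# Simulate the worker returning a plain string instead of a dict.
content = "Internal Server Error"

try:
    content["error_code"]  # indexing a str with a str key
except TypeError as exc:
    # e.g. "string indices must be integers" on Python 3.9/3.10
    print(exc)
```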
+1
Faced a similar issue. A quick fix is to convert the string to a dict inside openai_api_server.py: add
import ast
at the top, then after line number 434 add:
content = ast.literal_eval(content)
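A hedged sketch of what that patch amounts to (the helper name `normalize_content` is mine, for illustration; the actual fix just calls `ast.literal_eval` inline in the handler):

```python
import ast

def normalize_content(content):
    # On some error paths the worker response arrives as the *repr* of a
    # dict rather than a dict; parse it back so content["error_code"]
    # works. Non-string payloads pass through unchanged.
    if isinstance(content, str):
        content = ast.literal_eval(content)
    return content

fixed = normalize_content("{'error_code': 1, 'text': 'worker error'}")
assert fixed["error_code"] == 1
```

Note that `ast.literal_eval` raises on input that is not a Python literal (e.g. a bare "Internal Server Error" string), so this workaround only helps when the worker stringifies a dict.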
Has anyone resolved this problem?
I also hit this with ChatGLM3-6B at first. Debugging showed that ChatGLM3 has its own prompt format, and the example's prompt = "hello" doesn't follow it. After reworking the prompt into the <|system|>xxx<|user|>xxx format, the error went away.
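A minimal sketch of wrapping a plain prompt in those role tags before calling the completion endpoint (the exact template, including newlines, is an assumption based on the comment above; check ChatGLM3's own chat template before relying on it):

```python
def to_chatglm3_prompt(system: str, user: str) -> str:
    # ChatGLM3 expects role markers embedded in the raw prompt; a bare
    # string like "hello" does not match this template.
    return f"<|system|>\n{system}\n<|user|>\n{user}\n<|assistant|>\n"

prompt = to_chatglm3_prompt("You are a helpful assistant.", "hello")
```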
Using the Python openai client to call the FastChat API with baichuan2-13B, I received the "TypeError: string indices must be integers" error.
The full error message:

Traceback (most recent call last):
  File "C:\Users\Adams\anaconda3\lib\site-packages\openai\api_requestor.py", line 413, in handle_error_response
    error_data = resp["error"]
TypeError: string indices must be integers

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "D:/Programs/20230927/LLM/api_sample_test.py", line 13, in <module>
    completion = openai.Completion.create(model=model, prompt=prompt, max_tokens=64)
  File "C:\Users\Adams\anaconda3\lib\site-packages\openai\api_resources\completion.py", line 25, in create
    return super().create(*args, **kwargs)
  File "C:\Users\Adams\anaconda3\lib\site-packages\openai\api_resources\abstract\engine_api_resource.py", line 155, in create
    response, _, api_key = requestor.request(
  File "C:\Users\Adams\anaconda3\lib\site-packages\openai\api_requestor.py", line 299, in request
    resp, got_stream = self._interpret_response(result, stream)
  File "C:\Users\Adams\anaconda3\lib\site-packages\openai\api_requestor.py", line 710, in _interpret_response
    self._interpret_response_line(
  File "C:\Users\Adams\anaconda3\lib\site-packages\openai\api_requestor.py", line 775, in _interpret_response_line
    raise self.handle_error_response(
  File "C:\Users\Adams\anaconda3\lib\site-packages\openai\api_requestor.py", line 415, in handle_error_response
    raise error.APIError(
openai.error.APIError: Invalid response object from API: 'Internal Server Error' (HTTP response code was 500)

Process finished with exit code 1
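Until the server-side bug is patched, the intermittent 500 can be papered over client-side with a retry. A generic sketch (the `create_fn` callable stands in for `openai.Completion.create`; the retry count and backoff values are arbitrary assumptions):

```python
import time

def call_with_retry(create_fn, retries=3, backoff=1.0, **kwargs):
    # Retry transient 500s from the FastChat gateway, re-raising the
    # last exception if every attempt fails.
    last_exc = None
    for attempt in range(retries):
        try:
            return create_fn(**kwargs)
        except Exception as exc:
            last_exc = exc
            time.sleep(backoff * (attempt + 1))
    raise last_exc
```

Usage with the reproduce script above would look like `completion = call_with_retry(openai.Completion.create, model=model, prompt=prompt, max_tokens=64)`. This only masks the symptom; the ast.literal_eval patch (or a prompt in the model's expected format) addresses the cause.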