Closed xs818818 closed 1 year ago
Hi, we have fixed this bug; please try again.
The client works now, but the OpenAI-compatible endpoint still fails:

```bash
curl http://0.0.0.0:8000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-3.5-turbo",
    "messages": [{"role": "user", "content": "你是谁"}]
  }'
```
That said, it does run on CPU now, thanks. I hope someone replies to me on WeChat Work (企业微信) so I can join the group.
We have fixed the OpenAI chat endpoint as well.
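For anyone testing the fix from Python rather than curl: a minimal sketch of building the same request body (the helper name is my own, not from the repo; the port and route mirror the curl call in this thread):

```python
import json

# Hypothetical helper (not part of the repo): build the JSON body that an
# OpenAI-compatible /v1/chat/completions route expects.
def build_chat_request(content, model="gpt-3.5-turbo", stream=False):
    return {
        "model": model,
        "messages": [{"role": "user", "content": content}],
        "stream": stream,
    }

payload = build_chat_request("你是谁")
print(json.dumps(payload, ensure_ascii=False))
# To send it:
# requests.post("http://0.0.0.0:8000/v1/chat/completions",
#               json=payload, headers={"Content-Type": "application/json"})
```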
I changed

```python
model_path = "tigerbot-7b-chat"
model = AutoModelForCausalLM.from_pretrained(model_path, torch_dtype=torch.bfloat16, device_map='cpu')
```

and running client.py gives the following error:

```
ERROR:    Exception in ASGI application
Traceback (most recent call last):
  File "/home/xs/.local/lib/python3.8/site-packages/uvicorn/protocols/http/httptools_impl.py", line 426, in run_asgi
    result = await app(  # type: ignore[func-returns-value]
  File "/home/xs/.local/lib/python3.8/site-packages/uvicorn/middleware/proxy_headers.py", line 84, in __call__
    return await self.app(scope, receive, send)
  File "/home/xs/.local/lib/python3.8/site-packages/fastapi/applications.py", line 289, in __call__
    await super().__call__(scope, receive, send)
  File "/home/xs/.local/lib/python3.8/site-packages/starlette/applications.py", line 122, in __call__
    await self.middleware_stack(scope, receive, send)
  File "/home/xs/.local/lib/python3.8/site-packages/starlette/middleware/errors.py", line 184, in __call__
    raise exc
  File "/home/xs/.local/lib/python3.8/site-packages/starlette/middleware/errors.py", line 162, in __call__
    await self.app(scope, receive, _send)
  File "/home/xs/.local/lib/python3.8/site-packages/starlette/middleware/cors.py", line 83, in __call__
    await self.app(scope, receive, send)
  File "/home/xs/.local/lib/python3.8/site-packages/starlette/middleware/exceptions.py", line 79, in __call__
    raise exc
  File "/home/xs/.local/lib/python3.8/site-packages/starlette/middleware/exceptions.py", line 68, in __call__
    await self.app(scope, receive, sender)
  File "/home/xs/.local/lib/python3.8/site-packages/fastapi/middleware/asyncexitstack.py", line 20, in __call__
    raise e
  File "/home/xs/.local/lib/python3.8/site-packages/fastapi/middleware/asyncexitstack.py", line 17, in __call__
    await self.app(scope, receive, send)
  File "/home/xs/.local/lib/python3.8/site-packages/starlette/routing.py", line 718, in __call__
    await route.handle(scope, receive, send)
  File "/home/xs/.local/lib/python3.8/site-packages/starlette/routing.py", line 276, in handle
    await self.app(scope, receive, send)
  File "/home/xs/.local/lib/python3.8/site-packages/starlette/routing.py", line 69, in app
    await response(scope, receive, send)
  File "/home/xs/miniconda3/envs/tigerbot/lib/python3.8/site-packages/sse_starlette/sse.py", line 247, in __call__
    await wrap(partial(self.listen_for_disconnect, receive))
  File "/home/xs/.local/lib/python3.8/site-packages/anyio/_backends/_asyncio.py", line 597, in __aexit__
    raise exceptions[0]
  File "/home/xs/miniconda3/envs/tigerbot/lib/python3.8/site-packages/sse_starlette/sse.py", line 236, in wrap
    await func()
  File "/home/xs/miniconda3/envs/tigerbot/lib/python3.8/site-packages/sse_starlette/sse.py", line 221, in stream_response
    async for data in self.body_iterator:
  File "./apps/api.py", line 144, in event_generator
    for message in model.stream_chat(
  File "/home/xs/miniconda3/envs/tigerbot/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1614, in __getattr__
    raise AttributeError("'{}' object has no attribute '{}'".format(
AttributeError: 'LlamaForCausalLM' object has no attribute 'stream_chat'
```
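For context on the error: `stream_chat` is a custom method that only some remote-code model classes (e.g. ChatGLM) define; `LlamaForCausalLM` does not have it, which is why `api.py` line 144 raises `AttributeError`. The usual generic alternative is to run `model.generate` in a background thread feeding a `transformers.TextIteratorStreamer` and iterate over that streamer. The producer/consumer pattern this relies on can be sketched as follows (the function name is my own, not from the repo):

```python
import threading
from queue import Queue

def stream_chat_fallback(generate_fn, prompt):
    """Generic streaming wrapper: run a blocking generator in a background
    thread and yield its chunks as they arrive, the same queue-based pattern
    TextIteratorStreamer uses internally."""
    q = Queue()
    sentinel = object()  # marks end of generation

    def worker():
        # In the real server this would be model.generate(..., streamer=streamer);
        # here generate_fn stands in for any blocking token producer.
        for chunk in generate_fn(prompt):
            q.put(chunk)
        q.put(sentinel)

    threading.Thread(target=worker, daemon=True).start()
    while True:
        item = q.get()
        if item is sentinel:
            break
        yield item
```

With a real model this becomes: create `streamer = TextIteratorStreamer(tokenizer)`, start `model.generate(**inputs, streamer=streamer)` in a thread, then `for text in streamer: ...` in the SSE event generator, instead of calling a nonexistent `model.stream_chat`.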