DevXT-LLC / ezlocalai

ezlocalai is an easy to set up local artificial intelligence server with OpenAI Style Endpoints.
MIT License

Getting error with example request for /v1/chat/completions #9

Closed: Daksh closed this issue 8 months ago

Daksh commented 8 months ago

I opened http://localhost:8091/ in the web browser and ran the example request, and it returned a 500: Internal Server Error. When I checked the docker logs, they show TypeError: 'bool' object is not callable. Is anyone else facing this issue?

I am using Mistral-7B-OpenOrca

Request body:

{
  "model": "Mistral-7B-OpenOrca",
  "messages": [
    {}
  ],
  "temperature": 0.9,
  "top_p": 1,
  "functions": [
    {}
  ],
  "function_call": "string",
  "n": 1,
  "stream": false,
  "stop": [
    "string"
  ],
  "max_tokens": 8192,
  "presence_penalty": 0,
  "frequency_penalty": 0,
  "logit_bias": {
    "additionalProp1": 0,
    "additionalProp2": 0,
    "additionalProp3": 0
  },
  "user": "string"
}
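
For anyone reproducing this outside the browser UI, roughly the same request can be sent from Python. This is a minimal sketch, assuming the server is listening on the default port 8091 with no API key configured; the message content is a placeholder:

import requests

payload = {
    "model": "Mistral-7B-OpenOrca",
    "messages": [{"role": "user", "content": "Hello"}],
    "temperature": 0.9,
    "max_tokens": 8192,
}

# POST to the OpenAI-style endpoint; on the affected versions this
# returns the 500 below instead of a completion.
response = requests.post(
    "http://localhost:8091/v1/chat/completions",
    json=payload,
    timeout=120,
)
print(response.status_code, response.text)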

Error:

INFO:     192.168.65.1:40258 - "POST /v1/chat/completions HTTP/1.1" 500 Internal Server Error
ERROR:    Exception in ASGI application
Traceback (most recent call last):
  File "/usr/local/lib/python3.10/dist-packages/uvicorn/protocols/http/h11_impl.py", line 408, in run_asgi
    result = await app(  # type: ignore[func-returns-value]
  File "/usr/local/lib/python3.10/dist-packages/uvicorn/middleware/proxy_headers.py", line 84, in __call__
    return await self.app(scope, receive, send)
  File "/usr/local/lib/python3.10/dist-packages/fastapi/applications.py", line 1054, in __call__
    await super().__call__(scope, receive, send)
  File "/usr/local/lib/python3.10/dist-packages/starlette/applications.py", line 116, in __call__
    await self.middleware_stack(scope, receive, send)
  File "/usr/local/lib/python3.10/dist-packages/starlette/middleware/errors.py", line 186, in __call__
    raise exc
  File "/usr/local/lib/python3.10/dist-packages/starlette/middleware/errors.py", line 164, in __call__
    await self.app(scope, receive, _send)
  File "/usr/local/lib/python3.10/dist-packages/starlette/middleware/exceptions.py", line 62, in __call__
    await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
  File "/usr/local/lib/python3.10/dist-packages/starlette/_exception_handler.py", line 55, in wrapped_app
    raise exc
  File "/usr/local/lib/python3.10/dist-packages/starlette/_exception_handler.py", line 44, in wrapped_app
    await app(scope, receive, sender)
  File "/usr/local/lib/python3.10/dist-packages/starlette/routing.py", line 746, in __call__
    await route.handle(scope, receive, send)
  File "/usr/local/lib/python3.10/dist-packages/starlette/routing.py", line 288, in handle
    await self.app(scope, receive, send)
  File "/usr/local/lib/python3.10/dist-packages/starlette/routing.py", line 75, in app
    await wrap_app_handling_exceptions(app, request)(scope, receive, send)
  File "/usr/local/lib/python3.10/dist-packages/starlette/_exception_handler.py", line 55, in wrapped_app
    raise exc
  File "/usr/local/lib/python3.10/dist-packages/starlette/_exception_handler.py", line 44, in wrapped_app
    await app(scope, receive, sender)
  File "/usr/local/lib/python3.10/dist-packages/starlette/routing.py", line 70, in app
    response = await func(request)
  File "/usr/local/lib/python3.10/dist-packages/fastapi/routing.py", line 299, in app
    raise e
  File "/usr/local/lib/python3.10/dist-packages/fastapi/routing.py", line 294, in app
    raw_response = await run_endpoint_function(
  File "/usr/local/lib/python3.10/dist-packages/fastapi/routing.py", line 191, in run_endpoint_function
    return await dependant.call(**values)
  File "/app/app.py", line 91, in chat_completions
    return LLM(**c.model_dump()).chat(messages=c.messages)
  File "/app/local_llm/__init__.py", line 344, in chat
    data = self.generate(prompt=prompt)
  File "/app/local_llm/__init__.py", line 310, in generate
    formatted_prompt = format_prompt(
TypeError: 'bool' object is not callable
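
A note on the traceback for anyone else hitting this: TypeError: 'bool' object is not callable at the format_prompt( call site means the name format_prompt is bound to a boolean at the point of the call rather than to a function, typically because a flag of the same name shadowed it. A minimal sketch of that failure mode (hypothetical names, not the actual ezlocalai code):

def format_prompt(prompt):
    return f"### Prompt\n{prompt}"

class LLM:
    def __init__(self, format_prompt=True, **kwargs):
        # The boolean flag shadows the function of the same name
        # once it is stored and later called as an attribute.
        self.format_prompt = format_prompt

    def generate(self, prompt):
        # Calling the bool instead of the function reproduces the error.
        return self.format_prompt(prompt)

try:
    LLM().generate("hi")
except TypeError as e:
    print(e)  # 'bool' object is not callable
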
Daksh commented 8 months ago

It was not working for me with the latest image, but it does work with the cpu-f99ed54d76d3752a2af75979dd06f3ef52247a6d tag.
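
If you need to stay on a known-good build while latest is broken, the tagged image can be pulled explicitly. Sketch only; the image name below is a placeholder for whatever your docker-compose file references:

docker pull <image-name>:cpu-f99ed54d76d3752a2af75979dd06f3ef52247a6d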

Josh-XT commented 8 months ago

I believe I have this resolved now. I changed the Docker build strategy and had a few things to fix as a result. It should be working on the latest version at this time.

Navigate to your Local-LLM folder, then follow the instructions below to update:

To update the CPU version:

git pull
docker-compose pull

To update the Cuda version:

git pull
docker-compose -f docker-compose-cuda.yml pull
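
One note on the commands above: docker-compose pull only downloads the new image; the containers still have to be recreated to run it, which is standard docker-compose behavior rather than anything project-specific:

docker-compose up -d

or, for the Cuda version:

docker-compose -f docker-compose-cuda.yml up -d
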
Daksh commented 8 months ago

I still seem to be getting an error on 8874b95229c9476bc7d42cf488e7051b164d562b: ctypes.ArgumentError: argument 5: TypeError: expected LP_c_float instance instead of float
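
That ctypes.ArgumentError is raised by ctypes argument checking: a plain Python float was passed where the C prototype expects a pointer to c_float (LP_c_float), which usually indicates a version mismatch between the Python bindings and the compiled library they wrap. A self-contained illustration of the error class, not the actual call site:

import ctypes

# Hypothetical C prototype: void fn(float *value)
proto = ctypes.CFUNCTYPE(None, ctypes.POINTER(ctypes.c_float))
fn = proto(lambda ptr: None)

fn(ctypes.pointer(ctypes.c_float(1.0)))  # OK: a real LP_c_float

try:
    fn(1.0)  # plain float where a pointer is expected
except ctypes.ArgumentError as e:
    print(e)  # argument 1: ... expected LP_c_float instance instead of float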

Daksh commented 8 months ago

Do you have any particular version numbers/tags that are more stable than the others?

Josh-XT commented 8 months ago

The current latest version is working right now. A dependency update broke something overnight; I pinned it to a previous version and things appear to be working currently. I'll update it later once the bug is resolved.
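
For reference, a pin like the one described usually lives in requirements.txt. Illustrative only; the package name and version below are placeholders, not the dependency that actually broke:

some-dependency==1.2.3  # pinned to a known-good release until the upstream bug is fixed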