invoke-ai / InvokeAI

Invoke is a leading creative engine for Stable Diffusion models, empowering professionals, artists, and enthusiasts to generate and create visual media using the latest AI-driven technologies. The solution offers an industry leading WebUI, and serves as the foundation for multiple commercial products.
https://invoke-ai.github.io/InvokeAI/
Apache License 2.0

[bug]: FastAPI is broken in main #3038

Closed · ebr closed this issue 1 year ago

ebr commented 1 year ago

Is there an existing issue for this?

OS

Linux

GPU

cuda

VRAM

No response

What version did you experience this issue on?

17d8bbf3

What happened?

The Nodes API is broken for me on main (@ 17d8bbf3 as of right now). Any request returns a 500-type error, and attempting to load the docs results in "Failed to load API definition" (see screenshot).

This is on a fresh .venv.

Stack trace:

INFO:     Started server process [1532906]
INFO:     Waiting for application startup.
INFO:     Application startup complete.
INFO:     Uvicorn running on http://0.0.0.0:9090 (Press CTRL+C to quit)
* Initializing, be patient...
>> Internet connectivity is True
>> InvokeAI, version 3.0.0+a0
>> InvokeAI runtime directory is "/tmp/inv"
## NOT FOUND: GFPGAN model not found at /tmp/inv/models/gfpgan/GFPGANv1.4.pth
>> GFPGAN Disabled
## NOT FOUND: CodeFormer model not found at /tmp/inv/models/codeformer/codeformer.pth
>> CodeFormer Disabled
INFO:     192.168.200.130:37622 - "GET /docs HTTP/1.1" 200 OK
INFO:     192.168.200.130:37622 - "GET /openapi.json HTTP/1.1" 500 Internal Server Error
ERROR:    Exception in ASGI application
Traceback (most recent call last):
  File "/home/clipclop/Code/invokeai/.venv/lib/python3.10/site-packages/uvicorn/protocols/http/httptools_impl.py", line 436, in run_asgi
    result = await app(  # type: ignore[func-returns-value]
  File "/home/clipclop/Code/invokeai/.venv/lib/python3.10/site-packages/uvicorn/middleware/proxy_headers.py", line 78, in __call__
    return await self.app(scope, receive, send)
  File "/home/clipclop/Code/invokeai/.venv/lib/python3.10/site-packages/fastapi/applications.py", line 276, in __call__
    await super().__call__(scope, receive, send)
  File "/home/clipclop/Code/invokeai/.venv/lib/python3.10/site-packages/starlette/applications.py", line 122, in __call__
    await self.middleware_stack(scope, receive, send)
  File "/home/clipclop/Code/invokeai/.venv/lib/python3.10/site-packages/starlette/middleware/errors.py", line 184, in __call__
    raise exc
  File "/home/clipclop/Code/invokeai/.venv/lib/python3.10/site-packages/starlette/middleware/errors.py", line 162, in __call__
    await self.app(scope, receive, _send)
  File "/home/clipclop/Code/invokeai/.venv/lib/python3.10/site-packages/starlette/middleware/cors.py", line 84, in __call__
    await self.app(scope, receive, send)
  File "/home/clipclop/Code/invokeai/.venv/lib/python3.10/site-packages/fastapi_events/middleware.py", line 43, in __call__
    await self.app(scope, receive, send)
  File "/home/clipclop/Code/invokeai/.venv/lib/python3.10/site-packages/starlette/middleware/exceptions.py", line 79, in __call__
    raise exc
  File "/home/clipclop/Code/invokeai/.venv/lib/python3.10/site-packages/starlette/middleware/exceptions.py", line 68, in __call__
    await self.app(scope, receive, sender)
  File "/home/clipclop/Code/invokeai/.venv/lib/python3.10/site-packages/fastapi/middleware/asyncexitstack.py", line 21, in __call__
    raise e
  File "/home/clipclop/Code/invokeai/.venv/lib/python3.10/site-packages/fastapi/middleware/asyncexitstack.py", line 18, in __call__
    await self.app(scope, receive, send)
  File "/home/clipclop/Code/invokeai/.venv/lib/python3.10/site-packages/starlette/routing.py", line 718, in __call__
    await route.handle(scope, receive, send)
  File "/home/clipclop/Code/invokeai/.venv/lib/python3.10/site-packages/starlette/routing.py", line 276, in handle
    await self.app(scope, receive, send)
  File "/home/clipclop/Code/invokeai/.venv/lib/python3.10/site-packages/starlette/routing.py", line 66, in app
    response = await func(request)
  File "/home/clipclop/Code/invokeai/.venv/lib/python3.10/site-packages/fastapi/applications.py", line 231, in openapi
    return JSONResponse(self.openapi())
  File "/home/clipclop/Code/invokeai/invokeai/app/api_app.py", line 85, in custom_openapi
    openapi_schema = get_openapi(
  File "/home/clipclop/Code/invokeai/.venv/lib/python3.10/site-packages/fastapi/openapi/utils.py", line 423, in get_openapi
    definitions = get_model_definitions(
  File "/home/clipclop/Code/invokeai/.venv/lib/python3.10/site-packages/fastapi/utils.py", line 49, in get_model_definitions
    model_name = model_name_map[model]
KeyError: <class 'pydantic.main.CollectInvocation'>

Curiously, the KeyError refers to different classes between runs.

I tracked this down to the commit that pins fastapi and friends to newer versions: bc347f74. Checking out any commit prior to that (and rebuilding the .venv) restores normal behaviour.
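For context, the crash happens inside fastapi.utils.get_model_definitions, which looks each collected model up in a model-to-name map and raises KeyError for any model missing from the map. The toy sketch below (not InvokeAI code; the class names are hypothetical stand-ins for its invocation models) illustrates the failure mode; since the models are gathered into a set, iteration order varies between runs, which is consistent with the KeyError naming different classes each time.

```python
# Minimal illustration of the KeyError in get_model_definitions.
# GenerateInvocation / CollectInvocation are stand-ins, not real imports.

class GenerateInvocation:
    pass

class CollectInvocation:
    pass

# FastAPI builds a model -> schema-name map from the models it discovers
# while walking the app's routes.
model_name_map = {GenerateInvocation: "GenerateInvocation"}

# If a second collection pass (e.g. a custom_openapi() override) yields a
# model that is absent from the map, the lookup below crashes:
flat_models = {GenerateInvocation, CollectInvocation}

def get_definitions(models, name_map):
    definitions = {}
    for model in models:  # set order varies between runs
        definitions[name_map[model]] = {}  # KeyError for unmapped models
    return definitions

try:
    get_definitions(flat_models, model_name_map)
except KeyError as exc:
    print("KeyError on", exc.args[0].__name__)  # → KeyError on CollectInvocation
```

This is only a sketch of the symptom; the actual mismatch in InvokeAI presumably comes from how the pinned FastAPI version collects models for the OpenAPI schema.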

Screenshots

[screenshot: Swagger UI showing "Failed to load API definition"]

Additional context

No response

Contact Details

No response

thienbaogithub commented 1 year ago

My InvokeAI is running on macOS, but I get this error:

[2023-09-04 22:43:29,715]::[InvokeAI]::INFO --> Patchmatch initialized
[2023-09-04 22:43:30,071]::[uvicorn.error]::INFO --> Started server process [1225]
[2023-09-04 22:43:30,071]::[uvicorn.error]::INFO --> Waiting for application startup.
[2023-09-04 22:43:30,072]::[InvokeAI]::INFO --> InvokeAI version 3.1.0
[2023-09-04 22:43:30,072]::[InvokeAI]::INFO --> Root directory = /Users/thienbao/Downloads/InvokeAI-Installer
[2023-09-04 22:43:30,073]::[InvokeAI]::INFO --> GPU device = mps
[2023-09-04 22:43:30,089]::[InvokeAI]::INFO --> Scanning /Users/thienbao/Downloads/InvokeAI-Installer/models for new models
[2023-09-04 22:43:30,202]::[InvokeAI]::INFO --> Scanned 7 files and directories, imported 0 models
[2023-09-04 22:43:30,213]::[InvokeAI]::INFO --> Model manager service initialized
[2023-09-04 22:43:30,215]::[uvicorn.error]::INFO --> Application startup complete.
[2023-09-04 22:43:30,216]::[uvicorn.error]::INFO --> Uvicorn running on http://127.0.0.1:9090 (Press CTRL+C to quit)