monarch-initiative / ontogpt

LLM-based ontological extraction tools, including SPIRES
https://monarch-initiative.github.io/ontogpt/
BSD 3-Clause "New" or "Revised" License

Running web-ontogpt yields an "Internal Server Error" #374

Closed: ddooley closed this issue 1 month ago

ddooley commented 1 month ago

I am able to get the command-line ontogpt to work, but running "web-ontogpt", while it creates a form on localhost, yields a "ValueError: No client available because model source is unknown." error when I enter some text to analyze. I can't find any documentation on how to pass a parameter to control which model is chosen; e.g., "web-ontogpt -m gpt-3.5-turbo" doesn't seem to do anything.

Last part of the command-line error report:

...
  File "/Applications/anaconda3/lib/python3.11/site-packages/ontogpt/engines/knowledge_engine.py", line 192, in __post_init__
    raise ValueError("No client available because model source is unknown.")
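For context, this failure comes from a guard clause that runs when the extraction engine is constructed: if the configured model name cannot be mapped to a known client backend, initialization raises immediately. A minimal sketch of that pattern (the names KNOWN_SOURCES and resolve_client below are illustrative, not ontogpt's actual code):

```python
# Hypothetical sketch of a model-source guard like the one raising this error.
# KNOWN_SOURCES and resolve_client are illustrative names, not ontogpt's API.
KNOWN_SOURCES = {
    "gpt-3.5-turbo": "openai",
    "gpt-4": "openai",
}

def resolve_client(model: str) -> str:
    """Return the client source for a model name, or raise if unknown."""
    source = KNOWN_SOURCES.get(model)
    if source is None:
        raise ValueError("No client available because model source is unknown.")
    return source
```

Under this reading, the web form hitting the error means the webapp constructed the engine with a model name (or none at all) that the installed version's lookup table did not recognize.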

caufieldjh commented 1 month ago

Hi @ddooley - I think there are several out-of-date parts in the web demo. I'll add a quick fix.

caufieldjh commented 1 month ago

OK, it should work now, at least with the version in this repo.

ddooley commented 1 month ago

Much obliged!

ddooley commented 1 month ago

Hmm, after pulling your latest changes on the master branch, I'm still getting:

  File "/Applications/anaconda3/lib/python3.11/site-packages/ontogpt/engines/knowledge_engine.py", line 192, in __post_init__
    raise ValueError("No client available because model source is unknown.")
ValueError: No client available because model source is unknown.

I reran "pip install ontogpt[web]" but that didn't make a difference.

rkboyce commented 1 month ago

Hi - I experienced the same error:

$ web-ontogpt                 
WARNING:ontogpt.clients:llm_gpt4all module not found. GPT4All support will be disabled.
WARNING:ontogpt.engines.knowledge_engine:GPT4All client not available. GPT4All support will be disabled.
INFO:     Will watch for changes in these directories: ['/home/rdb20/DI_DIR']
INFO:     Uvicorn running on http://127.0.0.1:8000 (Press CTRL+C to quit)
INFO:     Started reloader process [308695] using WatchFiles
WARNING:ontogpt.clients:llm_gpt4all module not found. GPT4All support will be disabled.
WARNING:ontogpt.engines.knowledge_engine:GPT4All client not available. GPT4All support will be disabled.
INFO:     Started server process [308725]
INFO:     Waiting for application startup.
INFO:     Application startup complete.
INFO:     127.0.0.1:49630 - "GET / HTTP/1.1" 200 OK
INFO:     127.0.0.1:49630 - "GET /favicon.ico HTTP/1.1" 404 Not Found
Received request with schema drug.DrugMechanism
Received request with text Lisdexamfetamine is a prodrug of dextroamphetamine. Amphetamines are non-catecholamine sympathomimetic amines with CNS stimulant activity. The exact mode of therapeutic action in ADHD and BED is not known.

INFO:     127.0.0.1:52022 - "POST / HTTP/1.1" 500 Internal Server Error
ERROR:    Exception in ASGI application
Traceback (most recent call last):
  File "/home/rdb20/anaconda3/envs/kg_rag/lib/python3.10/site-packages/uvicorn/protocols/http/httptools_impl.py", line 426, in run_asgi
    result = await app(  # type: ignore[func-returns-value]
  File "/home/rdb20/anaconda3/envs/kg_rag/lib/python3.10/site-packages/uvicorn/middleware/proxy_headers.py", line 84, in __call__
    return await self.app(scope, receive, send)
  File "/home/rdb20/anaconda3/envs/kg_rag/lib/python3.10/site-packages/fastapi/applications.py", line 292, in __call__
    await super().__call__(scope, receive, send)
  File "/home/rdb20/anaconda3/envs/kg_rag/lib/python3.10/site-packages/starlette/applications.py", line 122, in __call__
    await self.middleware_stack(scope, receive, send)
  File "/home/rdb20/anaconda3/envs/kg_rag/lib/python3.10/site-packages/starlette/middleware/errors.py", line 184, in __call__
    raise exc
  File "/home/rdb20/anaconda3/envs/kg_rag/lib/python3.10/site-packages/starlette/middleware/errors.py", line 162, in __call__
    await self.app(scope, receive, _send)
  File "/home/rdb20/anaconda3/envs/kg_rag/lib/python3.10/site-packages/starlette/middleware/exceptions.py", line 79, in __call__
    raise exc
  File "/home/rdb20/anaconda3/envs/kg_rag/lib/python3.10/site-packages/starlette/middleware/exceptions.py", line 68, in __call__
    await self.app(scope, receive, sender)
  File "/home/rdb20/anaconda3/envs/kg_rag/lib/python3.10/site-packages/fastapi/middleware/asyncexitstack.py", line 20, in __call__
    raise e
  File "/home/rdb20/anaconda3/envs/kg_rag/lib/python3.10/site-packages/fastapi/middleware/asyncexitstack.py", line 17, in __call__
    await self.app(scope, receive, send)
  File "/home/rdb20/anaconda3/envs/kg_rag/lib/python3.10/site-packages/starlette/routing.py", line 718, in __call__
    await route.handle(scope, receive, send)
  File "/home/rdb20/anaconda3/envs/kg_rag/lib/python3.10/site-packages/starlette/routing.py", line 276, in handle
    await self.app(scope, receive, send)
  File "/home/rdb20/anaconda3/envs/kg_rag/lib/python3.10/site-packages/starlette/routing.py", line 66, in app
    response = await func(request)
  File "/home/rdb20/anaconda3/envs/kg_rag/lib/python3.10/site-packages/fastapi/routing.py", line 273, in app
    raw_response = await run_endpoint_function(
  File "/home/rdb20/anaconda3/envs/kg_rag/lib/python3.10/site-packages/fastapi/routing.py", line 192, in run_endpoint_function
    return await run_in_threadpool(dependant.call, **values)
  File "/home/rdb20/anaconda3/envs/kg_rag/lib/python3.10/site-packages/starlette/concurrency.py", line 41, in run_in_threadpool
    return await anyio.to_thread.run_sync(func, *args)
  File "/home/rdb20/anaconda3/envs/kg_rag/lib/python3.10/site-packages/anyio/to_thread.py", line 33, in run_sync
    return await get_asynclib().run_sync_in_worker_thread(
  File "/home/rdb20/anaconda3/envs/kg_rag/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 877, in run_sync_in_worker_thread
    return await future
  File "/home/rdb20/anaconda3/envs/kg_rag/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 807, in run
    result = context.run(func, *args)
  File "/home/rdb20/anaconda3/envs/kg_rag/lib/python3.10/site-packages/ontogpt/webapp/main.py", line 55, in form_post
    engine = get_engine(datamodel)
  File "/home/rdb20/anaconda3/envs/kg_rag/lib/python3.10/site-packages/ontogpt/webapp/main.py", line 40, in get_engine
    engines[datamodel] = SPIRESEngine(template_details=template_details)
  File "<string>", line 24, in __init__
  File "/home/rdb20/anaconda3/envs/kg_rag/lib/python3.10/site-packages/ontogpt/engines/knowledge_engine.py", line 192, in __post_init__
    raise ValueError("No client available because model source is unknown.")
ValueError: No client available because model source is unknown.

caufieldjh commented 1 month ago

This error should be fixed in the just-released ontogpt 0.3.12 - please reopen if the webapp still isn't working!

ddooley commented 1 month ago

Hmm, same result. When you say it works in the just-released ontogpt 0.3.12, does that mean it should also work within the context of the updated local GitHub main branch itself? I would presume so, but it isn't. Same error as above, and it reports its version as 0.3.12.

d.

caufieldjh commented 1 month ago

Strange! Yes, I expect this should be working with both the Pypi release of ontogpt 0.3.12 and the version on the main branch in this repo. If you are still seeing this error:

File "/Applications/anaconda3/lib/python3.11/site-packages/ontogpt/engines/knowledge_engine.py", line 192, in __post_init__
raise ValueError("No client available because model source is unknown.")

and it says the error is from line 192, then that copy is still out of date for some reason, because that error is no longer raised at that line. If it is raising the same error from knowledge_engine.py, line 183, then it's still the same root cause.

Do you get the same error if you run it as "poetry run web-ontogpt" versus "web-ontogpt" alone?

A more verbose logger output may also be helpful, so I'll add that option to the webapp runner.

ddooley commented 1 month ago

Yes, still happening on line 192.

[screenshot]

I don't know poetry at all; I tried your "poetry run web-ontogpt", but (after installing poetry) it errored out with "ModuleNotFoundError: No module named 'ontogpt'".

Maybe it's something basic. I updated to the latest master on GitHub, but "/Applications/anaconda3/lib/python3.11/site-packages/ontogpt/engines/knowledge_engine.py" is unchanged. How does "/site-packages/ontogpt/engines/knowledge_engine.py" get reloaded to a new version?
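One way to answer that question is to check exactly which file Python imports for a package and compare it with where pip says it installed; if the two disagree, the interpreter running web-ontogpt is not the one pip updated. A small diagnostic sketch (demonstrated with the stdlib "json" module so it runs anywhere; substitute "ontogpt"):

```python
import importlib.util

def locate(module_name: str) -> str:
    """Return the filesystem path Python would import for module_name."""
    spec = importlib.util.find_spec(module_name)
    if spec is None or spec.origin is None:
        return "<not found>"
    return spec.origin

# Substitute "ontogpt" for "json" to see which copy your interpreter loads.
# If the path is not inside the environment you ran pip in, pip is
# updating a different installation than the one actually being imported.
print(locate("json"))
```

Running "python -m pip install --upgrade ontogpt[web]" (with the same "python" that launches web-ontogpt) is the safest way to guarantee the imported copy is the one being updated.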

caufieldjh commented 1 month ago

OK, that's definitely out of date then. I'd suggest avoiding poetry entirely and just doing a "pip install ontogpt[web]" - the updates are all in that version. There may be some conflict between poetry and conda.

ddooley commented 1 month ago

So I did "pip install ontogpt[web]", but the same error on line 192 results, so that knowledge_engine.py isn't getting replaced. Maybe it isn't registered with a new version id?

ddooley commented 1 month ago

In other words, which command-line invocation causes "/Applications/anaconda3/lib/python3.11/site-packages/ontogpt/engines/knowledge_engine.py" to be updated?

caufieldjh commented 1 month ago

Will a "conda update ontogpt" do it? It looks like the pip install isn't updating what's in your conda env.
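The environment mismatch suspected here can be confirmed from inside Python: print which interpreter is running and which site-packages directory it searches, then compare those against the prefix pip reports when installing. A quick sketch:

```python
import sys
import sysconfig

# Which interpreter is running, and where does it look for packages?
print("interpreter:   ", sys.executable)
print("site-packages: ", sysconfig.get_paths()["purelib"])

# If these paths point at different environments (e.g. one under
# /Applications/anaconda3 and the other under a specific conda env),
# then "pip install" and "web-ontogpt" are not using the same
# installation, and upgrades will never reach the running copy.
```

If the paths disagree, activating the intended conda env and reinstalling with "python -m pip" from inside it keeps everything in one place.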

ddooley commented 4 weeks ago

Hi, so the issue was probably caused by some kind of misconfiguration on my new computer vis-à-vis the old one I transferred conda over from. It's working now, after a reinstall of conda!
Thanks for bearing with me!

caufieldjh commented 4 weeks ago

No worries! Glad there was a solution.