I get this error when installing fresh on Windows. It seems like the repo has changed and you need to accept its terms before downloading. I can do that manually, but where do I put the files from Hugging Face?
Ugh. When did that happen? I'll try to look into it on the weekend. Until then maybe try a different language model that's not gated.
I actually have the files already, I just don't know where to put them. Where does it store the Mistral model? I'll just copy the files manually.
I managed to fix the first issue by changing the model to MaziyarPanahi/Mistral-7B-Instruct-v0.1.
But now I'm getting this error after the web UI loads (ComfyUI is running alongside):
ERROR: Exception in ASGI application
Traceback (most recent call last):
File "C:\Python\lib\site-packages\uvicorn\protocols\http\h11_impl.py", line 407, in run_asgi
result = await app( # type: ignore[func-returns-value]
File "C:\Python\lib\site-packages\uvicorn\middleware\proxy_headers.py", line 69, in __call__
return await self.app(scope, receive, send)
File "C:\Python\lib\site-packages\fastapi\applications.py", line 1054, in __call__
await super().__call__(scope, receive, send)
File "C:\Python\lib\site-packages\starlette\applications.py", line 123, in __call__
await self.middleware_stack(scope, receive, send)
File "C:\Python\lib\site-packages\starlette\middleware\errors.py", line 186, in __call__
raise exc
File "C:\Python\lib\site-packages\starlette\middleware\errors.py", line 164, in __call__
await self.app(scope, receive, _send)
File "C:\Python\lib\site-packages\starlette\middleware\exceptions.py", line 65, in __call__
await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
File "C:\Python\lib\site-packages\starlette\_exception_handler.py", line 64, in wrapped_app
raise exc
File "C:\Python\lib\site-packages\starlette\_exception_handler.py", line 53, in wrapped_app
await app(scope, receive, sender)
File "C:\Python\lib\site-packages\starlette\routing.py", line 756, in __call__
await self.middleware_stack(scope, receive, send)
File "C:\Python\lib\site-packages\starlette\routing.py", line 776, in app
await route.handle(scope, receive, send)
File "C:\Python\lib\site-packages\starlette\routing.py", line 297, in handle
await self.app(scope, receive, send)
File "C:\Python\lib\site-packages\starlette\routing.py", line 77, in app
await wrap_app_handling_exceptions(app, request)(scope, receive, send)
File "C:\Python\lib\site-packages\starlette\_exception_handler.py", line 64, in wrapped_app
raise exc
File "C:\Python\lib\site-packages\starlette\_exception_handler.py", line 53, in wrapped_app
await app(scope, receive, sender)
File "C:\Python\lib\site-packages\starlette\routing.py", line 72, in app
response = await func(request)
File "C:\Python\lib\site-packages\fastapi\routing.py", line 278, in app
raw_response = await run_endpoint_function(
File "C:\Python\lib\site-packages\fastapi\routing.py", line 193, in run_endpoint_function
return await run_in_threadpool(dependant.call, **values)
File "C:\Python\lib\site-packages\starlette\concurrency.py", line 42, in run_in_threadpool
return await anyio.to_thread.run_sync(func, *args)
File "C:\Python\lib\site-packages\anyio\to_thread.py", line 56, in run_sync
return await get_async_backend().run_sync_in_worker_thread(
File "C:\Python\lib\site-packages\anyio\_backends\_asyncio.py", line 2144, in run_sync_in_worker_thread
return await future
File "C:\Python\lib\site-packages\anyio\_backends\_asyncio.py", line 851, in run
result = context.run(func, *args)
File "E:\AI-Apps\ai-alchemy\server.py", line 269, in image
description = get_image_description(cfg, card)
File "E:\AI-Apps\ai-alchemy\server.py", line 90, in get_image_description
return generate(messages, max_length=100)
File "E:\AI-Apps\ai-alchemy\server.py", line 48, in generate
generated_ids = generator.generate_batch(
RuntimeError: Library cublas64_12.dll is not found or cannot be loaded
INFO: 127.0.0.1:51196 - "GET /detective/image/Ms%20Scarlet HTTP/1.1" 500 Internal Server Error
{'role': 'user', 'content': 'We are playing a game about a detective solving a case. What do we get if we combine Knife and Victim?'}
{'role': 'assistant', 'content': 'Stabbing.'}
{'role': 'user', 'content': 'What do we get if we combine Secret and Witness?'}
{'role': 'assistant', 'content': 'Blackmail.'}
{'role': 'user', 'content': 'What do we get if we combine Blood and Detective?'}
{'role': 'assistant', 'content': 'Clue.'}
{'role': 'user', 'content': 'What do we get if we combine Mr Jones and Mr James?'}
{'role': 'assistant', 'content': 'Enemies.'}
{'role': 'user', 'content': 'What do we get if we combine Detective and Ms Scarlet?'}
INFO: 127.0.0.1:51201 - "GET /elemental/info HTTP/1.1" 200 OK
{'role': 'user', 'content': 'We are playing a game about merging things. What do we get if we combine Fire and Water?'}
{'role': 'assistant', 'content': 'Steam.'}
{'role': 'user', 'content': 'What do we get if we combine Fire and City?'}
{'role': 'assistant', 'content': 'Fire station.'}
{'role': 'user', 'content': 'What do we get if we combine Superman and Batman?'}
{'role': 'assistant', 'content': 'Superbatman.'}
{'role': 'user', 'content': 'What do we get if we combine Human and Stone?'}
{'role': 'assistant', 'content': 'Dwarf.'}
{'role': 'user', 'content': 'What do we get if we combine Sea and Life?'}
{'role': 'assistant', 'content': 'Fish.'}
{'role': 'user', 'content': 'What do we get if we combine Blacksmith and Metal?'}
{'role': 'assistant', 'content': 'Axe.'}
{'role': 'user', 'content': 'What do we get if we combine Fire and Water?'}
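For context, a minimal sketch of how a message list like the one above can be flattened into a single prompt string, assuming a transformers tokenizer with a chat template (the actual code in server.py may differ):

from transformers import AutoTokenizer

# Assumption: the ungated mirror carries the same chat template as the
# original Mistral instruct repo.
tokenizer = AutoTokenizer.from_pretrained("MaziyarPanahi/Mistral-7B-Instruct-v0.1")
messages = [
    {'role': 'user', 'content': 'We are playing a game about merging things. What do we get if we combine Fire and Water?'},
    {'role': 'assistant', 'content': 'Steam.'},
    {'role': 'user', 'content': 'What do we get if we combine Fire and City?'},
]
# apply_chat_template inserts the [INST] ... [/INST] markers Mistral expects.
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
print(prompt)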
For the model path: Maybe https://stackoverflow.com/questions/61798573/where-does-hugging-faces-transformers-save-models helps.
For cublas64_12.dll: Maybe you're missing https://developer.nvidia.com/cuda-downloads?
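For reference, a quick way to print where downloads land by default (a sketch; it assumes the standard Hugging Face cache layout and only checks one of the override variables):

import os
from pathlib import Path

# Default Hugging Face Hub cache. A custom HF_HUB_CACHE (or HF_HOME)
# environment variable would override this; this sketch only checks the former.
cache = os.environ.get("HF_HUB_CACHE", Path.home() / ".cache" / "huggingface" / "hub")
print(cache)  # e.g. C:\Users\<you>\.cache\huggingface\hub on Windows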
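And on the CUDA side, a quick check that the CUDA 12 runtime is actually visible, assuming the CTranslate2 backend that the generate_batch call in the traceback points to:

import ctranslate2

# Returns 0 if the CUDA runtime (including cublas64_12.dll) cannot be loaded.
print(ctranslate2.get_cuda_device_count())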
I managed to fix the first issue by changing the code to fetch MaziyarPanahi/Mistral-7B-Instruct-v0.1. I also managed to fix the cublas issue: I had CUDA 11.8, installed CUDA 12, and now it works.
But now I get this error:
INFO: 127.0.0.1:51785 - "GET /troops/image/Training HTTP/1.1" 500 Internal Server Error
ERROR: Exception in ASGI application
Traceback (most recent call last):
File "C:\Python\lib\site-packages\uvicorn\protocols\http\h11_impl.py", line 407, in run_asgi
result = await app( # type: ignore[func-returns-value]
File "C:\Python\lib\site-packages\uvicorn\middleware\proxy_headers.py", line 69, in __call__
return await self.app(scope, receive, send)
File "C:\Python\lib\site-packages\fastapi\applications.py", line 1054, in __call__
await super().__call__(scope, receive, send)
File "C:\Python\lib\site-packages\starlette\applications.py", line 123, in __call__
await self.middleware_stack(scope, receive, send)
File "C:\Python\lib\site-packages\starlette\middleware\errors.py", line 186, in __call__
raise exc
File "C:\Python\lib\site-packages\starlette\middleware\errors.py", line 164, in __call__
await self.app(scope, receive, _send)
File "C:\Python\lib\site-packages\starlette\middleware\exceptions.py", line 65, in __call__
await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
File "C:\Python\lib\site-packages\starlette\_exception_handler.py", line 64, in wrapped_app
raise exc
File "C:\Python\lib\site-packages\starlette\_exception_handler.py", line 53, in wrapped_app
await app(scope, receive, sender)
File "C:\Python\lib\site-packages\starlette\routing.py", line 756, in __call__
await self.middleware_stack(scope, receive, send)
File "C:\Python\lib\site-packages\starlette\routing.py", line 776, in app
await route.handle(scope, receive, send)
File "C:\Python\lib\site-packages\starlette\routing.py", line 297, in handle
await self.app(scope, receive, send)
File "C:\Python\lib\site-packages\starlette\routing.py", line 77, in app
await wrap_app_handling_exceptions(app, request)(scope, receive, send)
File "C:\Python\lib\site-packages\starlette\_exception_handler.py", line 64, in wrapped_app
raise exc
File "C:\Python\lib\site-packages\starlette\_exception_handler.py", line 53, in wrapped_app
await app(scope, receive, sender)
File "C:\Python\lib\site-packages\starlette\routing.py", line 72, in app
response = await func(request)
File "C:\Python\lib\site-packages\fastapi\routing.py", line 278, in app
raw_response = await run_endpoint_function(
File "C:\Python\lib\site-packages\fastapi\routing.py", line 193, in run_endpoint_function
return await run_in_threadpool(dependant.call, **values)
File "C:\Python\lib\site-packages\starlette\concurrency.py", line 42, in run_in_threadpool
return await anyio.to_thread.run_sync(func, *args)
File "C:\Python\lib\site-packages\anyio\to_thread.py", line 56, in run_sync
return await get_async_backend().run_sync_in_worker_thread(
File "C:\Python\lib\site-packages\anyio\_backends\_asyncio.py", line 2144, in run_sync_in_worker_thread
return await future
File "C:\Python\lib\site-packages\anyio\_backends\_asyncio.py", line 851, in run
result = context.run(func, *args)
File "E:\AI-Apps\ai-alchemy\server.py", line 270, in image
imagedata = comfy.generate_image(cfg, f"{card}: {description}")
File "E:\AI-Apps\ai-alchemy\server.py", line 139, in generate_image
prompt_id = self.queue_prompt(prompt, client_id)["prompt_id"]
File "E:\AI-Apps\ai-alchemy\server.py", line 112, in queue_prompt
return json.loads(urllib.request.urlopen(req).read())
File "C:\Python\lib\urllib\request.py", line 216, in urlopen
return opener.open(url, data, timeout)
File "C:\Python\lib\urllib\request.py", line 525, in open
response = meth(req, response)
File "C:\Python\lib\urllib\request.py", line 634, in http_response
response = self.parent.error(
File "C:\Python\lib\urllib\request.py", line 563, in error
return self._call_chain(*args)
File "C:\Python\lib\urllib\request.py", line 496, in _call_chain
result = func(*args)
File "C:\Python\lib\urllib\request.py", line 643, in http_error_default
raise HTTPError(req.full_url, code, msg, hdrs, fp)
urllib.error.HTTPError: HTTP Error 400: Bad Request
I can see this error in ComfyUI:
Failed to validate prompt for output 25:
* SDTurboScheduler 22:
- Required input is missing: denoise
Output will be ignored
invalid prompt: {'type': 'prompt_outputs_failed_validation', 'message': 'Prompt outputs failed validation', 'details': '', 'extra_info': {}}
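The HTTP 400 traceback above hides the interesting part: urllib raises before the response body is read, and ComfyUI puts the validation details in that body. A hypothetical tweak to the queue_prompt helper from the traceback that would surface them (the exact signature in server.py is an assumption):

import json
import urllib.error
import urllib.request

def queue_prompt(prompt, client_id, host="127.0.0.1:8188"):
    # ComfyUI's API accepts the workflow on POST /prompt.
    data = json.dumps({"prompt": prompt, "client_id": client_id}).encode()
    req = urllib.request.Request(f"http://{host}/prompt", data=data)
    try:
        return json.loads(urllib.request.urlopen(req).read())
    except urllib.error.HTTPError as e:
        # On a 400, the body carries the validation report, e.g.
        # prompt_outputs_failed_validation and the offending node.
        print(e.read().decode())
        raise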
Awesome! I'm not sure about prompt_outputs_failed_validation. From what I can find, people seem to have this error when the ComfyUI workflow is trying to use a model checkpoint that is not downloaded. To check for that, you could try to look at the ComfyUI UI. Try loading the workflow from https://comfyanonymous.github.io/ComfyUI_examples/sdturbo/. If you can get that working, the copy in AI Alchemy should work too. It's basically the same.
I downloaded the workflow and made sure I can actually generate images using sdxl-turbo; it works fine in ComfyUI. But I still get this error when running ai-alchemy, and the images won't generate:
Failed to validate prompt for output 25:
* SDTurboScheduler 22:
- Required input is missing: denoise
Output will be ignored
invalid prompt: {'type': 'prompt_outputs_failed_validation', 'message': 'Prompt outputs failed validation', 'details': '', 'extra_info': {}}
Maybe the way ComfyUI processes the prompt has changed, and we need to adjust the comfyui_workflow.json file?
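If that's the cause, a minimal sketch of such an adjustment, patching the workflow at load time rather than editing the JSON by hand (node id 22 comes from the validation error above; the denoise value of 1.0 is an assumption, being the usual setting for SD Turbo):

import json

with open("comfyui_workflow.json") as f:
    workflow = json.load(f)

# The validation error names node 22 (SDTurboScheduler) and the missing input.
node = workflow["22"]
assert node["class_type"] == "SDTurboScheduler"
node["inputs"].setdefault("denoise", 1.0)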
Just to confirm: I did a clean install, and the gated repo issue can be fixed with MaziyarPanahi/Mistral-7B-Instruct-v0.1, but the issue with ComfyUI asking for the denoise parameter is still present. I would love to find a fix for it, as my kid is obsessed with this game haha
Thanks for the summary! I've managed to reproduce the issue with a fresh ComfyUI. I'll try to figure out a fix.
It's working!
You only really need this one line: https://github.com/darabos/ai-alchemy/commit/b2c3c957846f1d24eeb0a09f9b1f2ff67e61ce71#diff-dfce5e18b7315f6ad7d0c77f3224eccf23d4836afb6dc3eda674f66e2d5c351bR111
Or you can just update the whole repo. (I switched to a model that can be used without login too. I went for Mistral-7b 0.2. Maybe it's smarter than 0.1? I also tried Phi-3 but it was writing me an essay instead of giving a straight answer. 😄 )
I hope it will work for you now!
Can confirm it's fixed with the latest git pull! Thank you so much!
One thing I noticed is that it's now not role-playing and sometimes doesn't comply with the prompt.
Oh! Probably best to go back to Mistral-7b 0.1 then...
Changing to Praise2112/Mistral-7B-Instruct-v0.1-int8-ct2 seems to perform better; however, a lot of regeneration is needed because it often generates the name with a lot of </s> tokens.
Edit: I tried Llama-3-8B-Instruct (jncraton/Meta-Llama-3-8B-Instruct-ct2-int8) and it works perfectly!
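For anyone hitting the stray </s> output, a tiny workaround sketch, assuming the generated text is post-processed somewhere like get_image_description in server.py (the helper name below is hypothetical):

def clean_generation(text: str) -> str:
    # Strip end-of-sequence markers that the model sometimes emits as literal text.
    return text.replace("</s>", "").strip()

print(clean_generation("Poison</s></s>"))  # -> "Poison"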