Describe the bug
Downloaded a LoRA from Civitai, put it in the LoRA folder, and tried to load it, but it gives me the error below.
Is there an existing issue for this?
[X] I have searched the existing issues
Reproduction
With a model loaded via the llama.cpp loader, try to load the LoRA in the WebUI.
Screenshot
No response
Logs
Traceback (most recent call last):
File "C:\AI\text-generation-webui-main\installer_files\env\Lib\site-packages\gradio\queueing.py", line 566, in process_events
response = await route_utils.call_process_api(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\AI\text-generation-webui-main\installer_files\env\Lib\site-packages\gradio\route_utils.py", line 261, in call_process_api
output = await app.get_blocks().process_api(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\AI\text-generation-webui-main\installer_files\env\Lib\site-packages\gradio\blocks.py", line 1786, in process_api
result = await self.call_function(
^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\AI\text-generation-webui-main\installer_files\env\Lib\site-packages\gradio\blocks.py", line 1350, in call_function
prediction = await utils.async_iteration(iterator)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\AI\text-generation-webui-main\installer_files\env\Lib\site-packages\gradio\utils.py", line 583, in async_iteration
return await iterator.__anext__()
^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\AI\text-generation-webui-main\installer_files\env\Lib\site-packages\gradio\utils.py", line 576, in __anext__
return await anyio.to_thread.run_sync(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\AI\text-generation-webui-main\installer_files\env\Lib\site-packages\anyio\to_thread.py", line 56, in run_sync
return await get_async_backend().run_sync_in_worker_thread(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\AI\text-generation-webui-main\installer_files\env\Lib\site-packages\anyio\_backends\_asyncio.py", line 2144, in run_sync_in_worker_thread
return await future
^^^^^^^^^^^^
File "C:\AI\text-generation-webui-main\installer_files\env\Lib\site-packages\anyio\_backends\_asyncio.py", line 851, in run
result = context.run(func, *args)
^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\AI\text-generation-webui-main\installer_files\env\Lib\site-packages\gradio\utils.py", line 559, in run_sync_iterator_async
return next(iterator)
^^^^^^^^^^^^^^
File "C:\AI\text-generation-webui-main\installer_files\env\Lib\site-packages\gradio\utils.py", line 742, in gen_wrapper
response = next(iterator)
^^^^^^^^^^^^^^
File "C:\AI\text-generation-webui-main\modules\ui_model_menu.py", line 270, in load_lora_wrapper
add_lora_to_model(selected_loras)
File "C:\AI\text-generation-webui-main\modules\LoRA.py", line 18, in add_lora_to_model
add_lora_transformers(lora_names)
File "C:\AI\text-generation-webui-main\modules\LoRA.py", line 120, in add_lora_transformers
params['dtype'] = shared.model.dtype
^^^^^^^^^^^^^^^^^^
AttributeError: 'LlamaCppModel' object has no attribute 'dtype'
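The traceback shows that `add_lora_to_model` routed into `add_lora_transformers`, which reads `shared.model.dtype` — an attribute only Transformers-backed models expose, not `LlamaCppModel` (llama.cpp/GGUF). A minimal defensive sketch of the kind of guard that would surface this cleanly (the function and model names are illustrative, modeled on the traceback; this is an assumption about a possible fix, not the project's actual code):

```python
# Sketch: guard the dtype access so non-Transformers backends raise a clear
# error instead of an AttributeError. Names mirror the traceback; the real
# fix in text-generation-webui may differ.
def build_lora_params(model):
    dtype = getattr(model, "dtype", None)
    if dtype is None:
        # llama.cpp-backed models carry no torch dtype; Transformers-style
        # LoRA loading cannot apply to them.
        raise ValueError(
            f"{type(model).__name__} does not support Transformers-style "
            "LoRA loading (no 'dtype' attribute)."
        )
    return {"dtype": dtype}
```

This would turn the crash into an explicit message that the loaded backend does not support this LoRA path.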
System Info