stavsap / comfyui-ollama

Apache License 2.0

AttributeError: 'Model' object has no attribute 'name' #70

Closed jerzysobski closed 1 week ago

jerzysobski commented 1 week ago

When using any of the Ollama nodes, they don't pick up the models I have and instead throw an error. I have gone through several installs trying to get this to work, none of them successful: a fresh install of the portable version, and fresh installs under a Conda environment (several times, including with different Python versions in case it was a Python issue). None of these methods have worked, and I'm currently at a loss for what to try next. I have no issues using Ollama with many other tools; it just seems to be a problem with this set of nodes. Below is the error I get every time I run these nodes.

HTTP Request: GET http://127.0.0.1:11434/api/tags "HTTP/1.1 200 OK"
Error handling request
Traceback (most recent call last):
  File "C:\Users\jerzy\miniconda3\envs\comfyui_env\Lib\site-packages\aiohttp\web_protocol.py", line 478, in _handle_request
    resp = await request_handler(request)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\jerzy\miniconda3\envs\comfyui_env\Lib\site-packages\aiohttp\web_app.py", line 567, in _handle
    return await handler(request)
           ^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\jerzy\miniconda3\envs\comfyui_env\Lib\site-packages\aiohttp\web_middlewares.py", line 117, in impl
    return await handler(request)
           ^^^^^^^^^^^^^^^^^^^^^^
  File "D:\AI\ComfyUI\server.py", line 63, in cache_control
    response: web.Response = await handler(request)
                             ^^^^^^^^^^^^^^^^^^^^^^
  File "D:\AI\ComfyUI\server.py", line 141, in origin_only_middleware
    response = await handler(request)
               ^^^^^^^^^^^^^^^^^^^^^^
  File "D:\AI\ComfyUI\server.py", line 75, in cors_middleware
    response = await handler(request)
               ^^^^^^^^^^^^^^^^^^^^^^
  File "D:\AI\ComfyUI\custom_nodes\comfyui-ollama\CompfyuiOllama.py", line 19, in get_models_endpoint
    models = [model['name'] for model in client.list().get('models', [])]
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\AI\ComfyUI\custom_nodes\comfyui-ollama\CompfyuiOllama.py", line 19, in <listcomp>
    models = [model['name'] for model in client.list().get('models', [])]
  File "C:\Users\jerzy\miniconda3\envs\comfyui_env\Lib\site-packages\ollama\_types.py", line 20, in __getitem__
    return getattr(self, key)
           ^^^^^^^^^^^^^^^^^^
  File "C:\Users\jerzy\miniconda3\envs\comfyui_env\Lib\site-packages\pydantic\main.py", line 856, in __getattr__
    raise AttributeError(f'{type(self).__name__!r} object has no attribute {item!r}')
AttributeError: 'Model' object has no attribute 'name'
jerzysobski commented 1 week ago

The first message shows the error with ComfyUI running under a Conda environment. Below is ComfyUI running as the portable version (a different, fresh install).

To see the GUI go to: http://127.0.0.1:8188
FETCH DATA from: D:\AI\ComfyUI_SA\ComfyUI\custom_nodes\ComfyUI-Manager\extension-node-map.json [DONE]
HTTP Request: GET http://127.0.0.1:11434/api/tags "HTTP/1.1 200 OK"
Error handling request
Traceback (most recent call last):
  File "D:\AI\ComfyUI_SA\python_embeded\Lib\site-packages\aiohttp\web_protocol.py", line 478, in _handle_request
    resp = await request_handler(request)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\AI\ComfyUI_SA\python_embeded\Lib\site-packages\aiohttp\web_app.py", line 567, in _handle
    return await handler(request)
           ^^^^^^^^^^^^^^^^^^^^^^
  File "D:\AI\ComfyUI_SA\python_embeded\Lib\site-packages\aiohttp\web_middlewares.py", line 117, in impl
    return await handler(request)
           ^^^^^^^^^^^^^^^^^^^^^^
  File "D:\AI\ComfyUI_SA\ComfyUI\server.py", line 63, in cache_control
    response: web.Response = await handler(request)
                             ^^^^^^^^^^^^^^^^^^^^^^
  File "D:\AI\ComfyUI_SA\ComfyUI\server.py", line 141, in origin_only_middleware
    response = await handler(request)
               ^^^^^^^^^^^^^^^^^^^^^^
  File "D:\AI\ComfyUI_SA\ComfyUI\server.py", line 75, in cors_middleware
    response = await handler(request)
               ^^^^^^^^^^^^^^^^^^^^^^
  File "D:\AI\ComfyUI_SA\ComfyUI\custom_nodes\comfyui-ollama\CompfyuiOllama.py", line 19, in get_models_endpoint
    models = [model['name'] for model in client.list().get('models', [])]
  File "D:\AI\ComfyUI_SA\python_embeded\Lib\site-packages\ollama\_types.py", line 20, in __getitem__
    return getattr(self, key)
           ^^^^^^^^^^^^^^^^^^
  File "D:\AI\ComfyUI_SA\python_embeded\Lib\site-packages\pydantic\main.py", line 896, in __getattr__
    raise AttributeError(f'{type(self).__name__!r} object has no attribute {item!r}')
AttributeError: 'Model' object has no attribute 'name'

![2024-11-22_12-45-05](https://github.com/user-attachments/assets/b35a8d06-7362-40a6-b0b6-75d72ef58ea9)
![2024-11-22_12-45-40](https://github.com/user-attachments/assets/2b8bd8b8-456c-4466-b6d6-c5eaf7c9f085)
YANG-Haruka commented 1 week ago

It seems that the latest ollama-python removed the "name" attribute. Replace model['name'] with model['model'].
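
If you want the node to keep working across ollama-python versions, a small fallback helper is possible. This is only a sketch, not code from this repo: `model_name` is a hypothetical helper that tries the new "model" key first and falls back to the old "name" key.

```python
def model_name(entry):
    """Return a model's name from an ollama list() entry.

    Older ollama-python returned plain dicts keyed by "name"; newer
    versions return pydantic Model objects whose field is "model".
    Indexing a Model with a missing key raises AttributeError (as the
    tracebacks above show), so we catch both exception types.
    """
    for key in ("model", "name"):
        try:
            value = entry[key]
        except (KeyError, AttributeError):
            continue
        if value:
            return value
    return None
```

With this helper, line 19 could read `models = [model_name(m) for m in client.list().get('models', [])]` on either library version.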

tlauts commented 1 week ago

> It seems that the latest ollama-python removed the "name" attribute. Replace model['name'] with model['model'].

I can verify that this change worked for me. Thanks!

Dartis2 commented 1 week ago

I'm having the same issue: the model in the Ollama generate node is listed as 'undefined', despite Ollama working fine on my Windows machine. How exactly do I replace model['name'] with model['model']? I don't even know where to begin with this.

tlauts commented 1 week ago

Edit the file ...\ComfyUI\custom_nodes\comfyui-ollama\CompfyuiOllama.py, then restart ComfyUI.

I'm not near my computer to give you the line number, but just search for " model['name'] " and change it to model['model'].
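
For reference, the change being discussed is the single list comprehension visible in the tracebacks above. A stand-alone illustration with a stubbed client (`StubClient` is invented here so the snippet runs without an Ollama server; in the real file the object is an ollama.Client):

```python
# Stub standing in for ollama.Client, returning an entry shaped like
# the newer ollama-python output (model name under the "model" key).
class StubClient:
    def list(self):
        return {"models": [{"model": "llama3.2:latest"}]}

client = StubClient()

# before (breaks on newer ollama-python):
#   models = [model['name'] for model in client.list().get('models', [])]
# after (the fix from this thread):
models = [model['model'] for model in client.list().get('models', [])]
print(models)  # ['llama3.2:latest']
```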

Dartis2 commented 1 week ago

Thank you. For both the fix and the quick reply. You fixed a big headache. I'll be sure to name my first born after you.

jerzysobski commented 1 week ago

Thanks for the reply. I'll make those changes when I have a chance to get back to my computer.

stavsap commented 1 week ago

What ollama version are you experiencing this issue with? I have ollama v0.4.3, and I don't have any issues.

tlauts commented 1 week ago

> What ollama version are you experiencing this issue with? I have ollama v0.4.3, and I don't have any issues.

Same, my ollama version is 0.4.3 (Windows). Greenfield deployment.

mofoni commented 1 week ago

> What ollama version are you experiencing this issue with? I have ollama v0.4.3, and I don't have any issues.

I am running ollama version 0.4.4 (standard Windows installation, 3 days old) and I have the issue too.

The error reported in the UI is the following:

OllamaGenerateAdvance
1 validation error for GenerateRequest
model
String should have at least 1 character [type=string_too_short, input_value='', input_type=str]
For further information visit https://errors.pydantic.dev/2.9/v/string_too_short

In the console I get this:

!!! Exception during processing !!! 1 validation error for GenerateRequest
model
  String should have at least 1 character [type=string_too_short, input_value='', input_type=str]
    For further information visit https://errors.pydantic.dev/2.9/v/string_too_short
Traceback (most recent call last):
  File "G:\ComfyUI\Installation\ComfyUI_windows_portable\ComfyUI\execution.py", line 323, in execute
    output_data, output_ui, has_subgraph = get_output_data(obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
  File "G:\ComfyUI\Installation\ComfyUI_windows_portable\ComfyUI\execution.py", line 198, in get_output_data
    return_values = _map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
  File "G:\ComfyUI\Installation\ComfyUI_windows_portable\ComfyUI\execution.py", line 169, in _map_node_over_list
    process_inputs(input_dict, i)
  File "G:\ComfyUI\Installation\ComfyUI_windows_portable\ComfyUI\execution.py", line 158, in process_inputs
    results.append(getattr(obj, func)(**inputs))
  File "G:\ComfyUI\Installation\ComfyUI_windows_portable\ComfyUI\custom_nodes\comfyui-ollama\CompfyuiOllama.py", line 240, in ollama_generate_advance
    response = client.generate(model=model, system=system, prompt=prompt, context=context, options=options, keep_alive=str(keep_alive) + "m", format=format)
  File "G:\ComfyUI\Installation\ComfyUI_windows_portable\ComfyUI\venv\lib\site-packages\ollama\_client.py", line 243, in generate
    json=GenerateRequest(
  File "G:\ComfyUI\Installation\ComfyUI_windows_portable\ComfyUI\venv\lib\site-packages\pydantic\main.py", line 212, in __init__
    validated_self = self.__pydantic_validator__.validate_python(data, self_instance=self)
pydantic_core._pydantic_core.ValidationError: 1 validation error for GenerateRequest
model
  String should have at least 1 character [type=string_too_short, input_value='', input_type=str]
    For further information visit https://errors.pydantic.dev/2.9/v/string_too_short

Editing line 19 in CompfyuiOllama.py from

  models = [model['name'] for model in client.list().get('models', [])]

to

  models = [model['model'] for model in client.list().get('models', [])]

and restarting ComfyUI resolves the issue.

stavsap commented 1 week ago

checking

stavsap commented 1 week ago

Submitted a fix for it, please update and try.

mofoni commented 1 week ago

> Submitted a fix for it, please update and try.

It is working fine now, thanks for the quick fix.