lks-ai / anynode

A Node for ComfyUI that does what you ask it to do
MIT License

ValueError: Expected metadata value to be a str, int, float or bool, got None which is a NoneType #44

Open XJF2332 opened 2 weeks ago

XJF2332 commented 2 weeks ago
got prompt
WARNING: AnyNode.IS_CHANGED() got an unexpected keyword argument 'prompt'

RUN-1 01-ai_Yi-1.5-9B-Chat-16K-8_0bpw_exl2 Take the input and multiply by 5 1 2

Finding Nodes in Workspace. kwargs: {'id': '1'}
{'prompt': 'Take the input and multiply by 5', 'function': 'An error occurred: API request failed with status code 500: Internal Server Error', 'imports': '', 'comment': None, 'input_types': "Type: <class 'int'>\nType: <class 'int'>", 'version': '0.1.2'}
!!! Exception during processing !!! Expected metadata value to be a str, int, float or bool, got None which is a NoneType
Traceback (most recent call last):
  File "C:\AI\StableDiffusion\Stable Diffusion ComfyUI Aki V1.2\execution.py", line 317, in execute
    output_data, output_ui, has_subgraph = get_output_data(obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
  File "C:\AI\StableDiffusion\Stable Diffusion ComfyUI Aki V1.2\execution.py", line 192, in get_output_data
    return_values = _map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
  File "C:\AI\StableDiffusion\Stable Diffusion ComfyUI Aki V1.2\execution.py", line 169, in _map_node_over_list
    process_inputs(input_dict, i)
  File "C:\AI\StableDiffusion\Stable Diffusion ComfyUI Aki V1.2\execution.py", line 158, in process_inputs
    results.append(getattr(obj, func)(**inputs))
  File "C:\AI\StableDiffusion\Stable Diffusion ComfyUI Aki V1.2\custom_nodes\anynode\nodes\any.py", line 414, in go
    self.last_hash = registry.add_function(prompt, self.script, self.imports, self.last_comment, [variable_info(any), variable_info(any2)])
  File "C:\AI\StableDiffusion\Stable Diffusion ComfyUI Aki V1.2\custom_nodes\anynode\nodes\util_functions.py", line 57, in add_function
    self.add_function_to_chromadb(prompt_hash, prompt, function_code, imports, comment, input_types)
  File "C:\AI\StableDiffusion\Stable Diffusion ComfyUI Aki V1.2\custom_nodes\anynode\nodes\util_functions.py", line 86, in add_function_to_chromadb
    collection.add(
  File "C:\AI\StableDiffusion\Stable Diffusion ComfyUI Aki V1.2\python\lib\site-packages\chromadb\api\models\Collection.py", line 80, in add
    ) = self._validate_and_prepare_embedding_set(
  File "C:\AI\StableDiffusion\Stable Diffusion ComfyUI Aki V1.2\python\lib\site-packages\chromadb\api\models\CollectionCommon.py", line 271, in _validate_and_prepare_embedding_set
    ) = self._validate_embedding_set(
  File "C:\AI\StableDiffusion\Stable Diffusion ComfyUI Aki V1.2\python\lib\site-packages\chromadb\api\models\CollectionCommon.py", line 182, in _validate_embedding_set
    validate_metadatas(maybe_cast_one_to_many_metadata(metadatas))
  File "C:\AI\StableDiffusion\Stable Diffusion ComfyUI Aki V1.2\python\lib\site-packages\chromadb\api\types.py", line 336, in validate_metadatas
    validate_metadata(metadata)
  File "C:\AI\StableDiffusion\Stable Diffusion ComfyUI Aki V1.2\python\lib\site-packages\chromadb\api\types.py", line 302, in validate_metadata
    raise ValueError(
ValueError: Expected metadata value to be a str, int, float or bool, got None which is a NoneType

Prompt executed in 0.01 seconds
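Looking at the dict printed above, the LLM request itself failed ('function' is the string 'An error occurred: API request failed with status code 500: Internal Server Error') and 'comment' stayed None, which chromadb then rejects because metadata values must be str, int, float or bool. A minimal sketch of the constraint and a possible workaround follows; this is not the actual anynode code, just an illustration of what validate_metadata checks, and the collection name and id are placeholders.

import chromadb

client = chromadb.Client()
collection = client.get_or_create_collection("anynode_functions")  # hypothetical collection name

# Metadata roughly as printed in the log above; the None value is what trips validate_metadata().
metadata = {
    "prompt": "Take the input and multiply by 5",
    "function": "An error occurred: API request failed with status code 500: Internal Server Error",
    "imports": "",
    "comment": None,
    "input_types": "Type: <class 'int'>\nType: <class 'int'>",
    "version": "0.1.2",
}

# Dropping (or stringifying) None entries avoids the ValueError.
clean_metadata = {k: v for k, v in metadata.items() if v is not None}

collection.add(
    ids=["prompt-hash-placeholder"],       # placeholder; anynode derives an id from the prompt
    documents=[metadata["prompt"]],
    metadatas=[clean_metadata],
)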

And this is my workflow. It is a simple one created for testing. [image]

The local LLM was loaded. Please ignore the last error; it was not caused by AnyNode. [image]

The API URL was right, and there was no API key. [image]

Dibucci commented 1 week ago

I'm getting this same issue, only I'm using OpenAI. Everything else is pretty much the same: my API key is in the .env file, but no matter what I put into the node, I get the same error output as you.
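If it helps, here is a quick way to check that the key is actually being loaded from the .env file. This is a rough sketch only; it assumes the standard OPENAI_API_KEY variable and python-dotenv, which may differ from how the node reads it.

import os
from dotenv import load_dotenv

load_dotenv()  # reads the .env file from the current working directory
print("OPENAI_API_KEY loaded:", bool(os.getenv("OPENAI_API_KEY")))

If this prints False when run from the ComfyUI directory, then (under that assumption) the node never sees the key and the upstream request would fail before any function is generated.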

fengzeyuchen commented 3 days ago

I also encountered the same error, and the server rejected the request. I tried replacing localhost with 127.0.0.1, and also tried adding /v1 at the end, but neither helped. The LLM service itself is fine; I tested it in its own application and it returns responses normally. In addition, the local LLM node cannot accept image or video input, so it cannot directly process pictures as demonstrated. I have been testing this for a long time, but unfortunately without success. If there is any progress, please remember to @ me, thank you.
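One way to probe the endpoint outside of ComfyUI is a direct request like the sketch below. This is only an illustration, assuming an OpenAI-compatible /v1/chat/completions route; the port and model name are placeholders, not values taken from this thread.

import requests

resp = requests.post(
    "http://127.0.0.1:5000/v1/chat/completions",  # placeholder port; replace with your server's
    json={
        "model": "local-model",  # placeholder model name
        "messages": [{"role": "user", "content": "Take the input and multiply by 5"}],
    },
    timeout=30,
)
print(resp.status_code)  # a 500 here matches the 'API request failed with status code 500' in the node's log
print(resp.text[:200])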