kijai / ComfyUI-Florence2

Inference Microsoft Florence2 VLM
MIT License

[Error] Florence2Run node return "tuple index out of range" #83

Closed — koheejs closed this issue 1 month ago

koheejs commented 1 month ago

(screenshots attached)

Error Details

Node Type: Florence2Run
Exception Type: IndexError
Exception Message: tuple index out of range

System Information

ComfyUI Version: v0.2.2-93-g8dfa0cc
Arguments: main.py
OS: posix
Python Version: 3.10.15 | packaged by conda-forge | (main, Sep 30 2024, 17:48:38) [Clang 17.0.6]
Embedded Python: false
PyTorch Version: 2.3.1

Devices

Name: mps
Type: mps
VRAM Total: 34359738368
VRAM Free: 13951074304
Torch VRAM Total: 34359738368
Torch VRAM Free: 13951074304

Traceback

Traceback (most recent call last):
  File "root/app/execution.py", line 323, in execute
    output_data, output_ui, has_subgraph = get_output_data(obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
  File "root/app/execution.py", line 198, in get_output_data
    return_values = _map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
  File "root/app/execution.py", line 169, in _map_node_over_list
    process_inputs(input_dict, i)
  File "root/app/execution.py", line 158, in process_inputs
    results.append(getattr(obj, func)(**inputs))
  File "root/app/custom_nodes/ComfyUI-Florence2/nodes.py", line 296, in encode
    generated_ids = model.generate(
  File "root/app/env/lib/python3.10/site-packages/peft/peft_model.py", line 1704, in generate
    outputs = self.base_model.generate(*args, **kwargs)
  File "root/cache/HF_HOME/modules/transformers_modules/Florence-2-base/modeling_florence2.py", line 2796, in generate
    return self.language_model.generate(
  File "root/app/env/lib/python3.10/site-packages/torch/utils/_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
  File "root/app/env/lib/python3.10/site-packages/transformers/generation/utils.py", line 1829, in generate
    self._prepare_special_tokens(generation_config, kwargs_has_attention_mask, device=device)
  File "root/app/env/lib/python3.10/site-packages/transformers/generation/utils.py", line 1678, in _prepare_special_tokens
    and isin_mps_friendly(elements=eos_token_tensor, test_elements=pad_token_tensor).any()
  File "root/app/env/lib/python3.10/site-packages/transformers/pytorch_utils.py", line 325, in isin_mps_friendly
    return elements.tile(test_elements.shape[0], 1).eq(test_elements.unsqueeze(1)).sum(dim=0).bool().squeeze()
IndexError: tuple index out of range
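The failing line in `isin_mps_friendly` indexes `test_elements.shape[0]`, which raises `IndexError` whenever the tensor passed as `test_elements` (here the pad token tensor) is 0-dimensional, because a scalar tensor's shape is the empty tuple. A minimal sketch of the mechanism, using a plain tuple as a stand-in for `torch.Size([])` so it runs without PyTorch installed:

```python
# A 0-dim (scalar) tensor has shape torch.Size([]), i.e. an empty tuple.
# Indexing it, as isin_mps_friendly does with test_elements.shape[0],
# reproduces the "tuple index out of range" error from the traceback.
scalar_shape = ()  # stand-in for the shape of a 0-dim pad_token_tensor

try:
    scalar_shape[0]
except IndexError as e:
    print(e)  # prints: tuple index out of range
```

This is only an illustration of the error mechanism, not the fix; the actual resolution lives in the upstream transformers issue referenced below.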
koheejs commented 1 month ago

This issue is related to https://github.com/huggingface/transformers/issues/33786