Error occurred when executing BNK_CutoffRegionsToConditioning_ADV:
'SD1Tokenizer' object has no attribute 'tokenizer'
File "D:\test/ComfyUI\execution.py", line 153, in recursive_execute
output_data, output_ui = get_output_data(obj, input_data_all)
File "D:\test/ComfyUI\execution.py", line 83, in get_output_data
return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True)
File "D:\test/ComfyUI\execution.py", line 76, in map_node_over_list
results.append(getattr(obj, func)(**slice_dict(input_data_all, i)))
File "D:\test\ComfyUI\custom_nodes\ComfyUI_Cutoff\cutoff.py", line 300, in finalize
return finalize_clip_regions(clip_regions, mask_token, strict_mask, start_from_masked, token_normalization, weight_interpretation)
File "D:\test\ComfyUI\custom_nodes\ComfyUI_Cutoff\cutoff.py", line 234, in finalize_clip_regions
mask_token = tokenizer.tokenizer(mask_token)['input_ids'][1:-1]
Any progress on this? I've been curious what this feature does, but I don't know which older version of ComfyUI and/or its Python dependencies I'd need for it to work again.