ttulttul / ComfyUI-Iterative-Mixer

Nodes that implement iterative mixing of samples to help with upscaling quality
GNU General Public License v3.0

Error during initialization #16

Closed larsupb closed 9 months ago

larsupb commented 9 months ago

First, thanks for your efforts; much appreciated.

It seems the latest changes cause errors with my torch version (2.0.1). It looks like the @torch.no_grad decorator is what causes the error.

Traceback (most recent call last):
  File "/home/lars/SD/Apps/ComfyUI/nodes.py", line 1810, in load_custom_node
    module_spec.loader.exec_module(module)
  File "<frozen importlib._bootstrap_external>", line 883, in exec_module
  File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
  File "/home/lars/SD/Apps/ComfyUI/custom_nodes/ComfyUI-Iterative-Mixer/__init__.py", line 1, in <module>
    from .nodes import NODE_CLASS_MAPPINGS, NODE_DISPLAY_NAME_MAPPINGS
  File "/home/lars/SD/Apps/ComfyUI/custom_nodes/ComfyUI-Iterative-Mixer/nodes.py", line 22, in <module>
    def calc_sigmas(model, sampler_name, scheduler, steps, start_at_step, end_at_step):
TypeError: no_grad.__init__() takes 1 positional argument but 2 were given

Cannot import /home/lars/SD/Apps/ComfyUI/custom_nodes/ComfyUI-Iterative-Mixer module for custom nodes: no_grad.__init__() takes 1 positional argument but 2 were given
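
For anyone trying to reproduce this outside ComfyUI, here is a minimal sketch, assuming torch 2.0.1, of the bare-decorator usage that triggers the same TypeError (the function signature is taken from the traceback above; the body is elided):

```python
import torch  # torch 2.0.1

# Decorating with the bare class hands the function to no_grad.__init__,
# which accepts only `self` on this torch version, so the import fails:
@torch.no_grad
def calc_sigmas(model, sampler_name, scheduler, steps, start_at_step, end_at_step):
    ...
# TypeError: no_grad.__init__() takes 1 positional argument but 2 were given
```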
nikolaiusa commented 9 months ago

+1

kulius7 commented 9 months ago

I also had this exact issue and got it to import by changing all occurrences of @torch.no_grad to @torch.no_grad(). There should be 4 .py files to modify.
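
For reference, the change looks like this (a minimal sketch; the real function bodies live in the repository's .py files):

```python
import torch

# The parentheses instantiate the no_grad context manager; the resulting
# instance then wraps the decorated function.
@torch.no_grad()
def calc_sigmas(model, sampler_name, scheduler, steps, start_at_step, end_at_step):
    ...
```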

ttulttul commented 9 months ago

Thanks! I mistook no_grad for a decorator function when in fact it is a class, so you have to instantiate it when using it as a decorator. I will patch this up.

ttulttul commented 9 months ago

In Python, when a decorator is a plain function, like @some_decorator, you do not need parentheses. However, when the decorator is a class, or a function that returns a callable, you need parentheses, even if there are no arguments to pass.

In the case of @torch.no_grad(), the parentheses are required because torch.no_grad is not a simple function: it is a class that creates a context manager object when called. That object temporarily disables gradient calculation when its __enter__ method is called and restores the original state when its __exit__ method is called.

So the parentheses instantiate the torch.no_grad class, and it is the resulting instance that acts as the decorator. That is why you write @torch.no_grad() rather than @torch.no_grad.
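
To illustrate the difference, here is a small self-contained sketch (the log_call helper and the function names are hypothetical, for demonstration only):

```python
import torch

# A plain function decorator works bare: @log_call receives the function directly.
def log_call(fn):
    def wrapper(*args, **kwargs):
        print(f"calling {fn.__name__}")
        return fn(*args, **kwargs)
    return wrapper

@log_call          # no parentheses: desugars to double = log_call(double)
def double(x):
    return x * 2

# torch.no_grad is a class, so the parentheses instantiate it first;
# the instance's __call__ then wraps the function.
@torch.no_grad()   # desugars to: double_no_grad = torch.no_grad()(double_no_grad)
def double_no_grad(x):
    return x * 2

# The same object also works as a context manager via __enter__/__exit__:
x = torch.ones(3, requires_grad=True)
with torch.no_grad():
    y = x * 2
assert not y.requires_grad                 # gradients disabled inside the block
assert not double_no_grad(x).requires_grad # and inside the decorated function
```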

ttulttul commented 9 months ago

Fixed with the latest commit.