open-mmlab / mmagic

OpenMMLab Multimodal Advanced, Generative, and Intelligent Creation Toolbox. Unlock the magic 🪄: Generative AI (AIGC), easy-to-use APIs, an awesome model zoo, and diffusion models for text-to-image generation, image/video restoration/enhancement, etc.
https://mmagic.readthedocs.io/en/latest/
Apache License 2.0

[Bug] CycleGAN: UnboundLocalError: local variable 'init_info' referenced before assignment #2109

Open xvjiawen opened 10 months ago

xvjiawen commented 10 months ago

Prerequisite

Task

I have modified the scripts/configs, or I'm working on my own tasks/models/datasets.

Branch

main branch https://github.com/open-mmlab/mmagic

Environment

System environment:
    sys.platform: linux
    Python: 3.8.17 (default, Jul 5 2023, 21:04:15) [GCC 11.2.0]
    CUDA available: True
    numpy_random_seed: 2022
    GPU 0,1,2,3,4,5,6,7,8,9: NVIDIA GeForce RTX 2080 Ti
    CUDA_HOME: /usr/local/cuda-11.7
    NVCC: Cuda compilation tools, release 11.7, V11.7.99
    GCC: gcc (Ubuntu 11.3.0-1ubuntu1~22.04.1) 11.3.0
    PyTorch: 2.0.1+cu117
    PyTorch compiling details: PyTorch built with:

Runtime environment:
    cudnn_benchmark: True
    mp_cfg: {'mp_start_method': 'fork', 'opencv_num_threads': 0}
    dist_cfg: {'backend': 'nccl'}
    seed: 2022
    diff_rank_seed: True
    Distributed launcher: none
    Distributed training: False
    GPU number: 1

Reproduces the problem - code sample

# Excerpt from mmagic/models/utils/model_utils.py (generation_init_weights):
# 'init_func' is a nested function, so 'init_type', 'init_gain', and 'module'
# are closure variables of the enclosing function.
def init_func(m):
    """Initialization function.

    Args:
        m (nn.Module): Module to be initialized.
    """
    classname = m.__class__.__name__
    if hasattr(m, 'weight') and (classname.find('Conv') != -1
                                 or classname.find('Linear') != -1):
        if init_type == 'normal':
            normal_init(m, 0.0, init_gain)
        elif init_type == 'xavier':
            xavier_init(m, gain=init_gain, distribution='normal')
        elif init_type == 'kaiming':
            kaiming_init(
                m,
                a=0,
                mode='fan_in',
                nonlinearity='leaky_relu',
                distribution='normal')
        elif init_type == 'orthogonal':
            init.orthogonal_(m.weight, gain=init_gain)
            init.constant_(m.bias.data, 0.0)
        else:
            raise NotImplementedError(
                f"Initialization method '{init_type}' is not implemented")
        init_info = (f'Initialize {m.__class__.__name__} by \'init_type\' '
                     f'{init_type}.')
    elif classname.find('BatchNorm2d') != -1:
        # BatchNorm Layer's weight is not a matrix;
        # only normal distribution applies.
        normal_init(m, 1.0, init_gain)
        init_info = (f'{m.__class__.__name__} is BatchNorm2d, initialize '
                     'by Norm initialization with mean=1, '
                     f'std={init_gain}')

    # NOTE: if neither branch above matched (e.g. for an InstanceNorm2d
    # layer), 'init_info' was never assigned, and the call below raises
    # UnboundLocalError.
    if hasattr(m, '_params_init_info'):
        update_init_info(module, init_info)

module.apply(init_func)

Reproduces the problem - command or script

./tools/dist_train.sh ./configs/cyclegan/cyclegan_lsgan-id0-resnet-in_1xb1-270kiters_hk2flir.py 8 --work-dir ./work_dirs/demo
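The distributed launcher is not required to trigger the bug: the crash happens in Runner._init_model_weights(), before any training step. A single-process invocation of the underlying script (same config, hypothetical work dir) should reproduce it as well:

python ./tools/train.py ./configs/cyclegan/cyclegan_lsgan-id0-resnet-in_1xb1-270kiters_hk2flir.py --work-dir ./work_dirs/demo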

Reproduces the problem - error message

Traceback (most recent call last):
  File "/data/xjw/share/projects/mmagic/tools/train.py", line 114, in <module>
    main()
  File "/data/xjw/share/projects/mmagic/tools/train.py", line 107, in main
    runner.train()
  File "/root/miniconda3/lib/python3.8/site-packages/mmengine/runner/runner.py", line 1723, in train
    self._init_model_weights()
  File "/root/miniconda3/lib/python3.8/site-packages/mmengine/runner/runner.py", line 906, in _init_model_weights
    model.init_weights()
  File "/data/xjw/share/projects/mmagic/mmagic/models/base_models/base_translation_model.py", line 110, in init_weights
    gen.init_weights()
  File "/data/xjw/share/projects/mmagic/mmagic/models/editors/cyclegan/cyclegan_generator.py", line 139, in init_weights
    generation_init_weights(
  File "/data/xjw/share/projects/mmagic/mmagic/models/utils/model_utils.py", line 146, in generation_init_weights
    module.apply(init_func)
  File "/root/miniconda3/lib/python3.8/site-packages/torch/nn/modules/module.py", line 885, in apply
    fn(self)
  File "/data/xjw/share/projects/mmagic/mmagic/models/utils/model_utils.py", line 144, in init_func
    update_init_info(module, init_info)
UnboundLocalError: local variable 'init_info' referenced before assignment

Additional information

I ONLY modified the dataset; its format is the same as horse2zebra.

zhangliukun commented 9 months ago

I have the same error. It seems that for some layers neither the if nor the elif branch runs, so init_info is never assigned, which leads to the problem. The sketch below illustrates it.
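That diagnosis matches the traceback: the CycleGAN ResNet generator uses InstanceNorm2d (the "-in" in the config name), whose class name matches neither the Conv/Linear branch nor the BatchNorm2d branch, so init_info is never assigned before it is read. A minimal sketch of the failure mode, independent of mmagic (the unconditional print stands in for the guarded update_init_info call):

import torch.nn as nn

def init_func(m):
    classname = m.__class__.__name__
    if hasattr(m, 'weight') and (classname.find('Conv') != -1
                                 or classname.find('Linear') != -1):
        init_info = f'Initialize {classname} by normal init.'
    elif classname.find('BatchNorm2d') != -1:
        init_info = f'Initialize {classname} by norm init.'
    # InstanceNorm2d falls through both branches, leaving 'init_info' unbound.
    print(init_info)

net = nn.Sequential(nn.Conv2d(3, 8, 3), nn.InstanceNorm2d(8))
net.apply(init_func)  # raises UnboundLocalError on the InstanceNorm2d module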

fsbarros98 commented 8 months ago

Any updates on this? The same setup seems to work with the MMGeneration repository...

TolgaOzdmir commented 2 months ago

I ran into the same error and tried a quick fix by changing the code below, but I know it's probably not the right solution.

def init_func(m):
    """Initialization function.

    Args:
        m (nn.Module): Module to be initialized.
    """
    classname = m.__class__.__name__
    init_info = ""  # quick fix: make sure init_info is always bound
    if hasattr(m, 'weight') and (classname.find('Conv') != -1
                                 or classname.find('Linear') != -1):
        if init_type == 'normal':
            normal_init(m, 0.0, init_gain)
        elif init_type == 'xavier':
            xavier_init(m, gain=init_gain, distribution='normal')
        elif init_type == 'kaiming':
            kaiming_init(
                m,
                a=0,
                mode='fan_in',
                nonlinearity='leaky_relu',
                distribution='normal')
        elif init_type == 'orthogonal':
            init.orthogonal_(m.weight, gain=init_gain)
            init.constant_(m.bias.data, 0.0)
        else:
            raise NotImplementedError(
                f"Initialization method '{init_type}' is not implemented")
        init_info = (f'Initialize {m.__class__.__name__} by \'init_type\' '
                     f'{init_type}.')
    elif classname.find('BatchNorm2d') != -1:
        # BatchNorm Layer's weight is not a matrix;
        # only normal distribution applies.
        normal_init(m, 1.0, init_gain)
        init_info = (f'{m.__class__.__name__} is BatchNorm2d, initialize '
                     'by Norm initialization with mean=1, '
                     f'std={init_gain}')

    if hasattr(m, '_params_init_info'):
        update_init_info(module, init_info)

module.apply(init_func)
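The empty-string default above avoids the crash, but it also records an empty message for every layer the loop never touches. A slightly tighter variant (an untested sketch of the same nested init_func, relying on the same closure variables init_type, init_gain, and module as the original; the per-type initialization calls are elided) skips the logging call entirely when no branch ran:

def init_func(m):
    """Initialization function."""
    classname = m.__class__.__name__
    init_info = None  # stays None when no branch below matches
    if hasattr(m, 'weight') and (classname.find('Conv') != -1
                                 or classname.find('Linear') != -1):
        # ... same init_type dispatch as in the original code ...
        init_info = (f'Initialize {m.__class__.__name__} by \'init_type\' '
                     f'{init_type}.')
    elif classname.find('BatchNorm2d') != -1:
        normal_init(m, 1.0, init_gain)
        init_info = (f'{m.__class__.__name__} is BatchNorm2d, initialize '
                     'by Norm initialization with mean=1, '
                     f'std={init_gain}')

    # Record init info only when a branch actually initialized the module.
    if init_info is not None and hasattr(m, '_params_init_info'):
        update_init_info(module, init_info)

module.apply(init_func)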