ajbrock / BigGAN-PyTorch

The author's officially unofficial PyTorch BigGAN implementation.
MIT License

Undefined name 'self' in layers.py #3

Closed (cclauss closed 5 years ago)

cclauss commented 5 years ago

flake8 testing of https://github.com/ajbrock/BigGAN-PyTorch on Python 3.7.1

$ flake8 . --count --select=E9,F63,F72,F82 --show-source --statistics

./layers.py:261:14: F821 undefined name 'self'
  if 'ch' in self.norm_style:
             ^
./layers.py:262:14: F821 undefined name 'self'
    ch = int(self.norm_style.split('_')[-1])
             ^
./layers.py:265:17: F821 undefined name 'self'
  elif 'grp' in self.norm_style:
                ^
./layers.py:266:18: F821 undefined name 'self'
    groups = int(self.norm_style.split('_')[-1])
                 ^
./utils.py:1005:35: F632 use ==/!= to compare str, bytes, and int literals
  'Gattn%s' % config['G_attn'] if config['G_attn'] is not '0' else None,
                                  ^
./utils.py:1006:35: F632 use ==/!= to compare str, bytes, and int literals
  'Dattn%s' % config['D_attn'] if config['D_attn'] is not '0' else None,
                                  ^
./train_fns.py:165:28: F821 undefined name 'z_'
                           z_, y_, config['n_classes'],
                           ^
./train_fns.py:165:32: F821 undefined name 'y_'
                           z_, y_, config['n_classes'],
                               ^
./sync_batchnorm/batchnorm_reimpl.py:15:1: F822 undefined name 'BatchNormReimpl' in __all__
__all__ = ['BatchNormReimpl']
^
2     F632 use ==/!= to compare str, bytes, and int literals
6     F821 undefined name 'self'
1     F822 undefined name 'BatchNormReimpl' in __all__
9
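The two F632 hits above use `is not '0'` to compare strings. A minimal sketch of why flake8 flags this (the variable names here are invented for illustration, not from the repo): `is` tests object identity while `==` tests value equality, and CPython's interning of literals can make identity checks pass by accident, so the result depends on how the string was built.

```python
# F632 sketch: identity vs. equality for strings.
literal = '64'         # compile-time constant in the code object
runtime = str(64)      # same value, but built at runtime as a fresh object

print(runtime == literal)   # True: value equality, the intended check
print(runtime is literal)   # False in CPython: identity is an implementation detail
```

So `config['G_attn'] is not '0'` may happen to work for interned strings but is not guaranteed; the fix is `config['G_attn'] != '0'`.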

E901, E999, F821, F822, and F823 are the "showstopper" flake8 issues that can halt the runtime with a SyntaxError, NameError, etc. These five are different from most other flake8 issues, which are merely "style violations" -- useful for readability, but they do not affect runtime safety.
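A sketch of why F821 in particular is a showstopper (the function below is a hypothetical stand-in mirroring the `norm_style` pattern flagged in layers.py, not the repo's actual code): an undefined name only raises NameError when its line executes, so the bug can stay hidden until a rarely taken branch runs.

```python
def which_norm(norm_style='bn'):
    # Hypothetical helper: the misspelled name below is what F821 catches statically.
    if 'ch' in norm_style:
        return int(norm_styl.split('_')[-1])   # typo: 'norm_styl' is undefined
    return norm_style

print(which_norm('bn'))     # 'bn' -- the broken branch is never reached
try:
    which_norm('ch_16')     # now the branch runs and NameError fires
except NameError as e:
    print('NameError:', e)
```

Static checks like `flake8 --select=F821` surface this before any particular code path is exercised at runtime.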

ajbrock commented 5 years ago

Fixed with this push, thanks!