Lightning-AI / pytorch-lightning

Pretrain, finetune and deploy AI models on multiple GPUs, TPUs with zero code changes.
https://lightning.ai
Apache License 2.0

Remove F401 by using __all__ #11592

Open chunyang-wen opened 2 years ago

chunyang-wen commented 2 years ago

Proposed refactor

We can add __all__ to each __init__.py file in order to remove the F401 error and avoid exposing the submodules. The package namespace currently looks like the first mapping below; with __all__ it would contain only the classes, as in the second:

{
    ...,
    'accelerator': <module 'pytorch_lightning.accelerators.accelerator' from '/pytorch-lightning/pytorch_lightning/accelerators/accelerator.py'>,
    'Accelerator': <class 'pytorch_lightning.accelerators.accelerator.Accelerator'>,
    'cpu': <module 'pytorch_lightning.accelerators.cpu' from '/pytorch-lightning/pytorch_lightning/accelerators/cpu.py'>,
    'CPUAccelerator': <class 'pytorch_lightning.accelerators.cpu.CPUAccelerator'>,
    'gpu': <module 'pytorch_lightning.accelerators.gpu' from '/pytorch-lightning/pytorch_lightning/accelerators/gpu.py'>,
    'GPUAccelerator': <class 'pytorch_lightning.accelerators.gpu.GPUAccelerator'>,
    'ipu': <module 'pytorch_lightning.accelerators.ipu' from '/pytorch-lightning/pytorch_lightning/accelerators/ipu.py'>,
    'IPUAccelerator': <class 'pytorch_lightning.accelerators.ipu.IPUAccelerator'>,
    'tpu': <module 'pytorch_lightning.accelerators.tpu' from '/pytorch-lightning/pytorch_lightning/accelerators/tpu.py'>,
    'TPUAccelerator': <class 'pytorch_lightning.accelerators.tpu.TPUAccelerator'>,
}
{
    ...,
    'Accelerator': <class 'pytorch_lightning.accelerators.accelerator.Accelerator'>,
    'CPUAccelerator': <class 'pytorch_lightning.accelerators.cpu.CPUAccelerator'>,
    'GPUAccelerator': <class 'pytorch_lightning.accelerators.gpu.GPUAccelerator'>,
    'IPUAccelerator': <class 'pytorch_lightning.accelerators.ipu.IPUAccelerator'>,
    'TPUAccelerator': <class 'pytorch_lightning.accelerators.tpu.TPUAccelerator'>,
}
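The effect of __all__ on star imports can be sketched with a minimal, self-contained stand-in (the module and names here are hypothetical, not pytorch_lightning's actual code):

```python
import sys
import types

# Build a throwaway module whose __all__ lists only the classes,
# so `from ... import *` skips everything else.
mod = types.ModuleType("demo_accelerators")
exec(
    "class Accelerator: pass\n"
    "class CPUAccelerator(Accelerator): pass\n"
    "def internal_helper(): pass\n"  # hypothetical non-exported name
    "__all__ = ['Accelerator', 'CPUAccelerator']\n",
    mod.__dict__,
)
sys.modules["demo_accelerators"] = mod

ns = {}
exec("from demo_accelerators import *", ns)
assert "Accelerator" in ns and "CPUAccelerator" in ns
assert "internal_helper" not in ns  # filtered out by __all__
```

In the real package, the same idea would keep the `*Accelerator` classes in the star-import namespace while dropping the submodule names shown in the first mapping above.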

Motivation

Remove the annoying F401 error.

Pitch

No error reported by mypy or pre-commit

cc @borda @justusschock @awaelchli @akihironitta @rohitgr7

Borda commented 2 years ago

I am not sure if we want to amplify the visibility of these internal tools...

justusschock commented 2 years ago

What's internal there? They are supposed to be stable as of 1.6, serving as a user interface for power users...

chunyang-wen commented 2 years ago

__all__ does not mean you have to expose everything (you can choose what to expose yourself). Even if you do not want to expose the modules, users can still import them. What I suggest here is to use __all__ to remove the # noqa: F401 error or comment.
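The point that __all__ does not block explicit imports can be demonstrated with a small hypothetical module (names here are illustrative only):

```python
import sys
import types

# A throwaway module where __all__ omits `get_stats`: star imports
# skip it, but explicit imports of the name still work.
mod = types.ModuleType("demo_mod")
exec(
    "def get_stats(): return 42\n"
    "class Public: pass\n"
    "__all__ = ['Public']\n",
    mod.__dict__,
)
sys.modules["demo_mod"] = mod

ns = {}
exec("from demo_mod import get_stats", ns)  # explicit import still works
assert ns["get_stats"]() == 42

star = {}
exec("from demo_mod import *", star)
assert "get_stats" not in star  # but the star import omits it
```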

akihironitta commented 2 years ago

@chunyang-wen Sounds good to me!


(just for the record) I believe this is technically a breaking change for users importing all names:

from pytorch_lightning.accelerators import *
...
ananthsub commented 2 years ago

Some things like get_nvidia_gpu_stats are in the gpu accelerator module but not in the GPUAccelerator class definition. Will this break those imports?

chunyang-wen commented 2 years ago

Yes, it will break users' code if they rely on the modules automatically imported by Python:

from pytorch_lightning.accelerators import *

accelerator.blabla
cpu.blabla

For compatibility, we can also export the modules at the same time:

from pytorch_lightning.accelerators import accelerator
from pytorch_lightning.accelerators.accelerator import Accelerator

__all__ = ["accelerator", "Accelerator"]
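The compatibility scheme above can be sketched end-to-end with stand-in modules (the names `demo_pkg` and `accelerator` here are hypothetical): by listing both the submodule and the class in __all__, star-importing users of either keep working.

```python
import sys
import types

# Stand-in submodule holding the class.
sub = types.ModuleType("demo_pkg.accelerator")
exec("class Accelerator: pass", sub.__dict__)

# Stand-in package that re-exports both the submodule and the class.
pkg = types.ModuleType("demo_pkg")
pkg.accelerator = sub
pkg.Accelerator = sub.Accelerator
pkg.__all__ = ["accelerator", "Accelerator"]
sys.modules["demo_pkg"] = pkg
sys.modules["demo_pkg.accelerator"] = sub

ns = {}
exec("from demo_pkg import *", ns)
assert ns["accelerator"] is sub  # the module is still reachable
assert ns["Accelerator"] is sub.Accelerator  # and so is the class
```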
adshukla1902 commented 2 years ago

Hi, can I start working on this issue? This will be my first contribution.

justusschock commented 2 years ago

I am not sure if there actually was an agreement to do this.

Thoughts, @Lightning-AI/core-lightning?

Borda commented 2 years ago

I am fine with this addition of __all__ in each __init__.