carmocca opened 2 years ago
Looks like this would break the LightningCLI, so it will have to wait for a while.
I tried isort and pyupgrade with https://github.com/omni-us/jsonargparse/pull/294, then ran the test_cli.py tests on Python 3.8. They fail because pydantic does not backport PEP 585 and PEP 604 types.
```
ImportError while loading conftest 'tests/tests_pytorch/conftest.py'.
tests/tests_pytorch/conftest.py:29: in <module>
import lightning.fabric
src/lightning/__init__.py:20: in <module>
from lightning.app import storage # noqa: E402
src/lightning/app/__init__.py:26: in <module>
from lightning.app import components # noqa: E402, F401
src/lightning/app/components/__init__.py:3: in <module>
from lightning.app.components.database.client import DatabaseClient
src/lightning/app/components/database/__init__.py:4: in <module>
from lightning.app.components.database.server import Database
src/lightning/app/components/database/server.py:31: in <module>
from lightning.app.core.work import LightningWork
src/lightning/app/core/__init__.py:3: in <module>
from lightning.app.core.app import LightningApp
src/lightning/app/core/app.py:41: in <module>
from lightning.app.core.work import LightningWork
src/lightning/app/core/work.py:31: in <module>
from lightning.app.utilities.app_status import WorkStatus
src/lightning/app/utilities/app_status.py:23: in <module>
class WorkStatus(BaseModel):
../../../.local/lib/python3.8/site-packages/pydantic/main.py:178: in __new__
annotations = resolve_annotations(namespace.get('__annotations__', {}), namespace.get('__module__', None))
../../../.local/lib/python3.8/site-packages/pydantic/typing.py:399: in resolve_annotations
value = _eval_type(value, base_globals, None)
/usr/lib/python3.8/typing.py:270: in _eval_type
return t._evaluate(globalns, localns)
/usr/lib/python3.8/typing.py:518: in _evaluate
eval(self.__forward_code__, globalns, localns),
E TypeError: unsupported operand type(s) for |: 'type' and 'NoneType'
```
I guess this could be done for the rest of the code that does not depend on pydantic. Though it might be a much more manual process.
Yes. We could do the upgrade separately for fabric, the trainer, and apps.
There might be newer ways to upgrade the codebase. The instructions in the top post are quite old by now. I searched but didn't find anything different.
I tried to upgrade only lightning/pytorch with:

```
isort -a "from __future__ import annotations" --append-only src/lightning/pytorch
pyupgrade --py37-plus $(find src/lightning/pytorch -name '*.py')
```

Then I had to manually fix the future import in src/lightning/pytorch/loggers/neptune.py.
With this, there is a different error. Upgrading the code does not seem like a simple task.
```
ImportError while loading conftest 'tests/tests_pytorch/conftest.py'.
tests/tests_pytorch/conftest.py:27: in <module>
import lightning.fabric
src/lightning/__init__.py:27: in <module>
from lightning.pytorch.callbacks import Callback # noqa: E402
src/lightning/pytorch/__init__.py:28: in <module>
from lightning.pytorch.callbacks import Callback # noqa: E402
src/lightning/pytorch/callbacks/__init__.py:31: in <module>
from lightning.pytorch.callbacks.pruning import ModelPruning
src/lightning/pytorch/callbacks/pruning.py:30: in <module>
from lightning.pytorch.core.module import LightningModule
src/lightning/pytorch/core/__init__.py:18: in <module>
from lightning.pytorch.core.module import LightningModule
src/lightning/pytorch/core/module.py:64: in <module>
from lightning.pytorch.trainer import call
src/lightning/pytorch/trainer/__init__.py:19: in <module>
from lightning.pytorch.trainer.trainer import Trainer
src/lightning/pytorch/trainer/trainer.py:45: in <module>
from lightning.pytorch.loops import _PredictionLoop, _TrainingEpochLoop
src/lightning/pytorch/loops/__init__.py:16: in <module>
from lightning.pytorch.loops.evaluation_loop import _EvaluationLoop # noqa: F401
src/lightning/pytorch/loops/evaluation_loop.py:31: in <module>
from lightning.pytorch.loops.utilities import _no_grad_context, _select_data_fetcher, _verify_dataloader_idx_requirement
src/lightning/pytorch/loops/utilities.py:29: in <module>
from lightning.pytorch.loops import _Loop
E ImportError: cannot import name '_Loop' from partially initialized module 'lightning.pytorch.loops' (most likely due to a circular import) (src/lightning/pytorch/loops/__init__.py)
```
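One side benefit of PEP 563 is that cycles like `loops -> utilities -> loops` can usually be broken by moving the back-import behind a `typing.TYPE_CHECKING` guard: since annotations stay strings, the imported name is never needed at runtime. A self-contained sketch of that pattern (the `loopdemo` package and its module names are illustrative, not Lightning's real layout):

```python
import os
import sys
import tempfile
import textwrap

# Build a tiny two-module package with the same shape of cycle as in the
# traceback: __init__ imports loop, loop imports utilities, and utilities
# needs a name from loop -- but only for a type annotation.
root = tempfile.mkdtemp()
pkg = os.path.join(root, "loopdemo")
os.makedirs(pkg)

with open(os.path.join(pkg, "__init__.py"), "w") as f:
    f.write("from loopdemo.loop import Loop\n")

with open(os.path.join(pkg, "loop.py"), "w") as f:
    f.write(textwrap.dedent("""\
        from loopdemo import utilities  # first leg of the cycle

        class Loop:
            def run(self):
                return utilities.describe(self)
    """))

with open(os.path.join(pkg, "utilities.py"), "w") as f:
    f.write(textwrap.dedent("""\
        from __future__ import annotations
        from typing import TYPE_CHECKING

        if TYPE_CHECKING:  # back-import only seen by type checkers
            from loopdemo.loop import Loop

        def describe(loop: Loop) -> str:  # annotation stays a string
            return type(loop).__name__
    """))

sys.path.insert(0, root)
import loopdemo

print(loopdemo.Loop().run())  # -> Loop
```

Without the `TYPE_CHECKING` guard, a module-level `from loopdemo.loop import Loop` in utilities.py would fail with the same "partially initialized module" error as above.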
If you open a PR with the changes, I might be able to help with fixing the circular import
Created #17779.
Proposed refactor
We've dropped support for Python 3.6, which means we can use the new annotations format.
Motivation
New shiny things are nice
Pitch
Adopt PEP 585, PEP 604, and PEP 563.
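Concretely, the three PEPs change annotations like this (a minimal sketch; the function itself is illustrative):

```python
from __future__ import annotations  # PEP 563: annotations become strings

# Before, with typing-module generics:
#   from typing import Dict, List, Optional, Tuple
#   def summarize(xs: Optional[List[int]]) -> Dict[str, Tuple[int, int]]: ...
#
# After, with PEP 585 builtin generics and PEP 604 unions; the future
# import keeps this importable even on Python 3.7/3.8, where the new
# syntax is not valid at runtime:
def summarize(xs: list[int] | None) -> dict[str, tuple[int, int]]:
    if xs is None:
        return {}
    return {"range": (min(xs), max(xs))}


print(summarize([3, 1, 2]))  # {'range': (1, 3)}
```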
We can do this automatically with the following sequence of commands:
Some manual cleanup will be necessary, though, because isort will add the import to all files, even those that don't need the `__future__` import.
Additional context
Regarding non-quoted annotations: there is talk of replacing PEP 563 with PEP 649. I don't think it impacts our project, but depending on how the latter is resolved, the import statement might change.
Also, we might want to delay this until 1.6 is released, to avoid conflicts with the bug-fix branch.
cc @justusschock @awaelchli @akihironitta