pytorch / xla

Enabling PyTorch on XLA Devices (e.g. Google TPU)
https://pytorch.org/xla

[RFC] `torch_xla` Backward Compatibility Proposal #8000

Open zpcore opened 2 months ago

zpcore commented 2 months ago

Recently, we have started reducing the torch_xla API footprint in favor of the torch API to improve usability. This RFC focuses on the process for deprecating functions.

Backward compatibility

We propose to offer a six-month (two-release) grace period before completely removing a deprecated API, as shown in the graph below:

[Timeline diagram of the deprecation grace period]

Developers should follow the illustrated timeline and take the following action: for each deprecated API, mention it in release X's release notes, including the suggested new API and the release in which the old API will be completely removed.

If we follow the timeline, the deprecated API remains usable for two releases, during which we guarantee backward compatibility.

Actions to take for deprecation:

GitHub actions for API deprecation

- Before deprecating any API, create a GitHub issue that includes the following details, e.g. the suggested replacement API:
```python
# In torch_xla/runtime.py
def world_size():
  ...
```
- Use the `@mark_deprecated` decorator:
```python
# In torch_xla/core/xla_model.py:
import torch_xla.runtime  # needed so the decorator can reference the replacement
from torch_xla.experimental.deprecation import mark_deprecated

@mark_deprecated(torch_xla.runtime.world_size, extra_msg='xrt_world_size() will be removed in release 2.7.')
def xrt_world_size():
  ...

# In torch_xla/runtime.py, define the new function:
def world_size():
  ...
```
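
For intuition, a decorator like this typically wraps the old function, emits a `DeprecationWarning` pointing at the replacement, and forwards the call. The sketch below is illustrative only and is not the actual `torch_xla.experimental.deprecation` implementation; the name `mark_deprecated_sketch` and the exact warning text are assumptions.

```python
# Illustrative sketch only; not the torch_xla implementation.
import functools
import warnings


def mark_deprecated_sketch(new_fn, extra_msg=''):
  """Hypothetical decorator: warn when the old API is called, then forward to new_fn."""

  def decorator(old_fn):

    @functools.wraps(old_fn)
    def wrapper(*args, **kwargs):
      # Point callers at the replacement and the planned removal release.
      warnings.warn(
          f'{old_fn.__name__} is deprecated; use '
          f'{new_fn.__module__}.{new_fn.__name__} instead. {extra_msg}',
          DeprecationWarning,
          stacklevel=2)
      return new_fn(*args, **kwargs)

    return wrapper

  return decorator
```

During the grace period, existing calls such as `xm.xrt_world_size()` keep working but emit the warning; new code should call `torch_xla.runtime.world_size()` directly.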
zpcore commented 2 months ago

@miladm @ManfeiBai @JackCaoG

ManfeiBai commented 2 months ago

Thanks, LGTM. Let's add this in the 2.5 release.

miladm commented 1 month ago

Can we pin this issue to the top of the issue list on GH? @zpcore

zpcore commented 1 month ago

> Can we pin this issue to the top of the issue list on GH? @zpcore

Issue pinned now.