- Ensure gradients clear out pending AsyncCollectiveTensor in the FSDP Extension (pytorch/pytorch#116122)
- Fix processing of unflattened tensors on the compute stream in the FSDP Extension (pytorch/pytorch#116559)
- Fix FSDP AssertionError on tensor subclasses when setting sync_module_states=True (pytorch/pytorch#117336)
- Fix DCP state_dict not correctly finding the FQN when the leaf module is wrapped by FSDP (pytorch/pytorch#115592)
- Fix OOM when returning an AsyncCollectiveTensor by forcing _gather_state_dict() to be synchronous with respect to the main stream (pytorch/pytorch#118197) (pytorch/pytorch#119716)
- Fix Windows runtime torch.distributed.DistNetworkError: [WinError 32] The process cannot access the file because it is being used by another process (pytorch/pytorch#118860)
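Several of the fixes above concern async collective results that must be explicitly waited on before the caller touches them (e.g. making `_gather_state_dict()` synchronous in #118197). The sketch below models that pattern with a toy `AsyncResult` stand-in; it is illustrative only and does not use the real `AsyncCollectiveTensor` or `_gather_state_dict` from PyTorch.

```python
import threading

class AsyncResult:
    """Toy stand-in for an async collective result (not the real
    PyTorch AsyncCollectiveTensor). The value is only safe to read
    after wait() has joined the background computation."""

    def __init__(self, compute):
        self._value = None
        self._thread = threading.Thread(target=self._run, args=(compute,))
        self._thread.start()

    def _run(self, compute):
        self._value = compute()

    def wait(self):
        # Force synchronization before handing the value back, so the
        # caller never observes an unmaterialized result. This mirrors
        # the idea behind the #118197 fix: waiting on pending results
        # instead of letting them accumulate.
        self._thread.join()
        return self._value

def gather_state_dict(pending):
    # Synchronous with respect to the caller: every pending async
    # result is waited on before the dict is returned.
    return {name: r.wait() for name, r in pending.items()}

pending = {"layer.weight": AsyncResult(lambda: [1.0, 2.0, 3.0])}
state = gather_state_dict(pending)
print(state["layer.weight"])  # [1.0, 2.0, 3.0]
```

The design point is simply that the gather function, not its callers, owns the `wait()` call, so no pending result can escape it.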
Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
- `@dependabot show ignore conditions` will show all of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
Bumps torch from 2.2.0 to 2.2.1.
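In a Dependabot bump like this, the change itself is typically a one-line pin update in the dependency manifest; a hedged illustration (the actual manifest file is not shown in this PR):

```diff
-torch==2.2.0
+torch==2.2.1
```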
Release notes
Sourced from torch's releases.
Commits
- `6c8c5ad` [RelEng] Define BUILD_BUNDLE_PTXAS (#119750) (#119988)
- `f00f0ab` fix compile DTensor.from_local in trace_rule_look up (#119659) (#119941)
- `077791b` Revert "Update state_dict.py to propagate cpu offload (#117453)" (#119995)
- `3eaaeeb` Update state_dict.py to propagate cpu offload (#117453) (#119916)
- `0aa3fd3` HSDP + TP integration bug fixes (#119819)
- `eef51a6` [Inductor] Skip triton templates for mixedmm on SM70- (#118591) (#119894)
- `940358f` [dtensor] fix dtensor _to_copy op for mix precision (#116426) (#119687)
- `24e4751` [state_dict] Calls wait() for the DTensor to_local() result (#118197) (#119692)
- `dcaeed3` [DCP][state_dict] Fix the issue that get_state_dict/set_state_dict ig… (#119807)
- `4f882a5` Properly preserve SymInt input invariant when splitting graphs (#117406) (#11...
Originally posted by @dependabot in https://github.com/Bryan-Roe/semantic-kernel/pull/12