Closed spcyppt closed 3 weeks ago
This pull request was exported from Phabricator. Differential Revision: D61314676
Name | Link
---|---
Latest commit | 504bd957ae10354b90a70451805f7e9b832a830a
Latest deploy log | https://app.netlify.com/sites/pytorch-fbgemm-docs/deploys/66c3789fb91180000881f60d
Deploy Preview | https://deploy-preview-3006--pytorch-fbgemm-docs.netlify.app
This pull request has been merged in pytorch/FBGEMM@81efd37168039997d84bfc0134e1333e2140bc98.
This pull request has been reverted by a4a66611e951b4c4790cb0e1c37aa3232876f9e3.
Summary: X-link: https://github.com/facebookresearch/FBGEMM/pull/99

The `pack_segments` backward op works incorrectly: the input's gradient is wrong when the backward input `grad_out` is not contiguous. This diff ensures the gradients are contiguous. A unit test is also added.
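The failure mode described above is generic to any backward kernel that reads the incoming gradient buffer in raw memory order while assuming it is contiguous. The following is a minimal NumPy sketch of that bug and of the fix pattern (copy the gradient to contiguous memory before the kernel runs); it is illustrative only and is not the FBGEMM code, and the function names are hypothetical.

```python
import numpy as np

def backward_kernel(grad_out):
    # Hypothetical kernel that reads grad_out's underlying buffer in
    # memory order (order='K'), implicitly assuming C-contiguity.
    buf = grad_out.ravel(order="K")
    return buf.reshape(grad_out.shape)

def backward_fixed(grad_out):
    # The fix pattern: force a contiguous copy first (the PyTorch
    # equivalent would be grad_out.contiguous()).
    return backward_kernel(np.ascontiguousarray(grad_out))

g = np.arange(6).reshape(2, 3)
gt = g.T                       # non-contiguous (3, 2) view
bad = backward_kernel(gt)      # wrong element order
good = backward_fixed(gt)      # matches the logical values of gt
```

Here `backward_kernel(gt)` silently returns values in the wrong positions because the transposed view's memory order differs from its logical order, while `backward_fixed` is correct for both contiguous and non-contiguous inputs.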
https://fb.workplace.com/groups/2126278550786248/posts/8368626576551383/
Differential Revision: D61314676