pytorch / executorch

On-device AI across mobile, embedded and edge for PyTorch
https://pytorch.org/executorch/

Fix ReLU fusion when conv/linear has > 1 user #6894

Closed · mcr229 closed this 1 week ago

mcr229 commented 1 week ago

Summary: Bug in the quantizer: Conv + ReLU is fused even when the preceding conv has more than one user. Conv and ReLU cannot be fused in this case, because the result of the Conv must also be used elsewhere.
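
As a minimal sketch of the problematic pattern (the module and names here are hypothetical, not taken from the PR), the conv output below has two users, so folding the ReLU into the conv would change the value seen by the second user:

```python
import torch

class ConvWithExtraUser(torch.nn.Module):
    """Hypothetical example: the conv result feeds both a ReLU and a second op."""

    def __init__(self):
        super().__init__()
        self.conv = torch.nn.Conv2d(3, 8, kernel_size=3)

    def forward(self, x):
        y = self.conv(x)      # conv output has two users below
        a = torch.relu(y)     # candidate for Conv + ReLU fusion
        b = y + 1.0           # second user: needs the un-clamped conv result
        return a, b
```

On an exported FX graph, the guard this fix implies amounts to only annotating the pair for fusion when the ReLU is the conv node's sole user, e.g. `len(conv_node.users) == 1` (a sketch of the condition, not the quantizer's actual code).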

The XNNPACK delegate handles this case naturally by inserting a clamp node for the ReLU.
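
For intuition (a standalone sketch, not delegate code): a ReLU that cannot be fused can still be lowered as a clamp at zero, which leaves the raw conv output available to its other users.

```python
import torch

y = torch.randn(1, 8, 16, 16)        # stand-in for the conv output
relu_out = torch.relu(y)
clamp_out = torch.clamp(y, min=0.0)  # a clamp at 0 is equivalent to ReLU
assert torch.equal(relu_out, clamp_out)
```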

Reviewed By: digantdesai

Differential Revision: D65989599

pytorch-bot[bot] commented 1 week ago

Helpful Links

See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/executorch/6894

Note: Links to docs will display an error until the docs builds have been completed.

2 Active SEVs

There are 2 currently active SEVs. If your PR is affected, please view them below.

No Failures

As of commit 84cf62e818c13570a29ec9e461393708f3c88511 with merge base 5b4d9bbf4a7d6e23ad2d4a7a575ecc66d588664b: Looks good so far! There are no failures yet.

This comment was automatically generated by Dr. CI and updates every 15 minutes.

facebook-github-bot commented 1 week ago

This pull request was exported from Phabricator. Differential Revision: D65989599
