pytorch / xla

Enabling PyTorch on XLA Devices (e.g. Google TPU)
https://pytorch.org/xla

Op info test for `nn.functional.prelu .. nn.functional.silu` #7561

Closed qihqi closed 1 month ago

qihqi commented 3 months ago

Fix the op info test for `nn.functional.prelu .. nn.functional.silu`.

  1. Find lines 433 to 437 of `test/test_ops.py` and remove the entries from `nn.functional.prelu` through `nn.functional.silu` from `skip_list`.
  2. Run the op info test with `pytest test/test_ops.py`.
  3. Fix the failures (see the sketch after this list for what the test checks).
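
As a rough sketch of what the op info test verifies (this is an illustration, not the test itself): for each op, it compares the result computed on the XLA device against the eager CPU result. Something like the following, assuming `torch_xla` is installed and an XLA device is available, reproduces the kind of check that fails for a broken op such as `nn.functional.prelu`:

```python
# Minimal repro sketch (assumes torch and torch_xla are installed and an XLA
# device is available). It mirrors what the op info test checks: the XLA
# result of an op should match the eager CPU result within tolerance.
import torch
import torch.nn.functional as F
import torch_xla.core.xla_model as xm

device = xm.xla_device()

x = torch.randn(4, 3)
weight = torch.randn(3)  # per-channel slope for prelu

cpu_out = F.prelu(x, weight)
xla_out = F.prelu(x.to(device), weight.to(device))

# The op info test fails when these diverge beyond tolerance.
torch.testing.assert_close(xla_out.cpu(), cpu_out)
```

To iterate faster on a single op, you can usually narrow the run with pytest's `-k` filter, e.g. `pytest test/test_ops.py -k prelu`, provided the test names include the op name.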

Please refer to this guide for how to fix them:

Also refer to these PRs: