Closed. 2niuhe closed this issue 1 month ago.
Yes, absolutely. They are on our list.
On Thu, Aug 29, 2024 at 7:51 PM niu_he wrote:
Please describe your question
Hi team,
I recently noticed that only the basic functional kernels of PyTorch operators have been patched. Is there a plan to also support the in-place and out= variants, such as abs_ and abs with an out argument?
Here's an example:
In [12]: a = torch.randn(3, 3, device='cuda')
In [13]: b = torch.empty(3, 3, device='cuda')
In [14]: logging.debug = print
In [15]: with flag_gems.use_gems():
    ...:     torch.abs(a)
    ...:
GEMS ABS
In [16]: with flag_gems.use_gems():
    ...:     torch.abs_(a)
    ...:
In [17]: with flag_gems.use_gems():
    ...:     torch.abs(a, out=b)
    ...:
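To illustrate the gap the session above demonstrates, here is a hypothetical, torch-free sketch of the monkey-patching pattern a context manager like use_gems() would need in order to cover all three variants of an operator: the functional form, the in-place form (abs_), and the out= form. All names here (make_ops, gems_abs, gems_abs_) are invented for illustration and are not FlagGems' actual implementation.

```python
# Hypothetical sketch, NOT FlagGems internals: a context manager that
# patches an operator together with its in-place ("op_") and out= variants,
# and restores the originals on exit.

import contextlib
import types


def make_ops():
    """Toy stand-in for a torch-like namespace exposing abs, abs_, and out=."""
    ns = types.SimpleNamespace()
    ns.abs = lambda x, out=None: [abs(v) for v in x]

    def abs_(x):
        for i, v in enumerate(x):
            x[i] = abs(v)
        return x

    ns.abs_ = abs_
    return ns


calls = []  # records which replacement kernels were actually hit


def gems_abs(x, out=None):
    """Replacement kernel covering both the functional and out= forms."""
    calls.append("GEMS ABS")
    result = [abs(v) for v in x]
    if out is not None:
        out[:] = result  # write into the caller-provided buffer
        return out
    return result


def gems_abs_(x):
    """Replacement kernel for the in-place form."""
    calls.append("GEMS ABS_")
    for i, v in enumerate(x):
        x[i] = abs(v)
    return x


@contextlib.contextmanager
def use_gems(ns, patches):
    """Swap in replacements for every listed variant; restore on exit."""
    saved = {name: getattr(ns, name) for name in patches}
    try:
        for name, fn in patches.items():
            setattr(ns, name, fn)
        yield
    finally:
        for name, fn in saved.items():
            setattr(ns, name, fn)


ops = make_ops()
with use_gems(ops, {"abs": gems_abs, "abs_": gems_abs_}):
    ops.abs([-1, 2])            # functional form hits the patched kernel
    ops.abs_([-3, 4])           # in-place form is covered too
    out = [0, 0]
    ops.abs([-5, 6], out=out)   # out= form writes into the buffer
```

The point of registering all three entries in the patch table is exactly the issue raised above: patching only the functional form leaves abs_ and abs(..., out=...) running on the unpatched path.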
(Issue link: https://github.com/FlagOpen/FlagGems/issues/192)