dotnet / TorchSharp

A .NET library that provides access to the library that powers PyTorch.

missing stuff in torch.nn.utils #1243

Open yueyinqiu opened 9 months ago

yueyinqiu commented 9 months ago

shaltielshmid commented 9 months ago

Hi @yueyinqiu! Thanks for writing up the diff. I went through all the methods/modules that are missing in TorchSharp, and they are all pure PyTorch code that doesn't appear in LibTorch (the underlying C++ library). This means we would need to rewrite all of these methods ourselves in TorchSharp, which can take time to do properly. Do you have any specific methods that are more important for you in the short term? If so, I can dedicate some time to porting those over first.

All contributions are more than welcome, so if you want to port some of the functions as well, that would be great!

[Side note: the clip_grad_norm function in PyTorch is deprecated and just calls clip_grad_norm_, which already exists in TorchSharp.]
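
For reference, a minimal TorchSharp sketch of calling that existing clip_grad_norm_ overload; the toy model, shapes, and max-norm value here are made up for illustration:

```csharp
using TorchSharp;
using static TorchSharp.torch;

// Toy model and optimizer (hypothetical values, for illustration only).
var model = nn.Linear(10, 1);
var optimizer = optim.SGD(model.parameters(), 0.01);

var input = randn(32, 10);
var target = randn(32, 1);

optimizer.zero_grad();
var loss = nn.functional.mse_loss(model.forward(input), target);
loss.backward();

// Clip gradients in place before the optimizer step (max norm 1.0 here);
// the method returns the total norm of the gradients.
var totalNorm = nn.utils.clip_grad_norm_(model.parameters(), 1.0);
optimizer.step();
```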

yueyinqiu commented 9 months ago

Well, in fact I'm just a newbie to deep learning, and I was trying to reproduce someone else's work that used spectral_norm. But I later found that they skipped all the spectral_norms in their final configuration, so it's not that urgent; I'm just making a note here.

Actually, I'd like to contribute to the project; it's so great to be able to use C# instead of Python. However, I'm afraid my unfamiliarity with deep learning and PyTorch would mess everything up :(

(Please feel free to edit my list, e.g. just remove clip_grad_norm here if you think that's appropriate.)

yueyinqiu commented 8 months ago

I've tried to add fuse_conv_bn_weights and fuse_linear_bn_weights in #1262. Could you please take a look and give me some advice?
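
For anyone reading along, here is a rough sketch of the math these helpers implement, folding an eval-mode BatchNorm into the preceding Linear layer. The FuseLinearBn name and signature are made up for illustration and may differ from what #1262 actually exposes:

```csharp
using TorchSharp;
using static TorchSharp.torch;

// Fold an eval-mode BatchNorm into the preceding Linear:
//   scale = gamma / sqrt(running_var + eps)
//   W' = W * scale   (applied per output row)
//   b' = (b - running_mean) * scale + beta
static (Tensor weight, Tensor bias) FuseLinearBn(
    Tensor linearW, Tensor linearB,
    Tensor bnMean, Tensor bnVar, double bnEps,
    Tensor bnGamma, Tensor bnBeta)
{
    var scale = bnGamma * (bnVar + bnEps).rsqrt();
    var fusedW = linearW * scale.unsqueeze(-1);
    var fusedB = (linearB - bnMean) * scale + bnBeta;
    return (fusedW, fusedB);
}
```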

NiklasGustafsson commented 8 months ago

@yueyinqiu -- I just merged your PR. Please edit the to-do list in the first comment to reflect your changes.

yueyinqiu commented 8 months ago

Yes, and thanks a lot!

NiklasGustafsson commented 5 months ago

@yueyinqiu -- I was taking a look at this again -- the fuse_***_eval methods: how commonly used are those? They seem straightforward enough, but how important are they to have?

yueyinqiu commented 5 months ago

Hmm... I'm not sure; I'm not really familiar with deep learning. But implementing them shouldn't cost much, I think?

Are you considering this because of #1259, and hesitating over whether to have generalized types for Convs and BatchNorms? I think those types would be great, but they're really hard to design well, and it wouldn't be much trouble to use overloads if we don't have such general types.

NiklasGustafsson commented 5 months ago

fuse_linear_bn_eval and fuse_conv_bn_eval will be easier to implement once we have the restructured module implementations merged, so I'll hold off until then.
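
To make the discussion concrete, here is a rough sketch of what a fuse_linear_bn_eval port might do, built on the fuse_linear_bn_weights helper merged in #1262. The buffer/parameter names follow PyTorch's conventions, and the call assumes the TorchSharp port mirrors PyTorch's parameter order; treat everything here as an assumption rather than the eventual API:

```csharp
using TorchSharp;
using TorchSharp.Modules;
using static TorchSharp.torch;

// Hypothetical: fuse an eval-mode Linear + BatchNorm1d pair into a single Linear.
// bnEps is taken as an argument since the BN module's eps may not be exposed.
static Linear FuseLinearBnEval(Linear linear, BatchNorm1d bn, double bnEps = 1e-5)
{
    // Assumed parameter order, mirroring PyTorch's fuse_linear_bn_weights:
    // (linear_w, linear_b, bn_running_mean, bn_running_var, bn_eps, bn_w, bn_b).
    var (fusedW, fusedB) = nn.utils.fuse_linear_bn_weights(
        linear.weight, linear.bias,
        bn.get_buffer("running_mean"), bn.get_buffer("running_var"),
        bnEps,
        bn.get_parameter("weight"), bn.get_parameter("bias"));

    // Build a fresh Linear of the same shape and copy the fused values in.
    var fused = nn.Linear(linear.weight.shape[1], linear.weight.shape[0]);
    using (no_grad())
    {
        fused.weight.copy_(fusedW);
        fused.bias?.copy_(fusedB);
    }
    return fused;
}
```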

yueyinqiu commented 5 months ago

I think that would be a good idea.