dotnet / TorchSharp

A .NET library that provides access to the library that powers PyTorch.
MIT License

Operations requested for DiffSharp #140

Closed gbaydin closed 4 years ago

gbaydin commented 4 years ago

Hi all, thank you for all the awesome work in this repo!

I would like to open this issue to keep a running list of some operations we will need for DiffSharp for which @dsyme has been implementing a TorchSharp-based backend.

Needed operations:

From Torch.Tensor:

Others:

dsyme commented 4 years ago

OK!

Assuming we get the entire API generated, we should then look again at DiffSharp operation extensibility: if there are thousands of things available in TorchSharp, then you and other users will want to map those operations up into DiffSharp (without modifying the internals of DiffSharp, RawTensor, etc.).

dsyme commented 4 years ago

@gbaydin I've added these in #155

In some places only the in-place operations were available, and I'm not sure they are entirely equivalent to their Python counterparts, but they should be enough to implement on top of, I believe.
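Implementing an out-of-place operation on top of an in-place one generally means cloning first and then mutating the copy. A minimal Python sketch of the pattern (using standard PyTorch calls; TorchSharp exposes analogous `clone`/`add_` bindings):

```python
import torch

def add_functional(t, other):
    # Emulate the out-of-place torch.add using only the in-place
    # variant: copy the tensor, then mutate the copy.
    result = t.clone()
    result.add_(other)
    return result

x = torch.tensor([1.0, 2.0])
y = add_functional(x, 10.0)
assert torch.equal(y, torch.tensor([11.0, 12.0]))
assert torch.equal(x, torch.tensor([1.0, 2.0]))  # original is unchanged
```

The extra clone costs an allocation, but it keeps the wrapper semantically equivalent to the functional form.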

I couldn't find C++ equivalents of:

`torch.cuda.manual_seed`
`torch.cuda.manual_seed_all`
dsyme commented 4 years ago

Merged

gbaydin commented 4 years ago

> I couldn't find C++ equivalents of `torch.cuda.manual_seed` `torch.cuda.manual_seed_all`

OK, it looks like the behavior was changed some time last year so that `torch.manual_seed` takes care of setting the seeds for both CPU and CUDA. https://discuss.pytorch.org/t/difference-between-torch-manual-seed-and-torch-cuda-manual-seed/13848/4

It would be good to test this to ensure it behaves as expected.
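A quick reproducibility check along those lines can be sketched in Python (assuming a recent PyTorch; the CUDA branch only runs when a GPU is present):

```python
import torch

def reproducible_sample(seed, device="cpu"):
    # torch.manual_seed seeds the CPU RNG and, on recent PyTorch
    # versions, all CUDA RNGs as well.
    torch.manual_seed(seed)
    return torch.randn(3, device=device)

a = reproducible_sample(42)
b = reproducible_sample(42)
assert torch.equal(a, b)  # same seed -> identical draws on CPU

if torch.cuda.is_available():
    ga = reproducible_sample(42, device="cuda")
    gb = reproducible_sample(42, device="cuda")
    assert torch.equal(ga, gb)  # and on CUDA, via the same single call
```

The equivalent check against the TorchSharp binding would confirm whether the single seed call covers both device RNGs there too.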