Closed: gbaydin closed this issue 4 years ago.
OK!
Assuming we get the entire API generated, we should then look again at DiffSharp operations extensibility: if there are thousands of operations available in TorchSharp, then you and other users will want to map those operations up into DiffSharp (without modifying the internals of DiffSharp, RawTensor, etc.).
@gbaydin I've added these in #155
In some places only the in-place operations were available, and I'm not sure they are entirely equivalent to their Python counterparts, but I believe they should be enough to implement on top of.
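For reference, a minimal sketch of the usual pattern for building an out-of-place operation on top of an in-place one: clone the input and mutate the copy. This is Python against the PyTorch API rather than TorchSharp, and the function name `relu_outofplace` is just illustrative, not anything from the thread.

```python
import torch

def relu_outofplace(t: torch.Tensor) -> torch.Tensor:
    # Derive an out-of-place op from the in-place variant:
    # copy the input, then mutate the copy in place.
    result = t.clone()
    result.relu_()  # in-place ReLU on the copy; the original t is untouched
    return result
```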
I couldn't find C++ equivalents of:
- torch.cuda.manual_seed
- torch.cuda.manual_seed_all
Merged
> I couldn't find C++ equivalents of torch.cuda.manual_seed, torch.cuda.manual_seed_all
OK, it looks like the behavior was changed some time last year so that torch.manual_seed takes care of setting seeds for both CPU and CUDA: https://discuss.pytorch.org/t/difference-between-torch-manual-seed-and-torch-cuda-manual-seed/13848/4
It would be good to test this to ensure it behaves as expected.
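A minimal reproducibility check along those lines, in Python (the variable names are illustrative, and the CUDA branch assumes a CUDA-capable build):

```python
import torch

# Seed once, draw samples, then re-seed and confirm the streams repeat.
torch.manual_seed(42)
a_cpu = torch.randn(3)
a_gpu = torch.randn(3, device="cuda") if torch.cuda.is_available() else None

torch.manual_seed(42)
b_cpu = torch.randn(3)
b_gpu = torch.randn(3, device="cuda") if torch.cuda.is_available() else None

assert torch.equal(a_cpu, b_cpu)      # CPU stream reproduced
if a_gpu is not None:
    assert torch.equal(a_gpu, b_gpu)  # CUDA stream reproduced as well
```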
Hi all, thank you for all the awesome work in this repo!
I would like to open this issue to keep a running list of some operations we will need for DiffSharp, for which @dsyme has been implementing a TorchSharp-based backend.
Needed operations:

From Torch.Tensor:
- randn (…, so not so important or urgent)
- rand (…, so not so important or urgent)

Others:
- torch.manual_seed
- torch.cuda.manual_seed
- torch.cuda.manual_seed_all
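For anyone mapping these, here is a quick Python sketch of the PyTorch-side semantics of the listed operations; this is not part of the original list, just a reference under the usual PyTorch documentation:

```python
import torch

torch.manual_seed(0)   # seeds the CPU RNG (and, on recent versions, all CUDA RNGs)
x = torch.rand(2, 3)   # uniform samples in [0, 1)
y = torch.randn(2, 3)  # samples from the standard normal distribution

if torch.cuda.is_available():
    torch.cuda.manual_seed(0)      # seed the current CUDA device's RNG
    torch.cuda.manual_seed_all(0)  # seed the RNGs on all CUDA devices
```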