dseinternational / open-dotnet

Libraries and tools supporting DSE Open systems and technologies.
MIT License

Bump TorchSharp from 0.102.7 to 0.102.8 #140

Closed · dependabot[bot] closed this 3 months ago

dependabot[bot] commented 3 months ago

Bumps TorchSharp from 0.102.7 to 0.102.8.

Changelog

Sourced from TorchSharp's changelog.

NuGet Version 0.102.8

Bug Fixes:

#1359 `torch.nn.functional.l1_loss` computes a criterion with the MSE, not the MAE.
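For context, `l1_loss` is meant to compute the mean absolute error. A minimal TorchSharp sketch of the corrected behaviour (assuming the default mean reduction; this example is illustrative, not taken from the PR):

```csharp
using System;
using TorchSharp;
using static TorchSharp.torch;

var input  = torch.tensor(new float[] { 1f, 2f, 3f });
var target = torch.tensor(new float[] { 2f, 4f, 6f });

// With the fix, this computes the MAE: mean(|1-2|, |2-4|, |3-6|) = 2,
// not the MSE mean((1-2)^2, (2-4)^2, (3-6)^2).
var loss = nn.functional.l1_loss(input, target);
Console.WriteLine(loss.item<float>());
```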

NuGet Version 0.102.6

Breaking Changes:

When creating a tensor from a 1-D array and passing in a shape, there is now an ambiguity between the `IList` and `Memory` overloads of `torch.tensor()`. The ambiguity is resolved by removing the dimensions argument if it is redundant, or by an explicit cast to `IList` if it is not.
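The two workarounds described above can be sketched as follows (the overload shapes shown are assumptions based on the changelog wording, not verified signatures):

```csharp
using System.Collections.Generic;
using TorchSharp;
using static TorchSharp.torch;

float[] data = { 1f, 2f, 3f, 4f };

// Redundant shape: simply drop the dimensions argument.
var t1 = torch.tensor(data);                                    // shape [4]

// Non-redundant shape: cast explicitly to IList<float>
// so the compiler picks that overload over the Memory one.
var t2 = torch.tensor((IList<float>)data, new long[] { 2, 2 }); // shape [2, 2]
```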

API Changes:

#1326 Allow arrays used to create tensors to be larger than the tensor. Create tensors from a Memory instance.

Bug Fixes:

#1334 MultivariateNormal.log_prob() exception in TorchSharp but works in pytorch.

NuGet Version 0.102.5

Breaking Changes:

`torchvision.dataset.MNIST` will try more mirrors. The exception thrown when it fails to download MNIST, FashionMNIST or KMNIST may have changed. `ObjectDisposedException` will now be thrown when trying to use a disposed dispose scope. The constructor of dispose scopes is no longer public; use `torch.NewDisposeScope` instead.
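After this change, scopes are created via the factory method rather than a constructor. A minimal sketch (illustrative, not from the changelog itself):

```csharp
using TorchSharp;
using static TorchSharp.torch;

// The dispose-scope constructor is no longer public; create scopes like this:
using (var scope = torch.NewDisposeScope())
{
    var t = torch.rand(3, 3);
    // Tensors created here are registered with the scope
    // and disposed automatically when the scope ends.
}
// Using the scope after it has been disposed now throws ObjectDisposedException.
```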

API Changes:

#1317 How to set default device type in TorchSharp.
#1314 Grant read-only access to `DataLoader` attributes.
#1313 Add 'non_blocking' argument to tensor and module `to()` signatures.
#1291 `Tensor.grad()` and `Tensor.set_grad()` have been replaced by a new property `Tensor.grad`. A potential memory leak caused by `set_grad` has been resolved.
The `Include` method of dispose scopes has been removed; use `Attach` instead.
Two more `Attach` methods that accept `IEnumerable<IDisposable>`s and arrays as the parameter have been added to dispose scopes.
A new property `torch.CurrentDisposeScope` has been added to provide the ability to get the current dispose scope.
Module hooks that take no input/output arguments, just the module itself, have been added.
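The #1291 and #1313 changes above can be sketched together (the `grad` property and `non_blocking` parameter names come from the changelog; the exact `rand`/`to` signatures are assumptions):

```csharp
using TorchSharp;
using static TorchSharp.torch;

var x = torch.rand(3, requires_grad: true);
var y = (x * x).sum();
y.backward();

// Tensor.grad() / Tensor.set_grad() are replaced by a property (#1291):
var g = x.grad;

// 'non_blocking' argument on to() (#1313); the target device is illustrative:
var xCpu = x.to(torch.CPU, non_blocking: true);
```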

Bug Fixes:

#1300 Adadelta, Adam and AdamW will no longer throw NullReferenceException when maximize is true and grad is null.
`torch.normal` will now correctly return a leaf tensor.
New options `disposeBatch` and `disposeDataset` have been added to `DataLoader`. The default collate functions will now always dispose the intermediate tensors rather than wait for the next iteration.
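A hypothetical sketch of the new `DataLoader` options. Only the option names `disposeBatch` and `disposeDataset` come from the changelog; the constructor shape and the `TensorDataset` helper shown here are assumptions:

```csharp
using TorchSharp;
using static TorchSharp.torch;
using static TorchSharp.torch.utils.data;

// Assumed constructor shape; check the TorchSharp API for exact parameters.
var dataset = TensorDataset(torch.rand(10, 3));
using var loader = new DataLoader(dataset, batchSize: 4, shuffle: true,
    disposeBatch: true,     // dispose batch tensors after each iteration
    disposeDataset: true);  // dispose the dataset along with the loader
```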

Bug Fixes:

TensorDataset will now keep the aliases detached from dispose scopes, to avoid unexpected disposal.

... (truncated)

Commits


Dependabot compatibility score

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
- `@dependabot show ignore conditions` will show all of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)