Cytnx-dev / Cytnx

Project Cytnx, A Cross-section of Python & C++, Tensor network library
Apache License 2.0

`cytnx.random.normal()` and `cytnx.random.uniform()` do not work #456

Open pcchen opened 4 weeks ago

pcchen commented 4 weeks ago

For cytnx.random, the in-place versions work:

import cytnx

uT = cytnx.UniTensor.zeros([2,2])
cytnx.random.uniform_(uT, low=-1., high=1.)
cytnx.random.normal_(uT, mean=2., std=3.)

But the following code results in an error:

uTa = cytnx.random.uniform(uT, low=-1., high=1.)
uTa = cytnx.random.normal(uT, mean=2., std=3.)

TypeError: object of type 'cytnx.cytnx.UniTensor' has no len()
jeffry1829 commented 4 weeks ago

Hi, the first argument of the non-inplace version should be the shape of the output tensor (or the number of elements, if the desired output is a 1-D Tensor). Thanks!
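
For example, going by the signatures quoted later in this thread, the non-inplace calls take a shape (or an element count) and return a cytnx.Tensor; a minimal sketch:

import cytnx

Ta = cytnx.random.uniform([2, 2], low=-1., high=1.)   # 2x2 Tensor with uniform entries between -1 and 1
Tb = cytnx.random.normal(4, mean=2., std=3.)          # 1-D Tensor with 4 normally distributed entries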

pcchen commented 4 weeks ago

This means the syntax for the in-place and non-inplace versions is very different, which can be confusing.

pcchen commented 3 weeks ago

If I have a UniTensor uT, how do I add random noise to all of the elements?

pcchen commented 3 weeks ago

It seems that the non-inplace versions only output a cytnx.Tensor:

1. normal(Nelem: int, mean: float, std: float, device: int = -1, seed: int = -1, dtype: int = 3) -> cytnx.cytnx.Tensor
2. normal(Nelem: list[int], mean: float, std: float, device: int = -1, seed: int = -1, dtype: int = 3) -> cytnx.cytnx.Tensor

1. uniform(Nelem: int, low: float, high: float, device: int = -1, seed: int = -1, dtype: int = 3) -> cytnx.cytnx.Tensor
2. uniform(Nelem: list[int], low: float, high: float, device: int = -1, seed: int = -1, dtype: int = 3) -> cytnx.cytnx.Tensor
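
So to end up with a UniTensor, one apparently has to wrap the result, e.g. (a sketch assuming the cytnx.UniTensor(Tensor) constructor, which converts a plain Tensor into an untagged UniTensor):

import cytnx

T = cytnx.random.normal([2, 2], mean=2., std=3.)   # non-inplace call returns a cytnx.Tensor
uT_rand = cytnx.UniTensor(T)                       # assumed: wrap the Tensor as an untagged UniTensor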

If I want to add some noise to an existing UniTensor, I have to do something like this:

uT = cytnx.UniTensor.ones([2,2])
uT_noise = uT.clone()
cytnx.random.uniform_(uT_noise, low=-1., high=1.)
uT = uT + uT_noise

but we don't really need to clone the UniTensor.

It would be useful to have something like

uT_with_noise = cytnx.random.uniform(uT, low=-1., high=1.)

and only the metadata of uT is used.

yingjerkao commented 3 weeks ago

It would be useful to have something like

uT_with_noise = cytnx.random.uniform(uT, low=-1., high=1.)

and only the metadata of uT is used.

To get this behavior, we would probably need something like a NumPy ufunc that operates on elements.
Since the intention of random.uniform is to produce a tensor with random elements, we should stick to the original API. Cloning is not that bad: we still need to create a new tensor from the elements of the old tensor, so the amount of memory access should be similar.

We can always add another API to address this element-wise-noise behavior by wrapping @pcchen's example code.

pcchen commented 3 weeks ago

I am OK with using clone. If we want to add another API, I would suggest cytnx.random.uniform_like(uT, low=..., high=...).
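
Such a uniform_like could presumably just wrap the clone + uniform_ pattern above; a hypothetical sketch (the name and signature are only the suggestion here, not an existing API):

import cytnx

def uniform_like(uT, low, high):
    # hypothetical helper: new UniTensor with uT's metadata but uniformly random elements
    out = uT.clone()                                 # reuse shape/labels/dtype of uT
    cytnx.random.uniform_(out, low=low, high=high)   # overwrite the cloned elements in place
    return out

uT = cytnx.UniTensor.ones([2, 2])
uT_noise = uniform_like(uT, low=-1., high=1.)        # uT itself is left untouched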

ianmccul commented 3 weeks ago

I was going to suggest something like

uT = uT + cytnx.random.uniform(uT.shape, low=-1., high=1.)

following NumPy syntax. That could also distinguish between a Tensor and a UniTensor.
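
With the current API that pattern could look roughly like this, assuming UniTensor.shape() returns the shape as a list and that cytnx.UniTensor(Tensor) wraps a plain Tensor into an untagged UniTensor:

import cytnx

uT = cytnx.UniTensor.ones([2, 2])
# draw a random Tensor of the same shape, wrap it, and add it element-wise
noise = cytnx.UniTensor(cytnx.random.uniform(uT.shape(), low=-1., high=1.))
uT_with_noise = uT + noise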

By the way, I asked ChatGPT about the differences between a Tensor and a UniTensor in cytnx and it gave a pretty good answer!