yoyololicon / pytorch-NMF

A pytorch package for non-negative matrix factorization.
https://pytorch-nmf.readthedocs.io/
MIT License
223 stars 24 forks

NMF seems not to work #22

Closed WhiteNightSleepless closed 2 years ago

WhiteNightSleepless commented 2 years ago

Hello, when I used the NMF function, I found that the reconstruction matrix WH doesn't seem to fit the original matrix V. The code below is based on the example in the documentation.

```python
import torch
import numpy as np
from torchnmf.nmf import NMF

if __name__ == '__main__':
    V = torch.rand(5, 4)
    print(V)
    model = NMF(V.t().shape, rank=2)
    model.fit(V.t())
    WH = model().t()
    print(WH)
```

The outputs:

```
tensor([[0.9584, 0.4092, 0.5260, 0.4525],
        [0.1331, 0.6484, 0.8545, 0.2078],
        [0.1017, 0.0512, 0.9063, 0.6937],
        [0.2253, 0.0541, 0.3939, 0.3708],
        [0.7961, 0.1136, 0.7192, 0.8420]])
tensor([[0.5522, 0.3117, 0.8422, 0.6400],
        [0.1686, 0.7118, 0.7716, 0.1915],
        [0.4928, 0.0917, 0.5960, 0.5724],
        [0.2908, 0.0594, 0.3562, 0.3377],
        [0.7103, 0.1018, 0.8338, 0.8252]], grad_fn=<...>)
```

You can see that the two matrices are quite different, when they should be approximately equal. I'm also confused that the NMF() constructor doesn't use the original matrix V itself; it only uses its shape. Could you please give me some advice? Thanks.
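(Editor's note, not part of the original thread: the gap the poster observes is expected. A generic 5×4 random matrix has rank 4 almost surely, so no rank-2 factorization of any kind can reproduce it exactly. A minimal NumPy sketch, independent of torchnmf, uses the Eckart–Young theorem to show the best-possible relative error at each rank:)

```python
import numpy as np

rng = np.random.default_rng(0)
V = rng.random((5, 4))  # a generic random matrix has full rank (4) almost surely

# Eckart-Young: the truncated SVD is the best rank-r approximation in
# Frobenius norm, so its residual lower-bounds what any rank-r NMF can achieve.
U, s, Vt = np.linalg.svd(V, full_matrices=False)
errs = {}
for r in (1, 2, 4):
    Vr = (U[:, :r] * s[:r]) @ Vt[:r, :]  # rank-r truncated SVD reconstruction
    errs[r] = np.linalg.norm(V - Vr) / np.linalg.norm(V)
    print(f"rank {r}: best-possible relative error {errs[r]:.4f}")
```

The rank-1 and rank-2 errors are substantial, while the rank-4 error is zero up to floating-point precision; an NMF can only do worse than these bounds, since it additionally constrains W and H to be non-negative.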

WhiteNightSleepless commented 2 years ago

OK, I understand it now. The accuracy of the reconstruction matrix WH depends on the rank. With rank=1 or 2, WH is very different from the original matrix V, but with rank=4 (= min(5, 4)) they are very close. Like these:

```
tensor([[0.6679, 0.3305, 0.2780, 0.1811],
        [0.3547, 0.9869, 0.2612, 0.6273],
        [0.4995, 0.0084, 0.5115, 0.6788],
        [0.5647, 0.2001, 0.4642, 0.1505],
        [0.4068, 0.4331, 0.9313, 0.0144]])
tensor([[0.6682, 0.3298, 0.2782, 0.1814],
        [0.3548, 0.9872, 0.2612, 0.6271],
        [0.4996, 0.0084, 0.5119, 0.6782],
        [0.5642, 0.2017, 0.4626, 0.1510],
        [0.4069, 0.4319, 0.9324, 0.0144]], grad_fn=<...>)
```
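(Editor's note: the rank effect described above can be reproduced without torchnmf. Below is a minimal sketch of NMF using Lee–Seung multiplicative updates for the Frobenius loss, a standard algorithm chosen here for illustration; it is not claimed to be torchnmf's exact implementation. At rank 4 an exact non-negative factorization of a 5×4 non-negative matrix exists trivially, so the fit gets very close, while rank 2 leaves a large residual:)

```python
import numpy as np

def nmf(V, rank, iters=2000, eps=1e-9, seed=0):
    """Minimal NMF via Lee-Seung multiplicative updates (Frobenius loss)."""
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, rank))
    H = rng.random((rank, n))
    for _ in range(iters):
        # Multiplicative updates keep W and H non-negative by construction.
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H

V = np.random.default_rng(1).random((5, 4))
errs = {}
for r in (2, 4):
    W, H = nmf(V, r)
    errs[r] = np.linalg.norm(V - W @ H) / np.linalg.norm(V)
    print(f"rank={r}: relative error {errs[r]:.4f}")
```

As in the thread, the rank-4 reconstruction tracks V closely while the rank-2 one cannot, because a generic 5×4 matrix has rank 4.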