Jinfeng-Xu / FKAN-GCF

MIT License
24 stars 6 forks source link

What do you mean by efficiency in the context of your work? #1

Closed cooma04 closed 3 months ago

cooma04 commented 3 months ago

Hi, I am wondering what is meant by efficiency (or efficient feature transformation) in your work. Do you achieve better efficiency (computation, parameters, memory usage, etc.) in FKAN-GCF than when you don't use KAN in GCNs?

Also, in your paper, can you please expand on "The Fourier transform has a significant advantage in computational efficiency and solves the training difficulty caused by the spline function."? I am not able to find any discussion of the Fourier transform in the paper.

Thanks in advance.

Jinfeng-Xu commented 3 months ago

> Hi, I am wondering what it means by efficiency (or efficient feature transformation) in your work? Do you achieve better efficiency (computation, parameters, memory usage, etc.) in FKAN-GCF than when you don't use KAN in GCNs?
>
> Also, in your paper work, Can you please expand on "The Fourier transform has a significant advantage in computational efficiency and solves the training difficulty caused by the spline function."? because I am not able to see you touching on Fouier Transform anywhere.
>
> Thanks in advance.

Many thanks for your questions, and sorry for the unclear statement. FourierKAN uses a Fourier series ✅, not a Fourier transform ❌. The description in the arXiv paper will be updated in the near future.

LightGCN achieves more efficient and effective results by removing the nonlinear activation and feature transformation components of NGCF. As the LightGCN paper claims, NGCF has stronger representation power but is harder to train. In terms of performance, both KAN and FourierKAN achieve more competitive results than NGCF, and they are also more flexible. For example, with FourierKAN we are free to choose a smaller grid size when training on relatively sparse datasets.
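To make the "grid size" knob concrete, here is a minimal NumPy sketch of a FourierKAN-style layer. This is an illustration of the general idea (each edge function parameterised as a truncated Fourier series, with the grid size being the number of frequencies), not the repo's actual implementation; all names and shapes here are assumptions.

```python
import numpy as np

def fourier_kan_layer(x, coeffs_cos, coeffs_sin):
    """Hypothetical minimal FourierKAN-style layer (illustration only).

    x:          (batch, in_dim) inputs
    coeffs_cos: (grid, in_dim, out_dim) cosine coefficients a_k
    coeffs_sin: (grid, in_dim, out_dim) sine coefficients b_k

    Each edge function is a truncated Fourier series
        phi(x) = sum_k a_k * cos(k x) + b_k * sin(k x),
    and each output sums the edge functions over input dims (KAN-style).
    The grid size is simply the number of frequencies k = 1..grid.
    """
    grid = coeffs_cos.shape[0]
    k = np.arange(1, grid + 1).reshape(grid, 1, 1)   # frequencies 1..grid
    kx = k * x.T[None, :, :]                         # (grid, in_dim, batch)
    # Sum over frequencies g and input dims i for each batch b, output o.
    out = np.einsum('gib,gio->bo', np.cos(kx), coeffs_cos)
    out += np.einsum('gib,gio->bo', np.sin(kx), coeffs_sin)
    return out

# Usage: a small grid (3 frequencies) as one might pick for a sparse dataset.
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))
a = rng.normal(size=(3, 8, 16)) * 0.1
b = rng.normal(size=(3, 8, 16)) * 0.1
y = fourier_kan_layer(x, a, b)
print(y.shape)  # (4, 16)
```

Shrinking the grid directly shrinks the parameter count (2 × grid × in_dim × out_dim), which is one way the "smaller grid size on sparse data" trade-off shows up.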

Both KAN-GCF and FourierKAN-GCF have fewer parameters than NGCF. However, the reason we chose FourierKAN over KAN is that KAN converges more slowly in the recommendation scenario. FourierKAN alleviates this problem somewhat, but further exploration is still needed.

As mentioned in our repo, there have been doubts about the usefulness of the feature transformation part of GCNs in the recommendation domain, which fits well with our thinking about the scenarios in which KAN is more effective than an MLP. More interesting experiments and findings remain to be explored!

Fourier coefficients are generally easier to train than the spline functions of standard KAN. Thanks to KAN and FourierKAN.
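One way to read this claim (our interpretation, not a statement from the paper): with a truncated Fourier parameterisation of each edge function,

$$\varphi(x) = \sum_{k=1}^{G}\bigl(a_k\cos(kx) + b_k\sin(kx)\bigr), \qquad \frac{\partial\varphi}{\partial a_k} = \cos(kx), \quad \frac{\partial\varphi}{\partial b_k} = \sin(kx),$$

every coefficient receives a bounded, globally supported gradient for any input $x$. B-spline basis functions, by contrast, are locally supported, so for a given input only the few coefficients whose knot intervals cover $x$ receive gradient signal, which can slow convergence.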