I did some experimenting with KAN over the last few days, and I found it difficult to tune the hyperparameters when fitting multi-level functions, such as those that compose multiplication with other operations. The figure below shows the function I want to fit. The smallest structure that can represent it is clearly [2,2,2,1]: perform the first operation, then the multiplication. I am now in a situation where the numerical fit is good (as shown), but the extracted symbolic expression is wrong, even when I use the strategy of training a larger network and pruning it.
![image](https://github.com/KindXiaoming/pykan/assets/140362283/8514030b-715f-40de-9fe9-eb10c45b7f13)
To solve this problem, my idea is hierarchical learning, which is the opposite of pruning: first use a [2,2,1] network to learn the first operation, then add neurons to the network to learn the second operation. I think this is achievable because KAN is good at continual learning, and it seemed like this should make deep composite functions easier to express, but I haven't figured out how to implement it. If anyone has ideas, please discuss them with me. Thank you!