Open Rome1453 opened 5 months ago
The sigmoid can be used. Since the KAN module provides a `forward` method, it can be used much like the old `nn.Linear()` without too many problems:

```python
import torch
from torch import nn
from kan import KAN

class Model(nn.Module):
    def __init__(self, dim):  # dim: number of input features
        super(Model, self).__init__()
        # layer widths: [input, hidden, output]
        self.KAN = KAN([dim, dim * 2 - 1, 1])

    def forward(self, x):
        output = self.KAN(x)
        output = nn.Sigmoid()(output)   # squash output into (0, 1)
        output = torch.squeeze(output)
        return output
```
After creating the code in this way, we proceeded to train and the classification learning went well on our dataset!
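The training loop itself isn't shown in the comment above. A minimal sketch of binary-classification training with a sigmoid-output model like this one, using hypothetical toy data, might look as follows (nn.Linear stands in for `KAN([dim, dim*2-1, 1])` here only so the sketch runs without pykan installed; substitute the KAN-based Model in practice):

```python
import torch
from torch import nn

class Model(nn.Module):
    def __init__(self, dim):
        super().__init__()
        # Stand-in layer; in practice: self.net = KAN([dim, dim * 2 - 1, 1])
        self.net = nn.Linear(dim, 1)

    def forward(self, x):
        # Sigmoid squashes the output into (0, 1) so it reads as a probability
        return torch.squeeze(torch.sigmoid(self.net(x)))

# Hypothetical toy dataset: 64 samples, 4 features, binary 0/1 labels
torch.manual_seed(0)
X = torch.randn(64, 4)
y = (X.sum(dim=1) > 0).float()

model = Model(dim=4)
criterion = nn.BCELoss()  # expects probabilities in (0, 1), hence the sigmoid
optimizer = torch.optim.Adam(model.parameters(), lr=1e-2)

for epoch in range(100):
    optimizer.zero_grad()
    loss = criterion(model(X), y)
    loss.backward()
    optimizer.step()

# Threshold at 0.5 for hard 0/1 predictions
preds = (model(X) > 0.5).float()
accuracy = (preds == y).float().mean().item()
```

Because the model already ends in a sigmoid, `nn.BCELoss` is the matching loss; with a raw (unsquashed) output you would use `nn.BCEWithLogitsLoss` instead.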
@pop756 Thank you for your reply. I'm new to torch, so I'm not very skilled; I'll give your approach a try. Thank you!
Hey. Can you make available the rest of the code to train the model please? Thank you!
I notice that you present a classification problem in example3 but treat it as a regression problem. My question is: is there any way for KAN to limit its output to the range 0-1 in classification problems (like a sigmoid)? If not, is the situation in example3 more akin to treating the different labels as consecutive values? When I tested on my data, I found that as the noise in the samples grew larger, the curve of the fitted formula resembled neither the distribution of the samples nor the decision boundary, so I'm confused. I'm new to deep learning, so I would appreciate it if you could point out any mistakes in my understanding. Thanks!
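(On the first question: applying a sigmoid to any unbounded, regression-style output does squash it into (0, 1), which can then be read as a class probability. A tiny illustration, independent of KAN, with made-up raw outputs:)

```python
import torch

# Hypothetical unbounded "regression" outputs from a model
raw = torch.tensor([-100.0, -2.0, 0.0, 2.0, 100.0])

# sigmoid(x) = 1 / (1 + exp(-x)) always lands strictly in (0, 1)
probs = torch.sigmoid(raw)

# Threshold at 0.5 for a hard 0/1 class decision
labels = (probs > 0.5).long()
```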