Artelnics / opennn

OpenNN - Open Neural Networks Library
http://www.opennn.net
GNU Lesser General Public License v3.0

the number of weight parameters #226

Closed Lxy85 closed 2 years ago

Lxy85 commented 2 years ago

Hi, I want to use the MLP model in opennn for federated learning. I need to ensure that even if I use models from different datasets, the number of weight parameters for each layer is the same. Is there any solution? Thank you!

sandstromviktor commented 2 years ago

What do you mean by "use models from different datasets"? I assume you mean using the same architecture for different datasets, and verifying that you are indeed using the same architecture on each federation node. If that's the case:

Just loop through each layer and use the get_parameters_number() function. Assuming you have a neural network called neural_network, you run Tensor<Layer*, 1> layers_pointers = neural_network.get_layers_pointers(); then

for(Index i = 0; i < layers_pointers.size(); i++)
{
    // Parameter count for layer i (weights + biases)
    const Index parameters_number = layers_pointers(i)->get_parameters_number();
    cout << "Layer " << i << ": " << parameters_number << " parameters" << endl;
}

Then compare the values somehow. I haven't verified that this works, but you get the idea of how to go about solving your problem.

Lxy85 commented 2 years ago

@sandstromviktor, thank you. It's like this: I have a dataset of my own containing 3000 samples across five categories. I split it into 4 sub-datasets; each sub-dataset contains all five categories, with 150 samples per category. When training on each sub-dataset, I used neural_network.get_trainable_parameters_number() to get the number of parameters and found that it was different each time. Is there a way to keep this number the same in this situation?

davidge807 commented 2 years ago

Hi @Lxy85

I don't understand why you want to have the same number of parameters in two different models with different numbers of inputs.

It's not possible to keep the number of parameters the same when the input dimensions differ, but you can set the number of neurons in each layer to any value you want. This makes more sense and can be set easily when creating the NeuralNetwork object.

Thank you for your question and interest in OpenNN, David.

sandstromviktor commented 1 year ago

@Lxy85 , I'll answer since I don't think @davidge807 provided a good answer. If you have a dataset of the form dataset = (num_samples, input_dim, label_dim) and you want to split it into 4 pieces such that dataset_n = (num_samples/4, input_dim, label_dim), there must be something wrong with your code, since the input and label dimensions should remain the same when you split the dataset into smaller pieces. Your neural network should take input of the form (batch_size, input_dim) and output (batch_size, label_dim), and should therefore be invariant to the number of data samples you use. Hope this helps.