Open · yousifj129 opened 2 months ago
What do you mean? I can try to help with this issue, as I am very interested in this repository and its functionality.
As you can see, we have a list of activation functions the user can choose from:

```python
self.activation = QComboBox()
self.activation.addItems(["relu", "sigmoid", "softmax", "tanh", "linear",
                          "softplus", "softsign", "selu", "elu", "exponential",
                          "leaky_relu", "relu6", "silu", "hard_silu", "gelu",
                          "hard_sigmoid", "mish", "log_softmax"])
layer_layout.addWidget(QLabel("Activation Function:"))
layer_layout.addWidget(self.activation)
```
and in the `create_model` function:

```python
def create_model(self):
    self.model = keras.Sequential()
    self.model.add(keras.layers.Input(shape=(self.input_dim,)))
    activ = self.activation.currentText()
    for i in range(self.layer_list.count()):
        layer_str = self.layer_list.item(i).text()
        layer_type, params = layer_str.split(': ')
        if layer_type == "Dense":
            units = int(params)
            self.model.add(keras.layers.Dense(units, activation=activ))
        elif layer_type == "Conv1D":
            filters, kernel_size = map(int, params.split(','))
            self.model.add(keras.layers.Conv1D(filters, kernel_size, activation=activ))
        elif layer_type == "LSTM":
            units = int(params)
            self.model.add(keras.layers.LSTM(units, activation=activ))
        elif layer_type == "Dropout":
            rate = float(params)
            self.model.add(keras.layers.Dropout(rate))
    self.model.add(keras.layers.Dense(self.output_dim))
```
We read the currently selected activation function once, but then apply that same one to every layer, which is a problem. We should keep some list variable that stores each layer's activation function, so each layer can have its own activation; a sketch of that idea follows.
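Here is a minimal, self-contained sketch of that idea. It folds the activation into each layer entry using a "Type: params | activation" string format; both that format and the `build_model` helper are my own illustration for this issue, not code from the repository:

```python
import keras

def build_model(layer_specs, input_dim, output_dim):
    """Build a Sequential model from specs like "Dense: 64 | relu",
    where each entry carries its own activation name (assumed format)."""
    model = keras.Sequential()
    model.add(keras.layers.Input(shape=(input_dim,)))
    for layer_str in layer_specs:
        spec, activ = layer_str.rsplit(' | ', 1)   # per-layer activation
        layer_type, params = spec.split(': ')
        if layer_type == "Dense":
            model.add(keras.layers.Dense(int(params), activation=activ))
        elif layer_type == "Conv1D":
            filters, kernel_size = map(int, params.split(','))
            model.add(keras.layers.Conv1D(filters, kernel_size, activation=activ))
        elif layer_type == "LSTM":
            model.add(keras.layers.LSTM(int(params), activation=activ))
        elif layer_type == "Dropout":
            model.add(keras.layers.Dropout(float(params)))  # Dropout has no activation
    model.add(keras.layers.Dense(output_dim))
    return model

# Each entry remembers the activation chosen when that layer was added:
model = build_model(["Dense: 64 | relu", "Dropout: 0.2 | linear", "Dense: 32 | tanh"],
                    input_dim=10, output_dim=1)
model.summary()
```

In the GUI, `create_model` would do the same `rsplit` when reading entries from `self.layer_list`, and the single `activ = self.activation.currentText()` line would go away.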
I will test a few things and see if I can help with this.
EDIT: Is this built for Windows? If so, I'm sorry, but I run Linux.
I think it should run just fine on Linux; it's just TensorFlow and PySide6.
Currently the user can only choose one activation function, applied to all layers; the user should be able to choose a different activation function for each layer.
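A hedged UI-side sketch of how that could look with PySide6: the activation combo box selection is frozen into the list entry at the moment the user clicks the add button, so every layer remembers its own activation. The `LayerPicker` class, `layer_units` field, and `add_layer` handler are hypothetical names for illustration, not the repository's actual widgets:

```python
import sys
from PySide6.QtWidgets import (QApplication, QComboBox, QLineEdit, QListWidget,
                               QPushButton, QVBoxLayout, QWidget)

class LayerPicker(QWidget):
    def __init__(self):
        super().__init__()
        layout = QVBoxLayout(self)
        self.activation = QComboBox()
        self.activation.addItems(["relu", "sigmoid", "tanh", "linear"])
        self.layer_units = QLineEdit("64")   # hypothetical per-layer units field
        self.layer_list = QListWidget()
        add_btn = QPushButton("Add Dense layer")
        add_btn.clicked.connect(self.add_layer)
        for w in (self.activation, self.layer_units, add_btn, self.layer_list):
            layout.addWidget(w)

    def add_layer(self):
        # Capture the current combo selection per entry, instead of reading
        # self.activation.currentText() once for all layers in create_model().
        self.layer_list.addItem(
            f"Dense: {self.layer_units.text()} | {self.activation.currentText()}")

if __name__ == "__main__":
    app = QApplication(sys.argv)
    picker = LayerPicker()
    picker.show()
    sys.exit(app.exec())
```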