Closed tahwaru closed 3 years ago
Yes, of course. The only differences between classification (real labels, what I normally use in the examples) and regression (what you want) are the loss function and the last layer's activation.
In my case, the loss functions are TensorFlow's, so just use whichever one you like (for example, mean_squared_error). And don't apply the real softmax function to the last layer.
Here are some extracts taken from this link.
Classification
from tensorflow.keras.layers import Input, Dense
from tensorflow.keras.models import Model

input_layer = Input(shape=(X.shape[1],))
dense_layer_1 = Dense(15, activation='relu')(input_layer)
dense_layer_2 = Dense(10, activation='relu')(dense_layer_1)
# softmax output: one probability per class
output = Dense(y.shape[1], activation='softmax')(dense_layer_2)
model = Model(inputs=input_layer, outputs=output)
model.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['acc'])
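Note that categorical_crossentropy expects the labels y to be one-hot encoded, matching the softmax output shape (n_samples, n_classes). A minimal sketch of that encoding (the label values here are hypothetical, just for illustration):

```python
import numpy as np

# Hypothetical integer class labels for 3 classes
labels = np.array([0, 2, 1, 2])

# One-hot encode: row i is all zeros except a 1 at column labels[i],
# so y has shape (n_samples, n_classes) as the softmax layer expects
n_classes = labels.max() + 1
y = np.eye(n_classes)[labels]

print(y.shape)  # (4, 3)
```

Keras also provides tf.keras.utils.to_categorical for the same conversion.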
Regression
from tensorflow.keras.layers import Input, Dense
from tensorflow.keras.models import Model

input_layer = Input(shape=(X.shape[1],))
dense_layer_1 = Dense(100, activation='relu')(input_layer)
dense_layer_2 = Dense(50, activation='relu')(dense_layer_1)
dense_layer_3 = Dense(25, activation='relu')(dense_layer_2)
# no activation on the last layer: linear output for regression
output = Dense(1)(dense_layer_3)
model = Model(inputs=input_layer, outputs=output)
model.compile(loss="mean_squared_error", optimizer="adam", metrics=["mean_squared_error"])
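The regression snippet above outputs real values. For complex-valued labels (the case asked about in this issue), one common workaround — assuming you stay with real-valued outputs rather than a fully complex loss — is to split each complex label into its real and imaginary parts, regress two real outputs (Dense(2) instead of Dense(1)), and recombine afterwards. A minimal sketch of that conversion, with made-up label values:

```python
import numpy as np

# Hypothetical complex-valued labels
y_complex = np.array([1 + 2j, -0.5 + 0.25j, 3 - 1j])

# Split into a (n_samples, 2) real-valued target: [Re, Im] per sample,
# so a standard real-valued network with a Dense(2) output can regress it
y_real = np.stack([y_complex.real, y_complex.imag], axis=1)

# After prediction, recombine the two real columns into complex values
y_back = y_real[:, 0] + 1j * y_real[:, 1]

assert np.allclose(y_back, y_complex)
```

Whether CVNN supports a complex-valued loss end to end is a separate question; this split-and-recombine approach only requires standard real-valued Keras layers.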
Thanks!!! Let me close the issue for the moment. I will run some tests and get back to you.
Hi Barrachina,
Thank you for providing the CVNN library.
I would like to train on complex-valued datasets with complex-valued labels. Is that possible? If so, could you please explain how?