NEGU93 / cvnn

Library to help implement a complex-valued neural network (cvnn) using tensorflow as back-end
https://complex-valued-neural-networks.readthedocs.io/
MIT License

Complex-valued labels? #5

Closed: tahwaru closed this issue 3 years ago

tahwaru commented 3 years ago

Hi Barrachina,

Thank you for providing the CVNN library.

I would like to train complex-valued datasets with complex-valued labels. Is that possible? If yes, please could you explain how?

NEGU93 commented 3 years ago

Yes, of course. The only differences between classification (real-valued labels, which is what I normally show in the examples) and regression (what you want) are:

  1. Loss function
  2. Activation of the last layer.

In my case, the loss functions are Tensorflow's, so just use whichever one you like (for example, mean_squared_error), and don't add the real-valued softmax activation to the last layer.

Here are some extracts taken from this link.

Classification

from tensorflow.keras.layers import Input, Dense
from tensorflow.keras.models import Model

input_layer = Input(shape=(X.shape[1],))
dense_layer_1 = Dense(15, activation='relu')(input_layer)
dense_layer_2 = Dense(10, activation='relu')(dense_layer_1)
output = Dense(y.shape[1], activation='softmax')(dense_layer_2)

model = Model(inputs=input_layer, outputs=output)
model.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['acc'])

Regression

input_layer = Input(shape=(X.shape[1],))
dense_layer_1 = Dense(100, activation='relu')(input_layer)
dense_layer_2 = Dense(50, activation='relu')(dense_layer_1)
dense_layer_3 = Dense(25, activation='relu')(dense_layer_2)
output = Dense(1)(dense_layer_3)

model = Model(inputs=input_layer, outputs=output)
model.compile(loss="mean_squared_error" , optimizer="adam", metrics=["mean_squared_error"])

tahwaru commented 3 years ago

Thanks!!! Let me close the issue for now. I will run some tests and get back to you.