mattjshannon / swsnet

Training neural networks with the Sloan SWS astronomical dataset.

Keras model choice #3

Open mattjshannon opened 6 years ago

mattjshannon commented 6 years ago

How should the choice of layers and activation functions be motivated? Just trial and error, or is there an innate reason to choose one way over another? Note there are 7 label categories for the first classifier I'm testing ("group"): 1239 samples, each a ~350-point vector.

Currently using:

import tensorflow as tf
from tensorflow import keras

# Fully-connected stack: three 64-unit hidden layers, one 32-unit layer,
# and a 7-way softmax output for the "group" classes.
model = keras.Sequential()
model.add(keras.layers.Dense(64, activation='relu', input_dim=359))
model.add(keras.layers.Dense(64, activation='relu'))
model.add(keras.layers.Dense(64, activation='relu'))
model.add(keras.layers.Dense(32, activation='relu'))
model.add(keras.layers.Dense(7, activation='softmax'))

And for our categorical labels, compiling with:

model.compile(optimizer=tf.train.AdamOptimizer(0.0005),
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])
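For reference, sparse_categorical_crossentropy expects integer class labels (0–6 here) rather than one-hot vectors. A minimal numpy sketch of what the loss computes per sample (function name and example values are illustrative, not from the repo):

```python
import numpy as np

def sparse_categorical_crossentropy(logits, labels):
    """Mean cross-entropy for integer labels against raw logits."""
    # Softmax over the class axis (shift by max for numerical stability).
    shifted = logits - logits.max(axis=1, keepdims=True)
    probs = np.exp(shifted) / np.exp(shifted).sum(axis=1, keepdims=True)
    # Negative log-probability of the true class, averaged over samples.
    return -np.mean(np.log(probs[np.arange(len(labels)), labels]))

# Two samples, 7 classes (matching the softmax output layer above).
logits = np.array([[2.0, 0.1, 0.1, 0.1, 0.1, 0.1, 0.1],
                   [0.0, 0.0, 5.0, 0.0, 0.0, 0.0, 0.0]])
labels = np.array([0, 2])
loss = sparse_categorical_crossentropy(logits, labels)
```

The integer-label form avoids materializing a 1239x7 one-hot matrix; with one-hot labels you would use plain categorical_crossentropy instead.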
mattjshannon commented 6 years ago

Current state:

# Sequential model, 7 classes of output.
model = keras.Sequential()
model.add(keras.layers.Dense(64, activation='relu', input_dim=359))
model.add(keras.layers.Dropout(0.5))
model.add(keras.layers.Dense(64, activation='relu'))
model.add(keras.layers.Dropout(0.5))
model.add(keras.layers.Dense(64, activation='relu'))
model.add(keras.layers.Dropout(0.5))
model.add(keras.layers.Dense(7, activation='softmax'))

# Early stopping condition.
callback = [tf.keras.callbacks.EarlyStopping(monitor='acc', patience=4, verbose=0)]

# Recompile model and fit.
model.compile(optimizer=tf.train.AdamOptimizer(0.0005),
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])
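Dropout(0.5) zeroes each hidden unit with probability 0.5 during training and rescales the survivors, then acts as the identity at inference. A minimal numpy sketch of inverted dropout, the variant Keras implements (names here are illustrative):

```python
import numpy as np

def dropout(x, rate, training, rng):
    """Inverted dropout: during training, zero units with probability
    `rate` and scale survivors by 1/(1-rate) so the expected activation
    is unchanged; at inference time, return x untouched."""
    if not training:
        return x
    keep = rng.random(x.shape) >= rate
    return x * keep / (1.0 - rate)

rng = np.random.default_rng(0)
x = np.ones((1000, 64))          # batch of 64-unit activations
y = dropout(x, rate=0.5, training=True, rng=rng)
```

Because the surviving units are scaled up at training time, no rescaling is needed at test time, which is why the fitted model can be used for prediction as-is.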