pplonski / keras2cpp

This is a bunch of code to port a Keras neural network model into pure C++.
MIT License

Model with functional API #16

Open blackarrow3542 opened 7 years ago

blackarrow3542 commented 7 years ago

Hi, this is really great work! I just want to share something I found that might be useful to others. In order for a model to be dumped correctly, we need to build it as a Sequential model and add each Activation layer separately. For example, the second method below gets dumped correctly, while the first method produces a dumped model with no Activation layers.

from keras.models import Sequential, Model
from keras.layers import Input, Dense, Dropout, Activation

# Activations passed via the `activation` keyword -- these do not show up
# as separate entries in model.layers, so the dump misses them.
def get_model_by_sequential():
    model = Sequential()
    model.add(Dense(64, input_dim=15, init='uniform', activation='relu'))
    model.add(Dense(128, init='uniform', activation='relu'))
    model.add(Dense(256, init='uniform', activation='relu'))
    model.add(Dense(1, activation='sigmoid'))
    return model

# Activations added as separate layers -- this variant dumps correctly.
def get_model_by_sequential_with_separate_activation():
    model = Sequential()
    model.add(Dense(64, input_dim=15, init='uniform'))
    model.add(Activation('relu'))
    model.add(Dense(128, init='uniform'))
    model.add(Activation('relu'))
    model.add(Dense(256, init='uniform'))
    model.add(Activation('relu'))
    model.add(Dense(1))
    model.add(Activation('sigmoid'))
    return model

# Functional API (Keras 1 syntax) -- the input shape comes from the Input
# layer, so input_dim on the first Dense is unnecessary. Inline activations
# are used here too, so the dumped model has no Activation layers either.
def get_model_by_functional_API():
    a = Input(shape=(15,))
    b = Dense(64, init='uniform', activation='relu')(a)
    b = Dense(128, init='uniform', activation='relu')(b)
    b = Dense(256, init='uniform', activation='relu')(b)
    b = Dense(1, activation='sigmoid')(b)
    model = Model(input=a, output=b)
    return model
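
One way to see the difference (an illustrative check, not part of the original report) is to compare model.layers for the two Sequential variants: inline activations do not appear as separate entries, and a dump script that walks model.layers will therefore never see them.

# Illustrative check: the inline-activation model exposes only four Dense
# layers, while the separate-activation model exposes eight layers.
if __name__ == '__main__':
    for build in (get_model_by_sequential,
                  get_model_by_sequential_with_separate_activation):
        model = build()
        print(build.__name__,
              [layer.__class__.__name__ for layer in model.layers])
    # Expected output (roughly):
    #   get_model_by_sequential ['Dense', 'Dense', 'Dense', 'Dense']
    #   get_model_by_sequential_with_separate_activation
    #       ['Dense', 'Activation', 'Dense', 'Activation', ...]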
pplonski commented 7 years ago

Thanks for that information! I think it should be easy to handle both situations. Would you like to prepare changes for it?
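
One possible way to handle the inline-activation case (a minimal sketch, not the repository's actual code; iter_layers_with_activations is a hypothetical helper) is to inspect each layer's config while dumping and emit a synthetic Activation layer after any layer that carries an inline activation:

from keras.layers import Activation

# Hypothetical helper: yield the model's layers, inserting a synthetic
# Activation layer after any layer with an inline activation. Assumes
# get_config() reports the activation by name, with 'linear' meaning
# "no extra non-linearity" (true for Dense in this era of Keras).
def iter_layers_with_activations(model):
    for layer in model.layers:
        yield layer
        if layer.__class__.__name__ == 'Activation':
            continue
        activation = layer.get_config().get('activation')
        if activation is not None and activation != 'linear':
            yield Activation(activation)

The dump loop could then iterate this generator instead of model.layers, so both model-building styles would produce the same layer sequence in the generated C++ file.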

blackarrow3542 commented 7 years ago

Hi, I will work on that. I have also added a 'sigmoid' Activation layer for binary classification.
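
For reference, the sigmoid activation the C++ side has to implement is just the logistic function; a quick numpy sketch of the computation (illustrative only, not the actual patch):

import numpy as np

# Logistic sigmoid: squashes the single output unit into (0, 1) so it
# can be read as a binary-class probability.
def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))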