albermax / innvestigate

A toolbox to iNNvestigate neural networks' predictions!

This is not supposed to happen! exceptions for siamese networks #142

Open RZachLamberty opened 5 years ago

RZachLamberty commented 5 years ago

When constructing a siamese network, I necessarily create an embedding sub-network that I will use twice: receiving two inputs, I pass each through that single embedding network and then combine the two embeddings with a Subtract layer. A rough pseudo-code example:

# imports assumed for this sketch (standalone Keras; tf.keras works the same)
from keras.layers import Dense, Input, Subtract
from keras.models import Model, Sequential

# N is the input dimensionality (left symbolic, as in the original sketch)
embedding_model = Sequential()
embedding_model.add(Dense(256, activation='relu', input_shape=(N,)))
embedding_model.add(Dense(8, activation='relu'))

input_1 = Input((N,))
input_2 = Input((N,))
embedding_1 = embedding_model(input_1)
embedding_2 = embedding_model(input_2)

delta = Subtract()([embedding_1, embedding_2])

prediction = Dense(1, activation='sigmoid')(delta)

model = Model(inputs=[input_1, input_2], outputs=prediction)

model.compile(
    loss='binary_crossentropy',
    optimizer='adam',
    metrics=['accuracy', 'binary_crossentropy']
)

Perhaps there are multiple things going wrong here that result in a This is not supposed to happen! exception. For what it's worth, I've seen from other issues (e.g. https://github.com/albermax/innvestigate/issues/97) that using pre-built models as components of the model object passed to innvestigate.create_analyzer can lead to a "This is not supposed to happen!" exception. The fix in that issue is to avoid creating these models as a separate layer. Perhaps I am running into the same issue?
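For completeness, the exception comes up along roughly this path (a sketch only; the analyzer name 'gradient' and the input batches x1_batch / x2_batch are placeholders, not from this issue):

import innvestigate

# hand the compiled siamese model to iNNvestigate
analyzer = innvestigate.create_analyzer('gradient', model)

# the exception appears when the analyzer processes a model that re-applies a nested sub-model
analysis = analyzer.analyze([x1_batch, x2_batch])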

RZachLamberty commented 5 years ago

I should clarify that this is an embedding in the general sense -- a learned representation -- and not an application of the Keras Embedding layer type.

albermax commented 5 years ago

Hi,

Yes, you are right, this is the same issue as #97, which is unfortunately still not fixed. You can reuse the layers, but "applying" a model like:

embedding_1 = embedding_model(input_1)

will not work at the moment. Sorry for the inconvenience.
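To make the distinction concrete, an untested sketch (shared_dense is just an illustrative layer name):

shared_dense = Dense(8, activation='relu')

# reusing a plain *layer* on two inputs works with the analyzer
out_1 = shared_dense(input_1)
out_2 = shared_dense(input_2)

# calling a whole *model* as if it were a layer is what currently breaks
emb_1 = embedding_model(input_1)
emb_2 = embedding_model(input_2)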

Does this help you?

Cheers, Max

RZachLamberty commented 5 years ago

I have not yet been able to implement the same sort of fix, namely something like using .get_output_at to pull the output tensors from those models. That is, instead of

embedding_1 = embedding_model(input_1)
embedding_2 = embedding_model(input_2)

delta = Subtract()([embedding_1, embedding_2])

trying something like

embedding_1 = embedding_model(input_1)
embedding_2 = embedding_model(input_2)

# pull the last layers of the shared embedding model by name
last_embedding_1_layer = embedding_model.get_layer(lname1)
last_embedding_2_layer = embedding_model.get_layer(lname2)

delta = Subtract()([last_embedding_1_layer.get_output_at(0),
                    last_embedding_2_layer.get_output_at(0)])

Is that what you are suggesting?

albermax commented 5 years ago

Hmm, no. I had something like this in mind:

embedding_model_ = Sequential()
embedding_model_.add(Dense(256, activation='relu', input_shape=(N,)))
embedding_model_.add(Dense(8, activation='relu'))

def embedding_model(x):
    # apply the shared layers one by one instead of calling the model itself
    tmp = embedding_model_.layers[0](x)
    tmp = embedding_model_.layers[1](tmp)
    return tmp

input_1 = Input((N,))
input_2 = Input((N,))
embedding_1 = embedding_model(input_1)
embedding_2 = embedding_model(input_2)

delta = Subtract()([embedding_1, embedding_2])

prediction = Dense(1, activation='sigmoid')(delta)

model = Model(inputs=[input_1, input_2], outputs=prediction)

model.compile(
    loss='binary_crossentropy',
    optimizer='adam',
    metrics=['accuracy', 'binary_crossentropy']
)

I did not test it. Hope it works.
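The same unrolling can also be written as a loop over the sub-network's layers, which keeps working if the embedding grows (equally untested):

def embedding_model(x):
    # call each shared layer in order instead of calling the Sequential model itself
    for layer in embedding_model_.layers:
        x = layer(x)
    return x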

RZachLamberty commented 5 years ago

One complication: how can I recycle the learned weights from these layers? I think that the learned weights from the embedding_model will "come along" by default in your proposed method, but what about those for the Dense layer?

albermax commented 5 years ago

Your final model should contain all relevant layers. When you create a Model with inputs and outputs, Keras traces the execution graph and collects all weights that can be learned.
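If the embedding sub-network was already trained on its own, its weights can also be copied into the shared layers explicitly; a minimal sketch, assuming a hypothetical previously trained model named trained_embedding with the same layer order:

# transfer learned weights layer by layer (trained_embedding is a placeholder name)
for src, dst in zip(trained_embedding.layers, embedding_model_.layers):
    dst.set_weights(src.get_weights())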