keras-team / keras

Deep Learning for humans
http://keras.io/
Apache License 2.0

Is it possible to merge two different input layers into one? #95

Closed stephenjia closed 9 years ago

stephenjia commented 9 years ago

For example, how can one concatenate an image embedding and word embeddings into one sequence that is then fed to an LSTM, as in Google's image caption generation paper, http://arxiv.org/pdf/1411.4555v2.pdf ?

fchollet commented 9 years ago

Non-sequential models are not supported at this time. But caption generation can be implemented with a sequential model; for instance see: http://keras.io/examples/#architecture-for-learning-image-captions-with-a-convnet-and-a-gated-recurrent-unit

stephenjia commented 9 years ago

Thanks @fchollet. But I think jointly learning the image embedding, the word embedding, and the neural language model is more natural.

stephenjia commented 9 years ago

I wrote some short code to try that. I want to concatenate two layers' outputs, say A (n by m) and B (n by t by m); A is a 2D array and B is a 3D array. I want to combine them by first reshaping A to (n by 1 by m) and then concatenating the two to get C (n by (t+1) by m). But there is a problem. Could you please give me some hints on this?
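
As a quick shape check of the intended operation, in plain NumPy (illustrative only, not part of the layer itself):

import numpy as np

n, t, m = 2, 3, 4
A = np.zeros((n, m))          # 2D: (n, m)
B = np.zeros((n, t, m))       # 3D: (n, t, m)
C = np.concatenate([A[:, None, :], B], axis=1)  # reshape A to (n, 1, m) first
print(C.shape)                # (2, 4, 4), i.e. (n, t + 1, m)

And here is the layer I wrote: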

import theano
import theano.tensor as T
import numpy as np

from keras.layers.core import Layer

class MultiIncomingLayer(Layer):
    # incoming_layers : a list of Layer instances
    def __init__(self, incoming_layers, axis, **kwargs):
        self.incoming_layers = [None if isinstance(incoming_layer, tuple)
                                else incoming_layer
                                for incoming_layer in incoming_layers]
        self.axis = axis
        # collect the parameters of all incoming layers
        self.params = [p for incoming_layer in incoming_layers
                       for p in incoming_layer.params]

    def _concatenate(self, train):
        layer_inputs = [incoming_layer.output(train=train)
                        for incoming_layer in self.incoming_layers]
        # print(layer_inputs[0].ndim, layer_inputs[1].ndim)
        # reshape the 2D output (n, m) to (n, 1, m) so it can be
        # concatenated with the 3D output (n, t, m)
        layer_inputs[0] = T.reshape(layer_inputs[0],
                                    (layer_inputs[0].shape[0], 1,
                                     layer_inputs[0].shape[1]))
        # the concatenated tensor must be returned; without this return,
        # output() would propagate None downstream
        return T.concatenate(layer_inputs, axis=self.axis)

    def output(self, train):
        return self._concatenate(train)
patyork commented 9 years ago

Preface: I have not read all of your code through, or tried to replicate it myself. But the following is still applicable.

The Sequential model compiles layers sequentially, hence the name. Therefore, if you have something along the lines of the pseudocode below:

model = Sequential
model.add(ImageEmbedding)
model.add(WordEmbedding)
model.add(MultiIncomingLayer)

Then the output of the ImageEmbedding layer is the input to the WordEmbedding layer, and so on. As such, your concatenate layer would get the wrong dimensionality. Unless you modified the Sequential model (and did not provide that code) or created a new model type, we can't pinpoint the "problem"; not to mention that you did not provide any info about what the problem actually was.

I can give you some hints, though I would not recommend jumping in:

stephenjia commented 9 years ago

Thanks @patyork. Sorry, I forgot to provide the code showing how I use this layer.

model = Sequential()

# image embedding
img_emb_layer = Dense(feat_size, 256)

# word embedding
word_emb_layer = Embedding(vocab_size, 256)

# concatenate image embedding and word embeddings
img_sent_input_layer = MultiIncomingLayer([img_emb_layer, word_emb_layer], axis=1)
model.add(img_sent_input_layer)
model.add(Dropout(0.5))

# LSTM
model.add(LSTM(256, 256, return_sequences=True))
# model.add(Dropout(0.5))
model.add(TimeDistributedDense(256, vocab_size, activation='softmax'))

Besides, I also changed some places in the Sequential class, so that I can use both the CNN feature and a sequence of words as two kinds of inputs:

self.X = self.layers[0].incoming_layers[0].input  # first input of the model
self.Y = self.layers[0].incoming_layers[1].input  # second input of the model

self._train = theano.function([self.X, self.Y], train_loss,
                              updates=updates, allow_input_downcast=True)

stephenjia commented 9 years ago

The problem is that there is an error in the first Dropout layer: AttributeError: 'NoneType' object has no attribute 'shape'. I do not know where this error comes from; or maybe my idea about this implementation is not correct, as you said. (I do not know much about how to debug code when using a Theano-based package.)

patyork commented 9 years ago

AttributeError: 'NoneType' object has no attribute 'shape' in dropout layer

Unless I missed a line of code, you did not call model.compile.

stephenjia commented 9 years ago

@patyork I did call model.compile after adding the TimeDistributedDense layer: model.compile(loss='categorical_crossentropy_3d', optimizer=rmsprop), where I extended categorical_crossentropy to the 3D case. But I think I could use the mean_squared_error loss instead.
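
For reference, a hedged sketch of how such a 3D extension might look in Theano, flattening the time dimension so the standard 2D loss applies per timestep (illustrative only, not my exact code):

import theano.tensor as T

def categorical_crossentropy_3d(y_true, y_pred):
    # y_true, y_pred: (batch, time, vocab_size), y_true one-hot per timestep
    y_true_flat = T.reshape(y_true, (-1, y_true.shape[2]))
    y_pred_flat = T.reshape(y_pred, (-1, y_pred.shape[2]))
    # standard categorical crossentropy per (batch, time) position, averaged
    return T.mean(T.nnet.categorical_crossentropy(y_pred_flat, y_true_flat))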

stephenjia commented 9 years ago

@patyork I notice that even with only the following code:

# image embedding
img_emb_layer = Dense(feat_size, 256)

# word embedding
word_emb_layer = Embedding(vocab_size, 256)

# concatenate image embedding and word embeddings
img_sent_input_layer = MultiIncomingLayer([img_emb_layer, word_emb_layer], axis=1)
model.add(img_sent_input_layer)

I cannot get what I want. I think that is because, just as you said, I have to change the code for the layer class and change the way the layers are connected.

Sorry that I did not make my problem clear and caused you so much trouble.

fchollet commented 9 years ago

Now possible through the use of the Merge layer; see: http://keras.io/layers/core/#merge
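
A minimal sketch of the Merge-based approach, using the Keras API of that era (layer sizes are placeholders; Merge was later removed in favor of the functional merge layers):

from keras.models import Sequential
from keras.layers.core import Dense, Merge

left = Sequential()
left.add(Dense(784, 64))   # Dense(input_dim, output_dim) in the old API

right = Sequential()
right.add(Dense(784, 64))

model = Sequential()
model.add(Merge([left, right], mode='concat'))  # concatenate the branch outputs
model.add(Dense(128, 10))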

stephenjia commented 9 years ago

@fchollet Great, man. That is very helpful. Thanks!

lanzhuzhu commented 8 years ago

I have met the same problem. I want to merge two different input layers into one, and I defined how they are merged. The problem is that I can't put the layer I defined into the model (Sequential or Graph) as the first layer. This situation is really common in practical applications. I found that the only solution is to define the model with the Keras functional API, as sketched below. It is rather complicated, and I am waiting for improvements in Keras to solve the problem.
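
For what it's worth, a minimal functional-API sketch of the merge (Keras 2-style names; all sizes are placeholders):

from keras.layers import Input, Dense, concatenate
from keras.models import Model

a = Input(shape=(16,))
b = Input(shape=(32,))
# merge the two input branches into one
merged = concatenate([Dense(64, activation='relu')(a),
                      Dense(64, activation='relu')(b)])
out = Dense(1)(merged)
model = Model(inputs=[a, b], outputs=out)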

gunashekar commented 8 years ago

Hello, I am new to deep learning and Keras. I don't know if this is the correct place to post my question. I am trying to build an autoencoder with two inputs, as described in https://blog.keras.io/building-autoencoders-in-keras.html. The idea is to build two separate encoders, merge them, and then use the output for the decoder part. But I am not sure how to use the fit function for multiple inputs and a single output; a sketch of what I have in mind follows. Any suggestions or references would be very helpful.
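
A minimal sketch, assuming the Keras 2 functional API (all shapes and sizes are placeholders):

import numpy as np
from keras.layers import Input, Dense, concatenate
from keras.models import Model

in_a = Input(shape=(8,))
in_b = Input(shape=(4,))
# two separate encoders, merged into one code
code = concatenate([Dense(6, activation='relu')(in_a),
                    Dense(6, activation='relu')(in_b)])
decoded = Dense(8, activation='linear')(code)

model = Model(inputs=[in_a, in_b], outputs=decoded)
model.compile(optimizer='adam', loss='mse')

x_a = np.random.rand(100, 8)
x_b = np.random.rand(100, 4)
# one input array per Input layer, a single target array
model.fit([x_a, x_b], x_a, epochs=2, batch_size=16)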

shay86 commented 6 years ago

I built a project using the (Merge, merge) layers once, and used a shared layer in another. The problem is that I can't find any documentation that explains how the merge or shared layers work, the way convolution or max-pooling layers are explained. I would be grateful if anyone could direct me to, or suggest, a paper that helps me understand how these layers work.

gunashekar commented 6 years ago

Hi, can you be a bit clearer and tell me which layers you want to merge? Where are you using the shared layer? And what exactly are you trying to do? Maybe I can help.

Regards- Deepa


shay86 commented 6 years ago

Hi,

Thanks for your kind reply. I built two convolutional neural networks to combine two sequences, one with shape (batch size, None, 23) and the other with shape (batch size, None, 3); note that batch size and None are equal for both sequences within each batch. This is the architecture:

Convolution 1D layer (size=17, depth=256, padding=SAME, activation=prelu, input (batch size, None, 23))
Convolution 1D layer (size=16, depth=128, padding=SAME, activation=prelu)
Convolution 1D layer (size=15, depth=64, padding=SAME, activation=prelu)

Convolution 1D layer (size=17, depth=256, padding=SAME, activation=prelu, input (batch size, None, 3))
Convolution 1D layer (size=16, depth=128, padding=SAME, activation=prelu)
Convolution 1D layer (size=15, depth=64, padding=SAME, activation=prelu)

Merge layer ()
Dropout layer (p=0.5)
Fully connected layer (units=4, activation=prelu)

For the shared layer:

Main input X [batch, None, 23]
Convolution 1D layer (size=24, depth=256, padding=SAME, activation=prelu)
Auxiliary input Y [batch, None, 3]
Merge [input X, input Y]
Convolution 1D layer (size=17, depth=256, padding=SAME, activation=prelu)
Convolution 1D layer (size=16, depth=128, padding=SAME, activation=prelu)
Convolution 1D layer (size=16, depth=64, padding=SAME, activation=prelu)
Fully connected layer (units=4, activation=prelu)


gunashekar commented 6 years ago

The basic rule for Merge is that the outputs of the layers you want to merge should have the same dimensions (assuming you already know this). Then you specify the mode (https://keras.io/layers/merge/), and based on the mode you set the axis.

An example would be: output = Merge([input1, input2], mode='concat', concat_axis=-1)

Hope this helps.

Regards- Deepa


shay86 commented 6 years ago

Hi, your answer helps a lot, but can you provide me with any source, document, or paper that explains the merge layer further? As far as I know, there are Merge and merge; one of them combines tensors and the other combines weights. How do they do that? What does their architecture look like, or what is the mathematical formula for them? I have searched a lot of papers and none of them gives me an answer. So I would be grateful if you could direct me to some sources specifically related to the merge layers. My best regards.


gunashekar commented 6 years ago

Hi, I haven't come across any paper describing the mathematical or architectural details of how the merge layer works. You can figure it out by looking at the source code and by printing the output shapes of your network. (This helped me a lot.)

The Merge layer is used to combine the outputs of two Sequential models. (This worked when I tried to combine the outputs of two sequential CNNs for implementing a GAN.)

And the merge layer is used to merge the outputs of two layers in the functional API. (I used this to combine the output of a max-pooling layer with an upsampling layer within the same network; ref: the U-Net architecture.)
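
A minimal sketch of that skip-connection pattern (Keras 2 functional API; layer sizes are placeholders, not an actual U-Net):

from keras.layers import Input, Conv2D, MaxPooling2D, UpSampling2D, concatenate
from keras.models import Model

x = Input(shape=(64, 64, 1))
c1 = Conv2D(16, 3, padding='same', activation='relu')(x)
p1 = MaxPooling2D(2)(c1)
c2 = Conv2D(32, 3, padding='same', activation='relu')(p1)
u1 = UpSampling2D(2)(c2)
skip = concatenate([u1, c1])   # merge upsampled features with the earlier ones
out = Conv2D(1, 3, padding='same')(skip)
model = Model(x, out)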

A better example and a clearer explanation can be found here: https://github.com/fchollet/keras/issues/3921

PS: I worked with an older version of Keras, and its documentation had very good examples. Things seem to have changed a lot now, so you can try looking for the older documentation. You can also look at the Inception architecture in the Keras documentation and the U-Net architecture at https://github.com/orobix/retina-unet for examples of how to implement it.

Hope this helps.

Regards- Deepa


shay86 commented 6 years ago

Hi,

That is good. I will try to do as you say, and I will study the links you sent.

Thanks a lot!


MehaBala commented 6 years ago

I am looking for a recommendation on how to merge vector information with CNN output in a regression model.
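
A hedged sketch of one way this might look with the functional API (all shapes and sizes are placeholders):

from keras.layers import Input, Conv2D, Flatten, Dense, concatenate
from keras.models import Model

img = Input(shape=(32, 32, 3))               # image branch
vec = Input(shape=(10,))                     # auxiliary vector branch
feat = Flatten()(Conv2D(16, 3, activation='relu')(img))
merged = concatenate([feat, vec])            # join CNN features with the vector
out = Dense(1, activation='linear')(merged)  # regression head
model = Model(inputs=[img, vec], outputs=out)
model.compile(optimizer='adam', loss='mse')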