marcotcr / lime

Lime: Explaining the predictions of any machine learning classifier

How to explain a seq2seq model using LIME? #719

Open noobanti opened 1 year ago

noobanti commented 1 year ago

```python
import keras
from keras.layers import (Input, Embedding, CuDNNGRU, Dense, Flatten,
                          TimeDistributed, Activation, dot, concatenate)
from keras.models import Model


class AttentionGRUModel:
    def __init__(self, config):
        # override default tdatlen
        config['tdatlen'] = 50

        self.config = config
        self.tdatvocabsize = config['tdatvocabsize']
        self.comvocabsize = config['comvocabsize']
        self.datlen = config['tdatlen']
        self.comlen = config['comlen']

        self.embdims = 100
        self.recdims = 256

        self.config['num_input'] = 2
        self.config['num_output'] = 1

    def create_model(self):
        # encoder: embed the source-code token sequence and run it through a GRU
        dat_input = Input(shape=(self.datlen,))
        com_input = Input(shape=(self.comlen,))
        ee = Embedding(output_dim=self.embdims, input_dim=self.tdatvocabsize, mask_zero=False)(dat_input)
        # enc = GRU(self.recdims, return_state=True, return_sequences=True, activation='tanh', unroll=True)
        enc = CuDNNGRU(self.recdims, return_state=True, return_sequences=True)
        encout, state_h = enc(ee)

        # decoder: embed the comment prefix and initialise its GRU with the encoder state
        de = Embedding(output_dim=self.embdims, input_dim=self.comvocabsize, mask_zero=False)(com_input)
        # dec = GRU(self.recdims, return_sequences=True, activation='tanh', unroll=True)
        dec = CuDNNGRU(self.recdims, return_sequences=True)
        decout = dec(de, initial_state=state_h)

        # dot-product attention of decoder outputs over encoder outputs
        attn = dot([decout, encout], axes=[2, 2])
        attn = Activation('softmax')(attn)
        context = dot([attn, encout], axes=[2, 1])
        context = concatenate([context, decout])

        # single softmax over the comment vocabulary: the next comment token
        out = TimeDistributed(Dense(self.recdims, activation="tanh"))(context)
        out = Flatten()(out)
        out = Dense(self.comvocabsize, activation="softmax")(out)
        model = Model(inputs=[dat_input, com_input], outputs=out)

        if self.config['multigpu']:
            model = keras.utils.multi_gpu_model(model, gpus=2)

        model.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['accuracy'])
        return self.config, model
```
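
For reference, a minimal sketch of how this class appears to be used, with the config keys inferred from what `__init__` and `create_model` read (the concrete values are placeholders, not from the original post):

```python
config = {
    'tdatvocabsize': 10000,  # placeholder: size of the code-token vocabulary
    'comvocabsize': 5000,    # placeholder: size of the comment vocabulary
    'comlen': 13,            # placeholder: maximum comment length
    'multigpu': False,
}
config, model = AttentionGRUModel(config).create_model()
model.summary()
```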

Input example: `void a(){ printf("hello world") }` → output: `this is a print function`

How can I explain a seq2seq model using LIME? Can you give me an example that explains a seq2seq model with LIME?

Thanks!
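
One possible way to frame this, as a minimal sketch rather than anything LIME provides out of the box for sequence outputs: since `create_model` ends in a single softmax over the comment vocabulary, the prediction at one decoding step is an ordinary multi-class classification, so `LimeTextExplainer` can be used to show which source-code tokens push a particular output word up or down. Everything below that is not in the snippet above is an assumption: `encode_code`, the vocab dicts `tdatvocab` / `comvocab`, the fixed decoder prefix `com_prefix`, and the lengths `datlen` / `comlen` stand in for whatever preprocessing produced the training sequences.

```python
import numpy as np
from lime.lime_text import LimeTextExplainer

# Assumed to already exist from training: `model` (returned by create_model),
# the vocab dicts `tdatvocab` / `comvocab`, and the lengths `datlen` / `comlen`.

def encode_code(tokens, vocab, maxlen):
    """Hypothetical helper: map code tokens to padded integer ids, as in training."""
    ids = [vocab.get(t, 0) for t in tokens][:maxlen]
    return ids + [0] * (maxlen - len(ids))

def make_classifier_fn(model, vocab, maxlen, com_prefix):
    """Build a LIME classifier_fn: list of code strings -> (n, comvocabsize) probabilities."""
    def classifier_fn(texts):
        dat = np.array([encode_code(t.split(), vocab, maxlen) for t in texts])
        # keep the decoder input (the comment generated so far) fixed for every perturbation
        com = np.repeat(com_prefix[np.newaxis, :], len(texts), axis=0)
        return model.predict([dat, com])  # softmax over the comment vocabulary
    return classifier_fn

# Explain why the model emits "print" at the current decoding step.
code_string = 'void a ( ) { printf ( "hello world" ) }'
com_prefix = np.zeros(comlen, dtype='int32')   # e.g. the ids of "<s> this is a", padded to comlen
target_id = comvocab['print']                  # column of the softmax to explain

explainer = LimeTextExplainer(split_expression=r'\s+', bow=False)
exp = explainer.explain_instance(
    code_string,
    make_classifier_fn(model, tdatvocab, datlen, com_prefix),
    labels=(target_id,),
    num_features=10,
    num_samples=1000,
)
print(exp.as_list(label=target_id))   # code tokens that push "print" up or down
```

Explaining a whole generated comment would repeat this once per decoding step, each time with the comment prefix produced so far as `com_prefix` and the newly generated word as `target_id`.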