lucko515 / chatbot-startkit

This repository holds files for a simple chatbot written in TensorFlow 1.4, with an attention mechanism and bucketing.

Indentation Error #5

Open sourcecode369 opened 5 years ago

suvidh commented 5 years ago

I get this indentation error (screenshot attached):

sourcecode369 commented 5 years ago

Yes, I've got the same error. I guess there is some problem with this implementation, so I had to reimplement it all on my own.

vishal2612200 commented 5 years ago

This error is generated by unwanted leading whitespace in some of the comments. Reduce the leading indentation of every comment (docstring) to 4 spaces and your problem will be solved:

```python
import tensorflow as tf


def encoder(inputs, rnn_size, number_of_layers, encoder_seq_len, keep_probs, encoder_embed_size, encoder_vocab_size):
    '''
    Used to define the encoder of the seq2seq model (the encoder is made of a simple dynamic RNN network).

    Inputs:
        inputs - batch of input sequences (word ids) fed to the encoder
        rnn_size - number of units in the RNN layer
        number_of_layers - number of RNN layers that the model uses
        encoder_seq_len - vector of sequence lengths (got from a placeholder)
        keep_probs - dropout keep probability
        encoder_embed_size - size of the embedding vector for the encoder part
        encoder_vocab_size - number of different words that the model uses in a vocabulary

    Outputs:
        encoder_outputs - outputs of the dynamic RNN
        encoder_states - internal states from the RNN layer(s)
    '''
    def cell(units, rate):
        layer = tf.contrib.rnn.BasicLSTMCell(units)
        return tf.contrib.rnn.DropoutWrapper(layer, rate)

    encoder_cell = tf.contrib.rnn.MultiRNNCell([cell(rnn_size, keep_probs) for _ in range(number_of_layers)])

    # Embedding layer for the encoder inputs
    encoder_embedings = tf.contrib.layers.embed_sequence(inputs, encoder_vocab_size, encoder_embed_size)

    encoder_outputs, encoder_states = tf.nn.dynamic_rnn(encoder_cell, encoder_embedings, encoder_seq_len, dtype=tf.float32)

    return encoder_outputs, encoder_states
```

This function has 4 spaces of indentation at the start of its docstring; make the docstrings of the other functions match it.
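For anyone unsure why a mis-indented comment causes this, here is a minimal sketch of the failure mode (the function names `broken` and `fixed` are made up for illustration; they are not from this repository):

```python
# A docstring is the first statement of a function body, so it must be
# indented like the rest of the body. Flush with the left margin, it
# breaks parsing:
#
#     def broken(x):
#     '''docstring at column 0'''
#         return x
#
#     IndentationError: expected an indented block
#
# Indenting the docstring by the same 4 spaces as the body fixes it:
def fixed(x):
    '''docstring indented 4 spaces, same as the body'''
    return x

print(fixed(1))  # prints 1
```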

vishal2612200 commented 5 years ago

@lucko515 you can close this now; it has been solved.