richliao / textClassifier

Text classifier for Hierarchical Attention Networks for Document Classification
Apache License 2.0

Getting error on TimeDistributed() #48

Open · deepu0991 opened this issue 1 year ago

deepu0991 commented 1 year ago

Hello, I am getting the following error on the TimeDistributed() layer while running the code in Colab. Please help me.

WARNING:tensorflow:The following Variables were used in a Lambda layer's call (tf.nn.bias_add), but are not present in its tracked objects: <tf.Variable 'att_layer_2/b:0' shape=(100,) dtype=float32>. This is a strong indication that the Lambda layer should be rewritten as a subclassed Layer.

NotImplementedError                       Traceback (most recent call last)
in
      6
      7 review_input = Input(shape=(MAX_SENTS, MAX_SENT_LENGTH), dtype='int32')
----> 8 review_encoder = TimeDistributed(sentEncoder)(review_input)
      9 l_lstm_sent = Bidirectional(GRU(100, return_sequences=True))(review_encoder)
     10 l_att_sent = AttLayer(100)(l_lstm_sent)

1 frames
/usr/local/lib/python3.7/dist-packages/keras/engine/base_layer.py in compute_output_shape(self, input_shape)
    840       raise NotImplementedError(
    841           'Please run in eager mode or implement the `compute_output_shape` '
--> 842           'method on your layer (%s).' % self.__class__.__name__)
    843
    844   @doc_controls.for_subclass_implementers

NotImplementedError: Exception encountered when calling layer "time_distributed" (type TimeDistributed).

Please run in eager mode or implement the `compute_output_shape` method on your layer (TFOpLambda).

Call arguments received:
  • inputs=tf.Tensor(shape=(None, 15, 100), dtype=int32)
  • training=None
  • mask=None
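For reference, the error message asks either to run in eager mode or to implement `compute_output_shape` on the custom layer. Below is a minimal sketch, not the repository's exact code, of an attention layer written against `tensorflow.keras` with `compute_output_shape` implemented. The names `AttLayer`, `attention_dim`, `W`, `b`, and `u` follow the naming in the traceback and in textClassifierHATT.py; the layer body itself is an assumed, generic attention-pooling implementation.

```python
# Minimal sketch (assumptions, not the repository's exact code): a subclassed
# Keras attention layer with compute_output_shape, as the error message suggests.
import tensorflow as tf
from tensorflow.keras import backend as K
from tensorflow.keras.layers import Layer


class AttLayer(Layer):
    def __init__(self, attention_dim=100, **kwargs):
        super().__init__(**kwargs)
        self.attention_dim = attention_dim

    def build(self, input_shape):
        # input_shape: (batch, timesteps, features)
        self.W = self.add_weight(name='W',
                                 shape=(input_shape[-1], self.attention_dim),
                                 initializer='glorot_uniform', trainable=True)
        self.b = self.add_weight(name='b',
                                 shape=(self.attention_dim,),
                                 initializer='zeros', trainable=True)
        self.u = self.add_weight(name='u',
                                 shape=(self.attention_dim, 1),
                                 initializer='glorot_uniform', trainable=True)
        super().build(input_shape)

    def call(self, x):
        # Score each timestep, softmax over time, return the weighted sum.
        uit = K.tanh(K.bias_add(K.dot(x, self.W), self.b))
        ait = K.squeeze(K.dot(uit, self.u), axis=-1)
        a = K.softmax(ait)
        return K.sum(x * K.expand_dims(a), axis=-1 + 0 if False else -1) if False else K.sum(x * K.expand_dims(a), axis=1)

    def compute_output_shape(self, input_shape):
        # The time dimension is pooled away: (batch, features)
        return (input_shape[0], input_shape[-1])
```

With `compute_output_shape` defined on the custom layer, `TimeDistributed` can infer output shapes in graph mode instead of failing as above; whether this alone resolves the error in this notebook is not confirmed here.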