davidsvaughn opened this issue 7 years ago
I implemented a simplified Keras version here. Training is also very slow. I found the slowest strategy is Maxpooling-Matching, where "each forward (or backward) contextual embedding is compared with every forward (or backward) contextual embedding of the other sentence". You can check this yourself by commenting out each strategy in multi_perspective.py.
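For reference, the Maxpooling-Matching step described above can be sketched roughly as follows. This is a minimal NumPy sketch of the idea only: it ignores the per-perspective weight vectors used in the actual model, and all names and shapes are illustrative. The nested loop over every timestep pair is what makes this strategy expensive.

```python
import numpy as np

def cosine(a, b, eps=1e-8):
    # Cosine similarity between two vectors, guarded against zero norms.
    return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + eps)

def maxpool_match(P, Q):
    """For each contextual embedding in P, compare against EVERY
    contextual embedding in Q, then max-pool over Q's timesteps."""
    # P: (len_p, d), Q: (len_q, d) contextual embeddings
    sims = np.array([[cosine(p, q) for q in Q] for p in P])  # (len_p, len_q)
    return sims.max(axis=1)  # one matching value per timestep of P

P = np.random.rand(5, 8)   # toy "sentence 1" contextual embeddings
Q = np.random.rand(7, 8)   # toy "sentence 2" contextual embeddings
m = maxpool_match(P, Q)
assert m.shape == (5,)
```

The cost is O(len_p x len_q x d) per sentence pair (per perspective and per direction), which is consistent with this strategy dominating the training time.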
In my Keras version, the warning is caused by tf.gather(), but I don't think that is the reason for the slowness.
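On the tf.gather() point: the gradient of a gather is sparse (only the gathered rows receive any gradient), which TensorFlow represents as IndexedSlices; when TensorFlow has to densify that sparse gradient it emits a warning, which is likely what is being seen here. A minimal NumPy sketch of the underlying idea, with hypothetical names:

```python
import numpy as np

# If y = E[idx] (a gather), then dL/dE is nonzero only at the gathered
# rows. Frameworks keep this as sparse IndexedSlices; converting it to a
# dense tensor of the full table's shape is what the warning refers to.
E = np.zeros((10, 4))           # embedding table
idx = np.array([2, 7, 2])       # gathered rows (row 2 gathered twice)
grad_y = np.ones((3, 4))        # upstream gradient for the gathered rows
grad_E = np.zeros_like(E)
np.add.at(grad_E, idx, grad_y)  # scatter-add: dense equivalent of IndexedSlices
assert (grad_E[2] == 2).all()   # row 2 accumulates two contributions
assert (grad_E[7] == 1).all()
assert grad_E[[0, 1, 3]].sum() == 0  # untouched rows get zero gradient
```

This would explain why the warning appears without necessarily being the performance bottleneck: the densification costs memory, but the matching strategies dominate compute.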
@davidsvaughn Hi, I noticed you said the training is very slow. Could you tell me roughly how long one epoch takes? Thank you.
How can this problem be solved?
Training works, but it seems very slow... which is fine, as long as this is the expected behavior. I'm just curious whether it is unusually slow for me. Did you happen to get this warning?
If not, then I'm wondering if my code is running slowly because of this message. So far, I know it's coming from the tf.gradients() function call in line 220 of SentenceMatchModelGraph.py. Unfortunately, this doesn't help much because I still don't know which part of the network is triggering it.
If you tell me that you are not getting this message, then I will investigate further and try to find the root cause. Thanks!
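One generic way to narrow down which part of the network is responsible (a sketch only, not specific to this repo) is to time each candidate step in isolation with a small context-manager helper:

```python
import time
from contextlib import contextmanager

@contextmanager
def timed(label):
    # Simple wall-clock timer to isolate which step dominates.
    start = time.perf_counter()
    yield
    print(f"{label}: {time.perf_counter() - start:.4f}s")

# Hypothetical usage: wrap each matching strategy (or each session.run)
# to see which one dominates a training step.
with timed("maxpool-matching"):
    sum(i * i for i in range(100_000))  # stand-in for a strategy's forward pass
```

Combined with commenting out one strategy at a time, this should make the slow path obvious.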