**Closed** — 14H034160212 closed this issue 2 years ago
I fixed the issue by making the following update after line 81: https://github.com/nuric/deeplogic/blob/master/models/ima.py#L81
I got the idea from this earlier commit: https://github.com/nuric/deeplogic/commit/e8bf8057b9ace1ab68425a1bfc150aa1045a8c17#diff-016d6e832b13ebc815d53c6aff0f0fc43517a668bcb426ccd94bd9edc12e0410L149
```python
# Unify every rule and weighted sum based on attention
new_states = unifier(embedded_ctx_preds, initial_state=[state])
# new_states has shape (?, rules, dim)
new_state = dot11([sim_vec, new_states])
# Apply gating
gate = gating(state)
outs.append(gate)
new_state = gate2([state, new_state, gate])
state = new_state
```
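For reference, here is a minimal NumPy sketch of what I understand this update to compute. This is my reading of the excerpt, not the repo's actual layers: I assume `dot11` is an attention-weighted sum over the rules axis, `gating` is a dense layer with a sigmoid producing a scalar gate, and `gate2([a, b, g])` interpolates as `g*a + (1-g)*b`; the weight matrix `W` is hypothetical.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Hypothetical shapes: batch=1, rules=3, dim=4
rng = np.random.default_rng(0)
state = rng.normal(size=(1, 4))          # current state, (?, dim)
new_states = rng.normal(size=(1, 3, 4))  # unified states, (?, rules, dim)
sim_vec = np.array([[0.2, 0.5, 0.3]])    # attention over rules, (?, rules)

# dot11: attention-weighted sum over the rules axis -> (?, dim)
new_state = np.einsum('br,brd->bd', sim_vec, new_states)

# gating(state): assumed dense layer + sigmoid -> scalar gate per batch item
W = rng.normal(size=(4, 1))              # hypothetical gate weights
gate = sigmoid(state @ W)                # (?, 1), values in (0, 1)

# gate2([state, new_state, gate]): assumed interpolation g*old + (1-g)*new
state = gate * state + (1.0 - gate) * new_state
```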
I also tried another way of adding the gating, which trained to convergence in my case.
```python
# Unify every rule and weighted sum based on attention
new_states = unifier(embedded_ctx_preds, initial_state=[state])
# new_states has shape (?, rules, dim)
new_state = dot11([sim_vec, new_states])
# Apply gating
gate = gating(new_state)
outs.append(gate)
new_state = gate2([new_state, state, gate])
state = new_state
```
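Under the same reading as above (a hypothetical `gate2([a, b, g]) = g*a + (1-g)*b` interpolation with a dense-plus-sigmoid gate), the only differences in this variant are that the gate is computed from the candidate state rather than the old one, and the operand order is swapped, so the gate now weights the candidate:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(1)
state = rng.normal(size=(1, 4))          # current state, (?, dim)
new_states = rng.normal(size=(1, 3, 4))  # unified states, (?, rules, dim)
sim_vec = np.array([[0.6, 0.1, 0.3]])    # attention over rules, (?, rules)

# Attention-weighted sum over the rules axis -> (?, dim)
new_state = np.einsum('br,brd->bd', sim_vec, new_states)

# Gate is now computed from the candidate state, not the old one
W = rng.normal(size=(4, 1))              # hypothetical gate weights
gate = sigmoid(new_state @ W)            # (?, 1)

# Swapped operand order: the gate weights the candidate state
state = gate * new_state + (1.0 - gate) * state
```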
More related work can be found at this link: https://github.com/Strong-AI-Lab/A-Neural-Symbolic-Paradigm
Hi,
I am running into an incompatible-layer problem when I use the gate2 function. Oddly, if I change the code in this way, the bug disappears; but then new_states might not be updated. Do you have any idea how to solve this? Thanks a lot. Here is the bug screenshot.