allenai / deep_qa

A deep NLP library, based on Keras / tf, focused on question answering (but useful for other NLP too)
Apache License 2.0

Remove memory networks #389

Closed by DeNeutoy 7 years ago

DeNeutoy commented 7 years ago

There are a few more things that I could remove - e.g., some fairly niche memory-network-specific layers that could probably go too, such as the memory_updaters and knowledge_selectors.

matt-gardner commented 7 years ago

I would also remove the niche layers - there's no reason to keep them around, and a lot of their functionality has now been implemented in better ways (e.g., VectorMatrixSplit, Attention, etc.). It'd be pretty trivial to re-implement a memory network (or any of the variants deleted here) in our current code, and it would be much better designed than what you're deleting.
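For illustration, a minimal sketch of how a single memory-network hop can be composed from generic attention operations (assuming tf.keras; the function name memory_hop and the additive update rule are illustrative, not deep_qa's actual API - deep_qa's memory_updaters offered several update variants):

```python
import tensorflow as tf

def memory_hop(query, memory):
    """One memory-network hop: attend over the memory with the query,
    take the attention-weighted sum, and update the query.

    query:  (batch, dim)
    memory: (batch, num_memories, dim)
    """
    # Dot-product attention of the query against each memory slot.
    scores = tf.einsum('bd,bkd->bk', query, memory)
    weights = tf.nn.softmax(scores, axis=-1)         # (batch, num_memories)
    # Attention-weighted read of the memory.
    read = tf.einsum('bk,bkd->bd', weights, memory)  # (batch, dim)
    # Simple additive update; a dense layer over [query; read] is
    # another common choice.
    return query + read
```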

matt-gardner commented 7 years ago

Just looking over your branch for files that should also be included in this removal: all layers/knowledge_* files, layers/memory_updaters.py, layers/top_knowledge_selector.py, and layers/entailment_models/encoded_sentence.py.

The whole layers/entailment_models/ module should probably also go, but we can do that in a separate PR. I think we should re-implement the decomposable attention piece as a series of layers instead of as a single layer - it would make the code easier to understand, and we have already implemented basically all of the necessary layers, so it should be pretty trivial. I could be persuaded otherwise, though...
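As a rough sketch of what "a series of layers" could look like (assuming tf.keras; the layer sizes, the 3-way output, and all names here are illustrative rather than deep_qa's implementation), the attend / compare / aggregate steps of Parikh et al. (2016) decompose naturally:

```python
import tensorflow as tf
from tensorflow.keras import layers

def decomposable_attention(premise, hypothesis, hidden_dim=200):
    """premise: (batch, p_len, dim); hypothesis: (batch, h_len, dim)."""
    # Attend: project both sentences (shared weights, as in the paper)
    # and compute token-token similarities.
    attend = layers.Dense(hidden_dim, activation='relu')
    similarities = tf.einsum('bpd,bhd->bph',
                             attend(premise), attend(hypothesis))
    # Soft-align each premise token to the hypothesis, and vice versa.
    p_aligned = tf.einsum('bph,bhd->bpd',
                          tf.nn.softmax(similarities, axis=2), hypothesis)
    h_aligned = tf.einsum('bph,bpd->bhd',
                          tf.nn.softmax(similarities, axis=1), premise)
    # Compare: combine each token with its aligned counterpart.
    compare = layers.Dense(hidden_dim, activation='relu')
    v_p = compare(tf.concat([premise, p_aligned], axis=-1))
    v_h = compare(tf.concat([hypothesis, h_aligned], axis=-1))
    # Aggregate: pool over tokens and classify (3-way entailment).
    classify = layers.Dense(3, activation='softmax')
    return classify(tf.concat([tf.reduce_sum(v_p, axis=1),
                               tf.reduce_sum(v_h, axis=1)], axis=-1))
```

Each step here is an ordinary layer operation, which is the point of splitting the monolithic layer apart: the attend, compare, and aggregate stages can then be inspected and reused independently.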

DeNeutoy commented 7 years ago

@matt-gardner I kept the AttentiveGRU because it's fairly general and not actually tied to the memory network stuff, but apart from that, this is ready.
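For context on why it generalizes: the core idea of an attentive GRU (from the dynamic memory network line of work) is that a per-timestep attention weight stands in for the GRU's learned update gate. A minimal sketch of that gating step (names and shapes are illustrative, not deep_qa's AttentiveGRU signature):

```python
import tensorflow as tf

def attentive_gru_update(state, candidate, attention):
    """Attention-gated state update: timesteps the attention ignores
    (attention near 0) leave the hidden state unchanged.

    state, candidate: (batch, dim); attention: (batch, 1)
    """
    return attention * candidate + (1.0 - attention) * state
```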