shijx12 / TransferNet

PyTorch implementation of EMNLP 2021 paper "TransferNet: An Effective and Transparent Framework for Multi-hop Question Answering over Relation Graph"

Question about inference speed #8

Open mug2mag opened 2 years ago

mug2mag commented 2 years ago

@shijx12 Hi, thanks for sharing this great work. I think the model proposed in your project may have high computational complexity and large memory requirements, because it computes the probability of every entity being activated as the answer entity at each of the "num_step" steps. This would also affect inference speed.
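To make the concern concrete, here is a minimal numpy sketch of the kind of per-step entity-score propagation the question refers to. All sizes, the dense relation-adjacency tensor, and the random relation scores are hypothetical stand-ins, not the repo's actual code; the point is that each hop touches a relation-by-entity-by-entity structure, so time scales with num_steps and memory with the adjacency size.

```python
import numpy as np

# Hypothetical sizes; real datasets in the repo use different values.
num_entities = 200
num_relations = 5
num_steps = 3  # the "num_step" hops discussed above

rng = np.random.default_rng(0)
# A dense relation adjacency tensor (R x E x E) illustrates the cost:
# roughly O(num_steps * R * E^2) time and O(R * E^2) memory per question.
adj = (rng.random((num_relations, num_entities, num_entities)) < 0.01).astype(np.float32)

p = np.zeros(num_entities, dtype=np.float32)
p[0] = 1.0  # start all probability mass on the topic entity

for step in range(num_steps):
    # Stand-in for the model's per-step attention over relations.
    rel_scores = rng.random(num_relations).astype(np.float32)
    # Propagate entity scores one hop: weighted sum over relation adjacencies.
    p = np.einsum("r,rij,i->j", rel_scores, adj, p)
    # Truncate scores back into [0, 1], as activation probabilities.
    p = np.clip(p, 0.0, 1.0)
```

In practice sparse adjacency matrices reduce both costs, but the per-step matrix products remain the dominant factor in inference latency.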

Have you considered the issues mentioned above, and have you measured the response latency for a single question?

Any reply would be appreciated.