kimhc6028 / relational-networks

Pytorch implementation of "A simple neural network module for relational reasoning" (Relational Networks)
https://arxiv.org/pdf/1706.01427.pdf
BSD 3-Clause "New" or "Revised" License

Text relational preprocessing #2

Closed alphamupsiomega closed 7 years ago

alphamupsiomega commented 7 years ago

How would one preprocess the bAbI tasks text into supposed objects and relations for loading into this training model?

The original paper does not seem entirely clear to me:

> **Dealing with natural language** For the bAbI suite of tasks the natural language inputs must be transformed into a set of objects. This is a distinctly different requirement from visual QA, where objects were defined as spatially distinct regions in convolved feature maps. So, we first identified up to 20 sentences in the support set that were immediately prior to the probe question. Then, we tagged these sentences with labels indicating their relative position in the support set, and processed each sentence word-by-word with an LSTM (with the same LSTM acting on each sentence independently).
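As far as I can tell, the object-construction step before the LSTM encoding would look something like the sketch below. This is just my reading of the paragraph, not code from this repository: `build_support_set` and its parameters are names I made up, and the whitespace tokenizer is a placeholder for whatever tokenization the authors actually used.

```python
def build_support_set(story_sentences, question_index, max_support=20):
    """Collect up to `max_support` sentences immediately preceding the
    probe question and tag each with its relative position in the
    support set. Each tagged, tokenized sentence would then be fed
    word-by-word through a shared LSTM to produce one object vector.

    Illustrative helper, not part of this repository.
    """
    # Sentences immediately prior to the probe question, oldest first
    start = max(0, question_index - max_support)
    support = story_sentences[start:question_index]
    # Tag each sentence with its relative position in the support set
    return [(pos, sentence.lower().rstrip(".").split())
            for pos, sentence in enumerate(support)]


story = [
    "Mary moved to the bathroom.",
    "John went to the hallway.",
    "Daniel travelled to the office.",
]
# The probe question follows the third sentence
objects = build_support_set(story, question_index=3)
# e.g. objects[0] == (0, ["mary", "moved", "to", "the", "bathroom"])
```

Each `(position, tokens)` pair would become one "object" for the RN: the LSTM's final hidden state for the sentence, with the position label attached (e.g. concatenated or embedded).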

kimhc6028 commented 7 years ago

Currently I am not implementing the bAbI task, only the Sort-of-CLEVR task, which does not require natural language processing. Frankly speaking, I know little about the bAbI task; I haven't even seen the actual dataset. Sorry for the poor answer.