uchicago-computation-workshop / nicolas_masse

Repository for Nicolas Masse's presentation at the CSS Workshop (1/13/2019)

Uses in Natural Language Processing? And code for the model? #8

Open bhargavvader opened 5 years ago

bhargavvader commented 5 years ago

Hey, this is really cool work - I look forward to your talk! I noticed that the datasets this was trained on were image-processing datasets. Natural Language Processing also has a lot of memory-based issues - we need memory to use language, and it isn't very different for a computer. While LSTMs attempt to address this (and, to be fair, don't do a half bad job), the model you propose could be really interesting to use for language generation. Have you experimented on textual data?

On that note, I wouldn't need to wait to find out - I could experiment on textual data myself. Is the TensorFlow code for your model public, and if so, could you share it?

Thanks!

policyglot commented 5 years ago

Great idea from Bhargav, who is himself a published author in Natural Language Processing. I'd like to expand on his idea of language and refer to one of your earlier papers on mnemonics: https://www.pubfacts.com/detail/28539423/Mnemonic-Encoding-and-Cortical-Organization-in-Parietal-and-Prefrontal-Cortices While your work was on monkeys - no offence to monkey intelligence - I would be curious to learn about the differences between computational methods and humans in how we retrieve stored vocabulary. Are there any areas in which Homo sapiens still outperforms deep learning in recalling the accurate word for a given context?

nmasse commented 5 years ago

The feedforward part of the code lives here https://github.com/nmasse/Context-Dependent-Gating and the recurrent part of the code lives here https://github.com/gdgrant/Context-Dependent-Gating-RNN.

I have not tried this on NLP data sets, but it would be fun to try. Would love to talk more about it!
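For anyone skimming the linked repos, here is a minimal NumPy sketch of the context-dependent gating idea as I understand it from the paper: each task is assigned a fixed random binary mask over the hidden units, so different tasks train mostly non-overlapping sub-networks. All names and the 20% keep fraction below are illustrative assumptions, not taken from the repos themselves:

```python
import numpy as np

rng = np.random.default_rng(0)

n_in, n_hidden, n_tasks = 10, 100, 5
keep_frac = 0.2  # assumed fraction of units left active per task

# One fixed random binary gate vector per task, drawn once and reused
# for every forward/backward pass of that task.
gates = (rng.random((n_tasks, n_hidden)) < keep_frac).astype(float)

def gated_forward(x, W, b, task_id):
    """Hidden layer with context-dependent gating: units outside the
    task's mask are zeroed, so each task uses its own sub-network."""
    h = np.maximum(0.0, x @ W + b)   # ReLU hidden activity
    return h * gates[task_id]        # silence gated-off units

# Toy usage: the same input produces different active subsets per task.
x = rng.random(n_in)
W = rng.standard_normal((n_in, n_hidden)) * 0.1
b = np.zeros(n_hidden)
h_task0 = gated_forward(x, W, b, task_id=0)
h_task1 = gated_forward(x, W, b, task_id=1)
```

Because the masks are fixed and mostly disjoint, gradient updates for one task touch few of the weights another task relies on, which is what reduces catastrophic forgetting (typically combined with a synaptic stabilization penalty in the actual repos).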

nmasse commented 5 years ago

As for the computational methods related to retrieving words, I admit I'm unfamiliar with the literature on this topic. How we relate words to the context in which they appear is something I have not really thought about, but as above, I would love to chat more about it!