JamieBali / MRSCC

Implementation of multiple approaches to the Microsoft Research Sentence Completion Challenge, including a suggestion for a new, novel solution.

Hopfield Neural Network #2

Closed JamieBali closed 2 years ago

JamieBali commented 2 years ago

Each neuron in an HNN (fully recurrent neural network) could represent a word (excluding stop words, punctuation, etc.), and the co-occurrence of words in sentences could lead to strengthened connections between those words.

This would technically function similarly to an n-gram, but would be fuzzier. The question is whether a fuzzy HNN grants better accuracy, given that it has the capacity to look at the whole sentence at once but ignores word order.
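A minimal sketch of the idea, assuming a Hebbian-style update where every pair of non-stop words appearing in the same sentence strengthens the connection between them, and a candidate is scored by summing its connection strengths to the context words (the stop list, toy corpus, and function names here are illustrative, not the actual implementation):

```python
from collections import defaultdict

# Hypothetical stop-word list for the sketch.
STOP_WORDS = {"the", "a", "an", "on", "in", "of", "and"}

def build_weights(sentences):
    """Hebbian-style learning: strengthen the symmetric connection between
    every pair of distinct non-stop words that co-occur in a sentence."""
    weights = defaultdict(float)
    for sentence in sentences:
        words = [w for w in sentence.lower().split() if w not in STOP_WORDS]
        for i, a in enumerate(words):
            for b in words[i + 1:]:
                if a != b:
                    weights[(a, b)] += 1.0
                    weights[(b, a)] += 1.0
    return weights

def score(candidate, context, weights):
    """Sum of connection strengths from the candidate to each context word.
    Order is ignored, as in a Hopfield-style network."""
    return sum(weights[(candidate, w)] for w in context)
```

On the example question, a candidate like "cat" would accumulate weight from having co-occurred with "sat" and "mat" in training sentences, while "sky" would not.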

JamieBali commented 2 years ago

eg. Question:

_START The ____ sat on the mat. _END

a) dog b) cat c) sky d) man

Even a bigram would look at this and not be able to figure out a reasonable answer. A trigram would probably pick sky or man as the answer. A HNN could be taught within a range to get better data, i.e. using a range of 3 words either side of the blank.

Smoothing would be required.
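One way the range and the smoothing could fit together is a windowed co-occurrence count with add-alpha (Laplace) smoothing, so that pairs never seen in training still get a small nonzero weight instead of zeroing out a candidate entirely. This is a sketch under those assumptions, not the proposed system itself:

```python
from collections import defaultdict

def windowed_counts(sentences, window=3):
    """Count co-occurrences only within +/- `window` positions, so only
    nearby words contribute, matching the '3 words either side' idea."""
    counts = defaultdict(int)
    vocab = set()
    for sentence in sentences:
        words = sentence.lower().split()
        vocab.update(words)
        for i, a in enumerate(words):
            neighbours = words[max(0, i - window):i] + words[i + 1:i + 1 + window]
            for b in neighbours:
                counts[(a, b)] += 1
    return counts, vocab

def smoothed_score(candidate, context, counts, vocab, alpha=1.0):
    """Add-alpha smoothing: every pair gets `alpha` pseudo-counts, so an
    unseen pair contributes a small probability rather than zero."""
    total = sum(counts[(candidate, w)] for w in vocab) + alpha * len(vocab)
    prob = 1.0
    for w in context:
        prob *= (counts[(candidate, w)] + alpha) / total
    return prob
```

With smoothing, even a candidate with no observed co-occurrences gets a comparable (but lower) score rather than being eliminated outright.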

JamieBali commented 2 years ago

This is a good idea for a system which I will suggest, but I don't think it'll really be all that accurate. It's basically just an n-gram that ignores order in exchange for more context. It'll possibly be more accurate than an n-gram with less processing, but not significantly. I will write about it in the report, however.