IndicoDataSolutions / Passage

A little library for text analysis with RNNs.
MIT License

Use as a Tagging Model? #13

Open simonhughes22 opened 9 years ago

simonhughes22 commented 9 years ago

This is more of a feature request.

I'd love to try to use an LSTM model as a tagging model. I have tagged words for my training data (not POS tags or any other common NLP tagging problem), and the previous tags can influence the current word's tags. Is it possible to use this library as a word tagger? Right now it looks like it trains on an entire document, sequentially, but with one target label per document.

jacobmenick commented 9 years ago

I'd like to piggyback on this question. It's unclear how to do multiclass classification from the example given. Presumably you change the activation to 'softmax', change the cost to 'CategoricalCrossEntropy', and change the size of the output 'Dense' layer to the number of classes.
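Roughly, based on the README example, I'd expect that to look something like the sketch below (the toy data, layer sizes, tokenizer settings, and the one-hot target shape are all placeholder assumptions, not something the library documents for this case):

```python
import numpy as np

from passage.preprocessing import Tokenizer
from passage.layers import Embedding, GatedRecurrent, Dense
from passage.models import RNN

# Toy placeholder data: a few documents and a class id per document.
train_text = ["some example document", "another document", "a third one"]
class_ids = [0, 2, 1]
n_classes = 3

# One-hot targets (assumption: fit() may expect targets shaped (n_examples, n_classes)).
train_labels = np.eye(n_classes)[class_ids]

tokenizer = Tokenizer(min_df=1)
train_tokens = tokenizer.fit_transform(train_text)

layers = [
    Embedding(size=128, n_features=tokenizer.n_features),
    GatedRecurrent(size=128),
    Dense(size=n_classes, activation='softmax'),  # one output unit per class
]

# Swap the README's 'BinaryCrossEntropy' for 'CategoricalCrossEntropy'.
model = RNN(layers=layers, cost='CategoricalCrossEntropy')
model.fit(train_tokens, train_labels)
```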

Another question: is the example predicting a binary random variable at every step in the sequence, or is a single binary random variable predicted from the sequence as a whole? I.e., how many output nodes are there, and how are they connected to the hidden layer in the example?

Thanks, Jacob

Slater-Victoroff commented 9 years ago

@Newmu Bumping this up on your radar.

Apologies for the slow responses; Alec has been pretty underwater for the past couple of weeks.

We're always thrilled to see more examples. Changing to softmax + categorical cross-entropy should work (trying it out on the blogger dataset would make a good example, imo). Let us know if you run into any issues, but we'd love to see a PR for a multiclass classification example if you wouldn't mind contributing one.

youralien commented 9 years ago

I added a multi-class classification example using the blogger gender data set here: https://github.com/IndicoDataSolutions/Passage/pull/39. This addresses the multi-class case with a single output at the end of the sequence.

However, I think the tagging model that @simonhughes22 is really talking about is a classifier at each step in the RNN. This problem is more akin to a Neural Language Model: the similarity is that the softmax emitted at each step is predicting a label for that step (in a language model, that label is the next word).
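To make the distinction concrete, here is a small illustrative sketch of the shape difference (plain NumPy, not Passage's API; all names and sizes are made up for illustration):

```python
import numpy as np

n_steps, hidden_size, n_classes = 10, 128, 5
hidden_states = np.random.randn(n_steps, hidden_size)  # one RNN hidden vector per word
W = np.random.randn(hidden_size, n_classes) * 0.01     # shared output projection

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

# Sequence classification (what the current example does): one softmax over the
# final hidden state -> a single label for the whole document.
doc_probs = softmax(hidden_states[-1] @ W)   # shape: (n_classes,)

# Tagging / language-model style: a softmax at every step -> one label per word.
tag_probs = softmax(hidden_states @ W)       # shape: (n_steps, n_classes)
```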

This sequence-output support is being discussed in a similar, but separate, issue: https://github.com/IndicoDataSolutions/Passage/issues/5