intel-analytics / BigDL-Tutorials

Step-by-step Deep Learning Tutorials on Apache Spark using BigDL

Using Word Vectors via the Embedding Layer #52

Closed by vijaydwivedi75 6 years ago

vijaydwivedi75 commented 6 years ago

I want to use an LSTM network on text data.

I have the data in the following form; here is a snapshot:

label            sequence
1.0              0 0 this is an example

How do I prepare the data to feed into the LSTM Sequential() model so that the input has shape batch_size x seq_length x EMBEDDING_DIM? Is there a way, or an example, to do this the way it is done in Keras with the Embedding() layer and pretrained embedding_weights (GloVe weights)? [As done here: https://machinelearningmastery.com/use-word-embedding-layers-deep-learning-keras/]
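For reference, a minimal sketch of the Keras pattern from the linked article; the vocabulary size, sequence length, and the way `embedding_matrix` is filled are illustrative assumptions, not values from this issue:

```python
import numpy as np
from keras.models import Sequential
from keras.layers import Embedding, LSTM, Dense

VOCAB_SIZE = 10000     # assumed vocabulary size
EMBEDDING_DIM = 100    # e.g. glove.6B.100d vectors
SEQ_LENGTH = 50        # padded sequence length

# Assumed to be filled from the GloVe file: row i holds the
# vector for the word whose integer index is i.
embedding_matrix = np.zeros((VOCAB_SIZE, EMBEDDING_DIM))

model = Sequential()
model.add(Embedding(VOCAB_SIZE, EMBEDDING_DIM,
                    weights=[embedding_matrix],   # load pretrained vectors
                    input_length=SEQ_LENGTH,
                    trainable=False))             # freeze them
model.add(LSTM(128))
model.add(Dense(1, activation='sigmoid'))
model.compile(loss='binary_crossentropy', optimizer='adam')

# fit() then takes integer word indices of shape (batch_size, SEQ_LENGTH);
# the Embedding layer expands them to (batch_size, SEQ_LENGTH, EMBEDDING_DIM).
```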

vijaydwivedi75 commented 6 years ago

Okay, I found it. There is a walkthrough here: http://psyyz10.github.io/2017/06/Sentiment/

jason-dai commented 6 years ago

See the complete notebook at https://github.com/intel-analytics/analytics-zoo/blob/master/apps/sentiment-analysis/sentiment.ipynb
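For later readers, here is a rough sketch of the pattern those walkthroughs use with the BigDL Python API: the GloVe lookup happens during preprocessing, so each Sample already carries a seq_length x EMBEDDING_DIM tensor and the model needs no Embedding layer. All names and sizes below are illustrative assumptions, not code copied from the notebook:

```python
import numpy as np
from bigdl.nn.layer import Sequential, Recurrent, LSTM, Select, Linear, LogSoftMax
from bigdl.util.common import Sample

SEQ_LENGTH, EMBEDDING_DIM, HIDDEN_SIZE, CLASS_NUM = 50, 100, 128, 2

def to_sample(tokens, label, glove):
    """tokens: list of words; glove: dict mapping word -> np.array(EMBEDDING_DIM)."""
    vecs = [glove.get(w, np.zeros(EMBEDDING_DIM)) for w in tokens[:SEQ_LENGTH]]
    vecs += [np.zeros(EMBEDDING_DIM)] * (SEQ_LENGTH - len(vecs))  # zero-pad
    # BigDL class labels are 1-based (e.g. 1.0 / 2.0 for two classes).
    return Sample.from_ndarray(np.array(vecs, dtype='float32'),
                               np.array(float(label)))

model = Sequential()
model.add(Recurrent().add(LSTM(EMBEDDING_DIM, HIDDEN_SIZE)))
model.add(Select(2, -1))                  # keep only the last time step
model.add(Linear(HIDDEN_SIZE, CLASS_NUM))
model.add(LogSoftMax())
```

Mapping words to GloVe vectors on the Spark side keeps the model itself simple; the trade-off is that the embeddings stay fixed, whereas a trainable lookup layer would let them be fine-tuned during training.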