
Training charRNN model for ml5js

Training a charRNN and using the model in ml5js

Multi-layer recurrent neural networks (LSTM, RNN) for character-level language models, written in Python with TensorFlow and modified to work with TensorFlow.js and ml5js.

Based on char-rnn-tensorflow.

Requirements

Usage

Collect data

RNNs work well when you want to predict sequences or patterns from your inputs. Try to gather as much input text data as you can; the more, the better. Compile all of the text data into a single text file and make note of where that file is stored (its path) on your computer.

(A quick tip to concatenate many small disparate .txt files into one large training file: ls *.txt | xargs -L 1 cat >> input.txt)
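If you're not working in a Unix-like shell, the same concatenation can be done with a short Python script. This is a sketch, not part of the repo; the directory, the `*.txt` pattern, and the `input.txt` output name are placeholders matching the tip above:

```python
from pathlib import Path

def concat_txt_files(src_dir, out_path):
    """Concatenate every .txt file in src_dir into a single training file."""
    out = Path(out_path)
    with out.open("w", encoding="utf-8") as dst:
        for txt in sorted(Path(src_dir).glob("*.txt")):
            if txt.resolve() == out.resolve():
                continue  # skip the output file if it lives in the same folder
            dst.write(txt.read_text(encoding="utf-8"))
            dst.write("\n")  # keep a line break between source files

if __name__ == "__main__":
    concat_txt_files(".", "input.txt")
```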

Set-up Python Environment

This first step, setting up a Python "virtual environment" (see the venv video tutorial), is recommended but not required.

$ python3 -m venv your_venv_name
$ source your_venv_name/bin/activate

Train model

Note: you can also download this repo as a ZIP archive as an alternative to using git clone.

$ git clone https://github.com/ml5js/training-charRNN
$ cd training-charRNN
$ pip install -r requirements.txt
$ python train.py --data_path /path/to/data/file.txt

Optionally, you can specify hyperparameters depending on your training set, the size of your data, and so on:

python train.py --data_path ./data \
--rnn_size 128 \
--num_layers 2 \
--seq_length 50 \
--batch_size 50 \
--num_epochs 50 \
--save_checkpoints ./checkpoints \
--save_model ./models

When training is complete, a JavaScript-compatible version of your model will be available in a folder called ./models (unless you specified a different path).

Once the model is ready, you just need to point to it in your ml5 sketch. For more details, visit the charRNN() documentation.

const charRNN = new ml5.charRNN('./models/your_new_model');
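Once loaded, you can ask the model for new text. Here is a minimal browser sketch of generating from the model, assuming ml5 is loaded on the page, the model path from above, and the generate() options (seed string, output length, temperature) described in the ml5 charRNN documentation:

```javascript
// Load the trained model (the path is the folder produced by train.py).
const charRNN = new ml5.charRNN('./models/your_new_model', modelLoaded);

function modelLoaded() {
  // Generate 100 characters, seeded with some starting text.
  charRNN.generate({ seed: 'The ', length: 100, temperature: 0.5 }, (err, result) => {
    if (err) {
      console.error(err);
      return;
    }
    console.log(result.sample); // the generated text
  });
}
```

Lower temperatures make the output more conservative; higher temperatures make it more surprising (and more error-prone).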

That's it!

Hyperparameters

Depending on the size of your training dataset, here are some hyperparameters that might work:

Note: output_keep_prob 0.75 is equivalent to a dropout probability of 0.25.
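In other words, output_keep_prob is the probability that a unit's activation is kept, so the dropout rate is simply its complement:

```python
# output_keep_prob is the probability an activation is *kept* during training.
output_keep_prob = 0.75
# The dropout rate is the probability an activation is *dropped*.
dropout_rate = 1 - output_keep_prob
print(dropout_rate)  # 0.25
```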

Additional resources