Implementation of handwriting generation using recurrent neural networks in TensorFlow. Based on the Alex Graves paper (https://arxiv.org/abs/1308.0850).
First you need to download the dataset. This requires registering on the IAM On-Line Handwriting Database page ("Download" section). After registration you will be able to download data/original-xml-part.tar.gz. Unpack it in the repository directory.
python preprocess.py
This script searches the local directory for XML files with handwriting data and does some preprocessing, such as normalizing the data and splitting strokes into lines. As a result it should create a `data` directory with the preprocessed dataset.
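For orientation, here is a minimal sketch of what such preprocessing could look like, assuming the IAM-OnDB XML layout (`Stroke` elements containing `Point` elements with `x`/`y` attributes). The paths, function names, and normalization details are illustrative and not taken from `preprocess.py`.

```python
# Hedged sketch (not the actual preprocess.py): parse stroke XML files and
# normalize coordinates. Assumes the IAM-OnDB XML structure.
import glob
import xml.etree.ElementTree as ET
import numpy as np

def load_strokes(xml_path):
    """Read one XML file and return a list of strokes, each an (N, 2) array of x/y points."""
    root = ET.parse(xml_path).getroot()
    strokes = []
    for stroke in root.iter('Stroke'):
        points = [(float(p.attrib['x']), float(p.attrib['y'])) for p in stroke.iter('Point')]
        if points:
            strokes.append(np.array(points, dtype=np.float32))
    return strokes

def normalize(strokes):
    """Shift points to the origin and scale coordinates to unit standard deviation."""
    all_points = np.concatenate(strokes, axis=0)
    mean, std = all_points.mean(axis=0), all_points.std(axis=0)
    return [(s - mean) / std for s in strokes]

if __name__ == '__main__':
    for path in glob.glob('**/*.xml', recursive=True):
        strokes = load_strokes(path)
        if strokes:
            strokes = normalize(strokes)
            # ... convert to offsets and pen-up flags, split into lines, save under data/
```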
python train.py
This will launch training with default settings (for experimentation, look at the `argparse` options). By default it creates a `summary` directory with separate `experiment` directories for each run. If you want to restore training, provide a path to the experiment you want to continue, like:
python train.py --restore=summary\experiment-0
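Under the hood, resuming like this typically relies on TensorFlow 1.x checkpoints. The snippet below is only a sketch of that mechanism, not the actual `train.py` logic; the directory path and variable names are placeholders.

```python
# Minimal restore sketch, assuming TensorFlow 1.x and standard tf.train.Saver
# checkpoints inside the experiment directory.
import tensorflow as tf

experiment_dir = 'summary/experiment-0'  # path you would pass via --restore

# stand-in for the real model variables
global_step = tf.Variable(0, name='global_step', trainable=False)

saver = tf.train.Saver()
with tf.Session() as sess:
    checkpoint = tf.train.latest_checkpoint(experiment_dir)
    if checkpoint is not None:
        saver.restore(sess, checkpoint)                   # continue the previous run
    else:
        sess.run(tf.global_variables_initializer())       # start fresh
    # ... training loop, periodically calling saver.save(sess, ...)
```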
You can look up losses in the command line or with TensorBoard. Example loss plot:
With default settings, training took about 5 hours (using TensorFlow 1.2 on a GTX 1080).
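The loss curves come from summary events written into the `summary` directory. Below is a minimal sketch of how a loss scalar could be logged with TensorFlow 1.x summaries (the real training code may differ) and then viewed with `tensorboard --logdir=summary`:

```python
# Hedged sketch of TensorFlow 1.x scalar summaries; names and paths are placeholders.
import tensorflow as tf

loss = tf.placeholder(tf.float32, name='loss')      # stand-in for the real loss tensor
loss_summary = tf.summary.scalar('loss', loss)

writer = tf.summary.FileWriter('summary/experiment-0')
with tf.Session() as sess:
    for step in range(3):                            # stand-in training loop
        summary = sess.run(loss_summary, feed_dict={loss: 1.0 / (step + 1)})
        writer.add_summary(summary, global_step=step)
writer.close()
```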
python generate.py --model=path_to_model
When the model is trained you can use the `generate.py` script to test how it works. Without the `--text` argument, the script will ask you in a loop what to generate.
Additional options for generation:
- `--bias` (float) - a higher bias makes the generated handwriting cleaner, so to speak (read the paper for more info; see the sketch after the example commands)
- `--noinfo` - plots only the generated handwriting (without the attention window)
- `--animation` - animates the writing
- `--style` - style of handwriting, an int from 0 to 7 (functionality added thanks to @kristofbc; you can see how each style looks in the `imgs` folder)

python generate.py --noinfo --text="this was generated by computer" --bias=1.
python generate.py --noinfo --animation --text="example of animation " --bias=1.
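For context on `--bias`: in the Graves paper, samples come from a mixture density output, and the bias shrinks the mixture standard deviations and sharpens the mixture weights, trading diversity for legibility. The sketch below illustrates that idea with a simplified diagonal Gaussian mixture (the paper uses bivariate Gaussians with correlation); names and shapes are illustrative and not taken from `generate.py`.

```python
# Sketch of the probability-bias trick behind --bias: with bias b, the mixture
# standard deviations shrink and the mixture weights sharpen, so sampled strokes
# stay closer to the most likely trajectory.
import numpy as np

def sample_offset(pi_hat, mu, sigma_hat, bias=1.0):
    """Sample one (dx, dy) offset from a Gaussian mixture predicted by the network.

    pi_hat    -- unnormalized mixture logits, shape (K,)
    mu        -- component means, shape (K, 2)
    sigma_hat -- log standard deviations, shape (K, 2)
    bias      -- larger values give cleaner, less varied handwriting
    """
    pi = np.exp(pi_hat * (1.0 + bias))
    pi /= pi.sum()                          # sharpened mixture weights
    sigma = np.exp(sigma_hat - bias)        # reduced per-component spread
    k = np.random.choice(len(pi), p=pi)     # pick a mixture component
    return np.random.normal(mu[k], sigma[k])
```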
Any feedback is welcome :smile: