jcjohnson / torch-rnn

Efficient, reusable RNNs and LSTMs for torch
MIT License

Sampling: incremental printing of generated text #71

Open gwern opened 8 years ago

gwern commented 8 years ago

While using sample.lua to generate text to inspect how the quality is changing between checkpoints, it would be nice if each character could be printed as it's generated (like in char-rnn) so you don't have to wait for the full sample to be created before you see any output. This wait is particularly painful if you need to generate large amounts of text to see how it's going; for my metadata training, I need 10-20k characters to see it sample from several different authors, and it takes something like an hour to do that much.

sample.lua calls into LanguageModel.lua, and inside the sample function I think this could be done by adding a print call that uses the next_char variable?
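A minimal standalone sketch of the idea (this is not torch-rnn's actual code; the table of characters below is a stand-in for whatever the sampling loop produces): write and flush each character as soon as it is chosen, instead of concatenating everything and printing once at the end.

```lua
-- Standalone sketch of incremental printing. The `sampled` table is a
-- placeholder for characters produced one at a time by a sampling loop.
local sampled = {"h", "e", "l", "l", "o"}
for _, c in ipairs(sampled) do
  io.write(c)   -- print without a trailing newline
  io.flush()    -- without this, piped output may sit in a buffer until exit
end
io.write("\n")
```

In torch-rnn itself the same pattern would go inside the sampling loop, emitting the token for each newly chosen character index rather than appending it to the output string silently.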

AlekzNet commented 8 years ago

Have a look at this PR: https://github.com/jcjohnson/torch-rnn/pull/8

danindiana commented 7 years ago

I agree that showing the output as it is generated is desirable in some cases and a user-friendly aspect of char-rnn. Even more troubling, however, is that even when I try to capture the output with something like `| tee` while writing to a file, torch-rnn doesn't show the output. Maybe this is a bug related to #158?
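One possible explanation for the `| tee` behavior (a guess, not confirmed for #158): stdout is typically block-buffered when it is not attached to a terminal, so a pipe can hold all output until the process exits. A hedged sketch of a workaround, switching stdout's buffering mode near the top of a script:

```lua
-- If the problem is stdout block-buffering on pipes (an assumption),
-- making stdout unbuffered (or line-buffered with 'line') lets text
-- appear through `| tee` as soon as it is written.
io.stdout:setvbuf('no')
io.write("this line reaches the pipe immediately\n")
```

If that fixes it, adding the `setvbuf` call (or an `io.flush()` after each write) to sample.lua would cover both the incremental-printing request and the `tee` case.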