dariusk / NaNoGenMo-2015

National Novel Generation Month, 2015 edition.

1940s la horror noire #51

Open bredfern opened 9 years ago

bredfern commented 9 years ago

Well, that's the idea, but it's likely going to be very Naked Lunch-esque: an excuse to attempt something really interesting with node + react.

enkiv2 commented 9 years ago

Looking forward to it!

bredfern commented 9 years ago

Yeah, it's a little more Finnegans Wake meets Lovecraft right now, until I get my neural network better trained, lol.

MichaelPaulukonis commented 9 years ago

Finnegans Wake meets Lovecraft

aaaaand, why would it need further training? !!!

riverrun, past Abhoth and C'thalpa's, from non-euclidean swerve of shore to hideous bend of abominable pre-human swamp, brings us by an incomprehensible commodius vicus of recirculation back to Leng Plateau and Environs.

bredfern commented 9 years ago

Here's a sample of what I have so far:

"Welcog. I have found room that if in crimbling stone-like and more Cwrongen had curiously insectures accaisions - ruck their bits-rathheres had graned materials, companions and tower aperthing and sinist-opashing of the more shapes, all youngle sort my net other contrictify, and of light, and of this length of agevent clesting that other promito, beforeward three membersalay even questions, we spate, the fuss which by the base I race no rock place on feet away with said:-"

bredfern commented 9 years ago

I think it turns out better when I have only Lovecraft text; I mixed in the text of Dracula and it watered down my model, lol.

Although I may just roll with the 900-page text I generated from a combination of Lovecraft and Crowley. I'm thinking of hooking this into text-to-speech, plus a widget that pulls images off Google Image Search and displays them while the text is being spoken by the computer voice.

MichaelPaulukonis commented 9 years ago

Looks like it's working with letters atomically, instead of words. I'm not familiar with the model, but can you change the atomicity?

Not that I don't like the output.

bredfern commented 9 years ago

Yeah, it's a statistical model that goes character by character, so I need to run it a lot longer with much higher temperature settings. It needs a bigger training set too: right now there are only a couple of things in there, so I need to scrape every single Lovecraft text into one data file. Once it has every Lovecraft text in one file and runs at high temperature, it should produce a better result.
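For reference, the temperature setting reshapes the model's next-character distribution before sampling: dividing the logits by a low temperature sharpens the distribution toward the likeliest character, while a high temperature flattens it toward uniform. A minimal sketch in Node (the logits here are invented for illustration; char-rnn does the equivalent inside its Torch sampling script):

```javascript
// Turn raw logits into a probability distribution at a given
// temperature, then sample a character index from it.
function charProbs(logits, temperature) {
  const scaled = logits.map((l) => l / temperature);
  const max = Math.max(...scaled); // subtract max for numerical stability
  const exps = scaled.map((l) => Math.exp(l - max));
  const sum = exps.reduce((a, b) => a + b, 0);
  return exps.map((e) => e / sum);
}

function sampleIndex(probs) {
  let r = Math.random();
  for (let i = 0; i < probs.length; i++) {
    r -= probs[i];
    if (r <= 0) return i;
  }
  return probs.length - 1;
}
```

At temperature 0.1 the distribution is nearly one-hot; at high temperatures it approaches uniform, which is where the made-up words come from.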

enkiv2 commented 9 years ago

Does this use char-rnn? I rather like the letter granularity of the input (RNNs do better at producing real words from character-granularity input with relatively small training data than, say, third- or fourth-order Markov chains do), but it looks like you'd get more coherence if you doubled or tripled the input.

If you're looking for something closer to the Lovecraft end of the style spectrum than the Dracula end, you might look into some of the late nineteenth-century authors of weird fiction that Lovecraft aped: Algernon Blackwood, William Hope Hodgson, & Robert Chambers.
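For comparison, the third-order character Markov chain baseline fits in a few lines of Node; the toy training string below stands in for a real corpus:

```javascript
// Third-order character Markov chain: record, for every run of three
// characters, which characters followed it in the training text.
function buildChain(text, order = 3) {
  const chain = {};
  for (let i = 0; i + order < text.length; i++) {
    const key = text.slice(i, i + order);
    (chain[key] = chain[key] || []).push(text[i + order]);
  }
  return chain;
}

// Extend a seed by repeatedly sampling a recorded continuation.
function generate(chain, seed, length, order = 3) {
  let out = seed;
  while (out.length < length) {
    const nexts = chain[out.slice(-order)];
    if (!nexts) break; // dead end: this trigram never appeared in training
    out += nexts[Math.floor(Math.random() * nexts.length)];
  }
  return out;
}
```

Unlike an RNN, the chain can only ever emit character runs it literally saw, which is why small-corpus Markov output tends to quote its sources verbatim.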

bredfern commented 9 years ago

Yeah, it uses char-rnn, so it's all about the quality of the data going in. I might need to write an extra piece of node code that scrapes the web for sources and adds them to the data file. Lovecraft is out of copyright, so he's a good choice, but yeah, there are other great authors in this genre who are not in copyright lockdown.

bredfern commented 9 years ago

Now I have one data file with all of Lovecraft's fiction in it. The cool thing is that with a single-author model you don't get any nasty dropouts in terms of training loss; when you try to mash up several authors, you'll hit a nasty training-loss issue.

bredfern commented 9 years ago

Yeah, now, using every Lovecraft text as a source, you get more interesting output:

The Great Old Ones which he had been heard the strange college of the strange part of the streets of the streets of the strange and structures of the process of the substance of the things to the party, and the face of the colour of the rest of the strange things that we had been a surprising through the colour of the strange and strange and parts of the strange plane of the strange and scrapter of the secret of the chance of the strange and start of the shocking far as it was a companion.

bredfern commented 9 years ago

OK, I got a decent result with Torch. Interestingly, it did better with 4 layers than 10 using all of Lovecraft as a source; 10 layers requires a much larger data source, and 4 seemed to be the sweet spot. I need to process the result into a pretty web format for consumption, or at least convert from txt to EPUB.

hugovk commented 9 years ago

A simple way of making a web format is to generate Markdown. Then, if it's on GitHub, it'll render nicely.

And tools like MultiMarkdown can convert Markdown to HTML easily.

And then Chrome can be used to print HTML to PDF (shout if you'd like some CSS pointers on how to format printable pages).
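A sketch of the first step, wrapping the raw generated text in Markdown, in a few lines of Node; the title and the chapter-every-ten-paragraphs scheme are invented for illustration:

```javascript
// Wrap raw generated text in minimal Markdown: a title heading, then
// each run of paragraphs under a numbered chapter heading.
function toMarkdown(title, text, parasPerChapter = 10) {
  const paras = text.split(/\n\s*\n/).filter((p) => p.trim());
  const lines = [`# ${title}`];
  paras.forEach((p, i) => {
    if (i % parasPerChapter === 0) {
      lines.push(`\n## Chapter ${i / parasPerChapter + 1}\n`);
    }
    lines.push(p.trim() + '\n');
  });
  return lines.join('\n');
}
```

The resulting `.md` file renders directly on GitHub and feeds straight into the MultiMarkdown-to-HTML step.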

bredfern commented 9 years ago

Yeah, another approach I'm thinking of is using text-to-speech: have an Ajax call pull down the data file and use it as the source for text-to-speech, while an animation does a reading of the text. It works better hearing it out loud than reading it off the page, because you can let it run in the background.
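In a browser, the Web Speech API's `speechSynthesis` can do the reading; long texts are best split into sentence-sized utterances first. A sketch (the splitting thresholds are arbitrary, and the speech calls only run where the API exists):

```javascript
// Split generated text into sentence-sized chunks so each one fits
// comfortably in a single SpeechSynthesisUtterance.
function toUtteranceChunks(text, maxLen = 200) {
  const sentences = text.match(/[^.!?]+[.!?]*/g) || [];
  const chunks = [];
  let current = '';
  for (const s of sentences) {
    if (current && (current + s).length > maxLen) {
      chunks.push(current.trim());
      current = '';
    }
    current += s;
  }
  if (current.trim()) chunks.push(current.trim());
  return chunks;
}

// Browser-only: queue every chunk on the speech synthesizer.
function speak(text) {
  if (typeof speechSynthesis === 'undefined') return; // not in a browser
  for (const chunk of toUtteranceChunks(text)) {
    speechSynthesis.speak(new SpeechSynthesisUtterance(chunk));
  }
}
```

The `speak` call returns immediately and the queue plays in the background, which fits the let-it-run-while-you-do-other-things reading mode described above.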
