enkiv2 opened this issue 9 years ago
I will do my best not to lapse into template-related rants in GitHub.
I will not be displeased by more template-related rants on GitHub. Ranting is how we clarify our own thoughts, and rants are also a wonderful didactic resource. Rant away.
On Mon, Oct 26, 2015 at 11:38 AM Michael Paulukonis < notifications@github.com> wrote:
I will do my best not to lapse into template-related rants in GitHub.
Templates seem like a terrible way to produce sentences until you consider the alternatives.
There, that's my template-related rant.
Most of the limitations of templates can be circumvented by generating them with templates.
On Tue, Oct 27, 2015 at 11:42 AM Chris Pressey notifications@github.com wrote:
Templates seem like a terrible way to produce sentences until you consider the alternatives.
There, that's my template-related rant.
I'm a huge fan of templated templates with nondeterministic inputs.
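One reading of "templated templates with nondeterministic inputs" can be sketched in a few lines of Python: a meta-template first generates a sentence template, and random inputs then fill the slots that survive the first pass. All the word lists and slot names here are invented for illustration.

```python
import random

rng = random.Random()

# Pass 1 material: meta-templates whose output is itself a template.
# Doubled braces survive the first .format() call as ordinary slots.
meta_templates = ["{opener} {{noun}} {{verb}}.",
                  "{opener} {{noun}} never {{verb}}."]
openers = ["Against all odds, the", "By morning, the"]
nouns = ["engine", "choir"]
verbs = ["stalled", "sang"]

# Pass 1: nondeterministically produce a template.
template = rng.choice(meta_templates).format(opener=rng.choice(openers))
# Pass 2: nondeterministically fill the generated template.
sentence = template.format(noun=rng.choice(nouns), verb=rng.choice(verbs))
print(sentence)
```

The point of the two passes is that the space of possible outputs multiplies: adding one meta-template or one opener enlarges every downstream sentence family at once.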
Templates are great in that they allow arbitrary amounts of human intelligence to be provided in order to reliably trigger the Eliza effect and convince readers that there was some intent behind the writing, which is often enough to get them to project a meaning onto it. But for novel-length text, single-layer templates like Mad Libs don't produce a very good ratio of novel text to source code. By using generative grammars, I think you can reach a nice middle ground between Mad Libs-style single-level templating and something like Markov chains (wherein you have essentially used statistics to derive a very low-level template of the form "this word may only be followed by these other words" from some source text).

Nick Montfort's 1k story generator demonstrates the power of being vague but evocative, and by adding a random shuffle to the lines it produces, you can convince yourself that if you are sufficiently vague and evocative, the order of events doesn't need to matter much either. So, if you can create a grammar that produces a large variety of vague yet evocative filler text, you can insert that text into arbitrary places in a plot skeleton to fill it out.
On Tue, Oct 27, 2015 at 11:51 AM Darius Kazemi notifications@github.com wrote:
I'm a huge fan of templated templates with nondeterministic inputs.
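The generative-grammar middle ground described above can be sketched in a few lines of Python. The rules below are invented toy examples, not anyone's actual grammar, but Tracery and gg both work on roughly this principle: nonterminals expand recursively until only words remain.

```python
import random
import re

# Toy grammar: each key is a nonterminal; {slot} references expand
# recursively by picking a random production for that nonterminal.
GRAMMAR = {
    "sentence": ["{np} {vp}.", "Somewhere, {np} {vp}."],
    "np": ["the {adj} {noun}", "a {noun}"],
    "vp": ["{verb} {np}", "{verb}"],
    "adj": ["pale", "forgotten"],
    "noun": ["harbor", "letter", "stranger"],
    "verb": ["waited", "dissolved"],
}

def generate(symbol, rng):
    """Expand one nonterminal into a flat string of words."""
    text = rng.choice(GRAMMAR[symbol])
    return re.sub(r"\{(\w+)\}", lambda m: generate(m.group(1), rng), text)

rng = random.Random()
print(generate("sentence", rng))
```

Because productions can nest ("vp" contains "np"), a handful of rules yields far more distinct sentences than a flat list of single-level templates of the same size.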
(We spend a lot of time talking about iconic experimental stories like Finnegans Wake, A Void, and Exercises in Style when discussing which extreme storytelling forms work at least well enough that humans have written iconic works in them. But one model we might look at is the Illuminatus! trilogy, which has a lot of complicated interactions between various parties and a large number of events, but whose chronology is difficult to determine. The Illuminatus! trilogy easily explains its formal conceits with its subject matter: sex, drugs, conspiracy theories, and the occult. However, the formal conceits had help from the same 'writing machine' that Burroughs used for Naked Lunch: somebody dropped the entire fifteen-hundred-page manuscript on the floor and put it back together in a random order, and then the authors cut five hundred pages from the result by removing pages entirely at random. A generator that produces descriptions of events and varies their tone and style can be set free to scribble together random accounts of random events; as long as the characters take the bad acid in one of those scenarios, really warped first-person accounts can be justified as a kind of intellectual realism.)
On Tue, Oct 27, 2015 at 11:59 AM John Ohno john.ohno@gmail.com wrote:
Templates are great in that they allow arbitrary amounts of human intelligence to be provided in order to reliably trigger the eliza effect [...]
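The 'writing machine' anecdote above is itself an algorithm, and a very short one. Here is a toy recreation under the numbers given in the thread (fifteen hundred pages, five hundred cut); the function name and page labels are invented for the sketch.

```python
import random

def shuffle_and_cut(pages, keep_fraction=2/3, rng=None):
    """Shuffle a manuscript's pages, then discard a random third of them."""
    rng = rng or random.Random()
    pages = pages[:]              # don't mutate the caller's list
    rng.shuffle(pages)            # "drop the manuscript on the floor"
    keep = int(len(pages) * keep_fraction)
    return pages[:keep]           # removing pages entirely at random

manuscript = [f"page {i}" for i in range(1500)]
result = shuffle_and_cut(manuscript)   # ~1000 pages in scrambled order
print(len(result))
```

Since the shuffle already randomizes order, taking the first two-thirds of the shuffled list is equivalent to deleting five hundred pages chosen uniformly at random.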
Also, Illuminatus! (which I loved as a kid) is total potboiler pulpy genre fiction, which has its own advantages when it comes to generation.
Definitely. There are several fairly mechanical plotting guides for pulp, and at the level of sentences and words, the various pulp genres have really strong stylistic conventions that are entertaining in and of themselves (I'm thinking of the narration style in hardboiled/noir, and the dense pseudo-romantic style favored by Lovecraft, which is essentially a more extreme form of what William Hope Hodgson and Poe were doing in the nineteenth century). Doing an Illuminatus! pastiche might give you a good excuse to incorporate filters like jive and cockney into your fiction generator as a second pass over particular passages and quotes.
On Tue, Oct 27, 2015 at 12:11 PM Darius Kazemi notifications@github.com wrote:
Also, Illuminatus! (which I loved as a kid) is total potboiler pulpy genre fiction, which has its own advantages when it comes to generation.
On GenText, I hand-wrote some examples of "fixed" passages that can be randomly assembled and then linked together by transitions. The end result is that you can get different plots, which make sense because of the transitions. Recently, I found time to convert my handwritten approach into an algorithm (which you can see here). I am doubtful if this approach can scale up to 50,000 words, but I am likely to use it for the Dartmouth Turing Test for the Creative Arts.
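The assemble-and-link approach described above can be sketched like this in Python; the passages and transition sentences are invented placeholders, not the actual GenText material.

```python
import random

rng = random.Random()

# Fixed passages, to be randomly ordered.
passages = [
    "Mara found the key under the floorboards.",
    "The train left without her.",
    "A letter arrived with no return address.",
]
# Standalone transition sentences glue consecutive passages together,
# so any ordering reads as if it were intended.
transitions = [
    "Time passed.",
    "None of this seemed connected, at first.",
    "Elsewhere, things were changing.",
]

order = passages[:]
rng.shuffle(order)
pieces = []
for i, passage in enumerate(order):
    pieces.append(passage)
    if i < len(order) - 1:
        pieces.append(rng.choice(transitions))
story = " ".join(pieces)
print(story)
```

The transitions do the plot-making work here: each shuffle yields a different sequence of events, and the connective sentences invite the reader to infer causality between them.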
My idea of organizing plot by randomly reordering passages came from a 1990s computer program: Dramatica. Dramatica can generate a fairly complex outline for a "grand argument story", though it expects you to use that outline to hand-write the story. Each outline has four subplots, and each subplot represents a certain "theme" (say, Manipulation). Each subplot is composed of four "acts", each representing a certain aspect of Manipulation. However, it is the _order_ of these acts that matters in determining the effect of the subplot on the greater plot. Obviously, Dramatica is a far more complex program than one that just shuffles events in an array and pops them out. There are additional rules to ensure that the "grand argument story" is logical and has no holes a reader could use to attack its point. But most of those rules have nothing to do with the organization of the plot structure, and everything to do with how it is presented in the text (characterization, etc.).
I am doubtful if this approach can scale up to 50,000 words
A collection of short stories, with optional interconnecting material, is one solution. There were a bunch of small-pieces-collected-at-length novels submitted last year (mine amongst them). I had considered a framing device, a reader reading the book in a mansion's library, as intro and outro, with minor connecting sentences between stories, but never got around to it. There's a good tradition of this sort of thing: The Canterbury Tales, The Decameron, etc.
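That framing device is mostly a matter of concatenation. A minimal sketch, with all the text (intro, outro, links, story bodies) as invented placeholders:

```python
import random

rng = random.Random()

# Placeholder story bodies; a real generator would produce these.
stories = ["STORY ONE TEXT", "STORY TWO TEXT", "STORY THREE TEXT"]
intro = ("The reader settled into a chair in the mansion's library "
         "and opened the book.")
outro = "The reader closed the book; outside, it had begun to rain."
links = ["Turning the page, the reader went on.",
         "The next tale began abruptly."]

parts = [intro]
for i, s in enumerate(stories):
    parts.append(s)
    if i < len(stories) - 1:
        parts.append(rng.choice(links))   # minor sentences between stories
parts.append(outro)
book = "\n\n".join(parts)
print(book)
```

Because the frame is independent of the stories, it scales to however many pieces are needed to reach 50,000 words.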
Continuing from discussions in the GenText group over the past few months, I'm thinking that one way to approach readable text at novel length is to create a library of fairly generic generative grammars for hunks of text, ranging in size from individual words to whole paragraphs, and then use some existing plot-template system (like the b2 beat sheet or the Plotto system) to create a high-level plot overview, and then do the minimum work needed to glue the two together.
I will probably be using gg because, although Tracery is almost a direct equivalent in terms of features and form, I'd prefer to work in Python rather than JavaScript.
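The glue between a plot template and a grammar library could be as thin as a dictionary mapping beat names to text generators. The beat names and filler pools below are invented stand-ins for a beat sheet or Plotto entry; a real version would call into gg or Tracery grammars instead of picking from flat lists.

```python
import random

rng = random.Random()

# High-level plot overview: an ordered list of beats (names invented).
plot_skeleton = ["setup", "incident", "crisis", "resolution"]

# Per-beat filler pools, standing in for full generative grammars.
pools = {
    "setup": ["The village was quiet.", "Nothing had changed in years."],
    "incident": ["Then the letter arrived.", "Then the machine woke up."],
    "crisis": ["No one agreed on what it meant.",
               "Everything depended on it now."],
    "resolution": ["In time, it was forgotten.",
                   "Nothing was ever quiet again."],
}

# The glue: walk the skeleton, expanding each beat into prose.
draft = " ".join(rng.choice(pools[beat]) for beat in plot_skeleton)
print(draft)
```

Swapping a flat pool for a recursive grammar changes only the expression inside the join, which is what makes the minimum-glue approach attractive: plot system and text system stay decoupled.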