rbechtel opened this issue 9 years ago
Well, I'm definitely interested. I don't think that too many people have used prior storytelling AI research code in NaNoGenMo before. I think it's a very good idea.
At the very least it'll be interesting to contrast the original Tale-Spin approach with how contemporary sensibilities lead you to develop your own spin on it. I feel like the increased literacy about cybertext has opened up new avenues for exploration, and I'm curious about what the results are going to be.
Oh hey, I didn't realise source for tale-spin-esque things existed. That's really useful.
Well, I didn't get to 50K. I could probably push it and cross the line, but I'm not interested in the rather mechanical result that could wrap it up, so I'll just leave things as they are. Code and a final sample (about 10K) are in the repository. Already looking forward to next year, though I'm not sure that I'll be able to resist working on things between now and then.
(impressive)
Wow, despite not being 50K, the output is impressive for sure.
I was disturbed by CHAPTER 7. Louise wanted to know where the fish were, and gave Irving the worm. There is no evidence Irving kept his end of the deal. Quoting from the story: Louise thought that if she would give Irving the worm then Irving would tell her where the fish was. ... Irving had the worm. ... Irving ate the worm. ... Louise wanted to know where the fish was. Louise thought that she did not have the fish.
What went wrong?
Dave
Then Louise killed Joe for the knowledge but didn't get it, and the zombie Joe didn't know where the fish were, and Irving got another worm and still didn't tell. Riveting, I say! Riveting!
But zombie Joe did know that Louise struck (and killed) him, so that's something. Good thing that Louise didn't think that she dominated Irving, or we'd have a serial killer on our hands...
I'm in.
At this point, I have no idea what the novel will be about. Instead, I have some ideas about the generation process. I'm going to start with Warren Sack's Common Lisp version of Micro-Talespin (http://eliterature.org/images/microtalespin.txt), mostly for familiarity. I used the Micro-Mumble generation engine from the original Micro-Talespin (converting the original UCI Lisp source to Interlisp) in my dissertation work (Jim Meehan was my advisor), but it's been a very long time, so it's doubtful that any of that past experience will be all that helpful.
I have cheated a bit - I started looking at the code and did a few tweaks as I rebuilt my understanding. I'll include the Sack original and all my "pre-start" changes in an initial check-in after the starting gun sounds. Unfortunately, I don't seem to have my dissertation code, but it would all have to be converted to Common Lisp anyway, so perhaps that's no great loss.
Oversimplifying, I see four development areas: story setup, simulation, story-level decisions, and tactical generation. This is roughly "where do we start", "what happens", "what do we tell the reader and when", and "how do we say it". In Micro-Talespin, there are nearly separable mechanisms for the simulator and tactical generation, but any story-level decisions are implicitly baked into the code, mostly amounting to "tell them everything in the order it occurs." (The simulator and generator are OK to illustrate the approach, but both will need work to produce something reasonable.)
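To make the division of labor concrete, here's a minimal Python sketch of those four areas. This is only an illustration of the pipeline shape, not Micro-Talespin itself (which is Common Lisp); every function name, event tuple, and the toy verb lexicon here is hypothetical.

```python
# Sketch of the four areas: setup, simulation, story-level decisions,
# tactical generation. All names and data are illustrative.

def setup():
    # "Where do we start": initial world state and character goals.
    return {"characters": ["Louise", "Irving"],
            "goals": [("Louise", "know", "where the fish was")]}

def simulate(world):
    # "What happens": a stand-in for the planner/simulator, emitting a
    # flat event trace. A real simulator would derive these from goals.
    return [("Louise", "give", "the worm to Irving"),
            ("Irving", "eat", "the worm"),
            ("Louise", "want", "to know where the fish was")]

def select_events(events):
    # "What do we tell the reader and when": Micro-Talespin's implicit
    # policy is "tell them everything in the order it occurs".
    return list(events)

def render(event):
    # "How do we say it": a trivial surface realizer, one clause per event.
    subject, verb, rest = event
    past = {"give": "gave", "eat": "ate", "want": "wanted"}[verb]
    return f"{subject} {past} {rest}."

def tell_story():
    world = setup()
    return " ".join(render(e) for e in select_events(simulate(world)))
```

The point of keeping the stages separate is that `select_events` is where a real narrator policy would go: filtering, reordering, or withholding events instead of narrating the trace verbatim.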
I'm also interested in looking at using other sources of "what happens", such as event records extracted from news stories (e.g., http://www.gdeltproject.org/) in place of the simulator in Micro-Talespin, but whether the result would be "novelistic" is an open question.
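As a rough sketch of that substitution: extracted news events could be mapped onto the same event shape the generator consumes, with actor/event-code records (loosely modeled on GDELT's actor and event columns) translated by a verb table. The code-to-verb mapping below is purely illustrative, not the real CAMEO taxonomy.

```python
# Hypothetical sketch: news-event records standing in for the simulator.
# Records are (actor1, event_code, actor2); the verb table is made up.

CODE_VERBS = {
    "040": "consulted with",   # illustrative labels only,
    "051": "praised",          # not actual CAMEO definitions
    "190": "clashed with",
}

def record_to_event(record):
    # Map one record to a renderable clause; unknown codes get a
    # generic verb rather than failing.
    actor1, code, actor2 = record
    verb = CODE_VERBS.get(code, "interacted with")
    return f"{actor1} {verb} {actor2}."

def narrate(records):
    return " ".join(record_to_event(r) for r in records)
```

Whether a stream of real-world events narrated this way coheres into anything "novelistic" is exactly the open question above; the records supply "what happens" but none of the story-level decisions.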
The Talespin approach is OK for an episode. It remains to be seen if it can sustain a novel-length story. The idea of a framing story (as used by others) is one possibility, but there's clearly lots of experimentation ahead.
Yes, far too ambitious. We'll see where things are at the end of November.
Repo is here: https://github.com/rbechtel/NaNoGenMo