dariusk / NaNoGenMo-2015

National Novel Generation Month, 2015 edition.

Beginner's Questions, Answers, & Tutorials #152

Open ikarth opened 8 years ago

ikarth commented 8 years ago

While there are a ton of resources in the #1 Resources thread, many of them assume that you know what fancy terms like NLP, NLTK, RNN, or ConceptNet mean. This thread is for introductory tutorials and starting points, plus answering questions.

Feel free to chime in if you feel you have anything to contribute. I'm not going to try to duplicate the already super-useful resources, so go check out that thread if you want more tools and links to interesting things.

DariusK Livecoding Sessions

Darius has previously recorded several coding sessions that were broadcast live. When you're just starting out, it can sometimes be helpful to watch someone else code like this, because you'll see all of the little details that tend to get left out of even the most introductory tutorials.

- https://www.youtube.com/watch?v=V9XwFYwyunw
- https://www.youtube.com/watch?v=y-YIdzaG4OE
- https://www.youtube.com/watch?v=_DMa_ve3N6o

ikarth commented 8 years ago

Markov Chains

Markov chains are a well-known way to do text generation. The basic operation is to look at the last few words, then pick the next word at random from the words that have been observed to follow that sequence in the source text.

- An interactive explanation of Markov chain generation by tullyhansen
- Allison Parrish's tutorial on N-grams and Markov chains, with Python code
- Jeff Atwood explains Markov chains via Garfield
- A visual explanation of Markov chains
- Andrew Plotkin's Fun With Markov Chains
- Bookmerge, an example Markov generator that lets you combine two books (uses Ruby)
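To make the idea concrete, here's a minimal sketch of a word-level Markov chain in plain Python (the corpus and function names are made up for illustration; real projects usually train on a much larger text):

```python
import random
from collections import defaultdict

def build_model(text, order=2):
    """Map each sequence of `order` words to the words observed after it."""
    words = text.split()
    model = defaultdict(list)
    for i in range(len(words) - order):
        key = tuple(words[i:i + order])
        model[key].append(words[i + order])
    return model

def generate(model, length=30):
    """Walk the chain: start from a random key, repeatedly pick a follower."""
    key = random.choice(list(model.keys()))
    output = list(key)
    for _ in range(length):
        followers = model.get(key)
        if not followers:  # dead end: this sequence never continued in the corpus
            break
        output.append(random.choice(followers))
        key = tuple(output[-len(key):])
    return " ".join(output)

corpus = ("the cat sat on the mat and the cat saw the dog "
          "and the dog sat on the cat")
model = build_model(corpus, order=2)
print(generate(model, length=20))
```

Raising `order` makes the output stick closer to the source text; lowering it makes the output stranger. Most of the tutorials linked above are variations on this same loop.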

ikarth commented 8 years ago

Reading and Writing Electronic Text

The course notes from this class at NYU's Interactive Telecommunications Program cover a lot of subjects and are a good introduction if you're wondering where to start. The notes also include lots of example code in this github repository.

ikarth commented 8 years ago

Templating and Grammars (with Tracery)

A replacement grammar is a set of rules for replacing symbols. It works kind of like mad-libs: we start with a sentence like "Was it done by #suspect# in the #room# with the #murder weapon#?", and then apply a rule that says every time we see the symbol #suspect#, replace it with one of the names from our list of suspects.
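A minimal sketch of this kind of recursive symbol replacement in Python (the rules here are made up for illustration, and real libraries like Tracery add features such as modifiers and saved values):

```python
import random
import re

# Rules map a symbol name to the strings that can replace it; a replacement
# may itself contain #symbol# references, which get expanded in turn.
rules = {
    "sentence": ["Was it done by #suspect# in the #room# with the #weapon#?"],
    "suspect": ["Colonel Mustard", "Professor Plum", "Mrs. Peacock"],
    "room": ["library", "conservatory", "billiard room"],
    "weapon": ["candlestick", "rope", "lead pipe"],
}

def expand(text, rules):
    """Replace every #symbol# with a randomly chosen expansion of its rule."""
    pattern = re.compile(r"#(\w+)#")
    while True:
        match = pattern.search(text)
        if match is None:  # no symbols left: the sentence is fully expanded
            return text
        replacement = random.choice(rules[match.group(1)])
        text = text[:match.start()] + replacement + text[match.end():]

print(expand("#sentence#", rules))
```

Because expansions can contain further symbols, a small set of rules can produce a huge space of possible sentences, which is what makes this technique work for novel-length output.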

While this starts out pretty simple, many of the successful novels from past NaNoGenMos have been made using similar techniques. Aggressive Passive, Redwreath and Goldstar Have Traveled to Deathsgate, Recipe Book Generator, and Threnody for Abraxas are a few examples of novels that use recursive replacement grammars in one way or another.

Tracery is a library that makes replacement grammars easy. You can try out the interactive tutorial for a playful introduction.

Some examples of what you can do with Tracery, including a demo of the in-progress no-programming-required visual editor