Tender is the Insight

The Short Story: I made a web app that, given some starting text, naively tries to predict what words come next. Because the ‘training’ text was taken from the first ten chapters of F. Scott Fitzgerald’s Tender is the Night, we can (inaccurately) say that this robot talks like Fitzgerald.

Take a look here.

That’s the Hemingway-length take on it. Now let’s see the Dostoevsky-length explanation.

What

Markov chains are a computational tool often used to model events that occur with some degree of unpredictability (either because there’s no way to predict them, or because we simply don’t have the right kinds of tools). In other words, a Markov chain can ‘predict’ the next item in a series using probabilities gathered from historical data.

Humans are pretty good at this sort of prediction. Try this:

1 2 3 1 2 3 1 2 _

What comes next? If you guessed (spoiler alert!) 3, then congratulations. Your brain is equal to or better than a Markov chain in terms of intelligence.

At a very high level, you just looked for a pattern: “After every 2 comes a 3, so after the last 2 will likely come a 3.” Markov chains make this a concrete process; each item in the series has some probability of following another.
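To make that concrete, here’s a throwaway sketch (plain JavaScript, not the app’s actual code) of the counting behind that guess:

```javascript
// Count which number follows each number in the series.
const sequence = [1, 2, 3, 1, 2, 3, 1, 2];
const followers = {};
sequence.forEach((n, i) => {
  if (i + 1 < sequence.length) {
    followers[n] = followers[n] || [];
    followers[n].push(sequence[i + 1]);
  }
});

console.log(followers); // { 1: [2, 2, 2], 2: [3, 3], 3: [1, 1] }
// Every 2 we've seen was followed by a 3, so we predict 3.
```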

Briefly, a maths

In more mathematical terms, a Markov chain defines a probability distribution for the next element in a series, \(X_{n+1}\), given only the current element \(X_n\). That is, the probability distribution of my next element depends exclusively on the previous element; anything earlier is forgotten. This is often expanded to more than one previous event (a second-order, or ‘2-back’, Markov chain), as I did here.
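Written out, that’s the textbook Markov property (nothing specific to my app):

\[
P(X_{n+1} \mid X_n, X_{n-1}, \dots, X_1) = P(X_{n+1} \mid X_n)
\]

and the 2-back version just conditions on the last two elements instead:

\[
P(X_{n+1} \mid X_n, X_{n-1}, \dots, X_1) = P(X_{n+1} \mid X_n, X_{n-1})
\]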

Another good example of Markov chains is

Another good example of Markov chains is the weather forecast: Sure, we can look at cloud patterns and where winds are winding, and that helps us form educated guesses about tomorrow’s weather. But one common way to predict the weather is with a Markov chain: look at today’s weather, and compare it to every day that has been similar in the past. (Scientists somewhere have gone to the trouble of writing down the weather for hundreds of years, which is sweet of them and we love them.) Intuitively, it’s unlikely that a 95° day is succeeded by a 20° day (unless you switch your units unexpectedly). A Markov chain substantiates that intuition by telling us things like:

45% of days like (75° Sunny, Fall-time) were followed by days that were (77° Sunny).

22% of days like (75° Sunny, Fall-time) were followed by days that were (73° Sunny).

33% of days like (75° Sunny, Fall-time) were followed by days that were (74° Cloudy).

…and so on. So the forecasters can take a look, and, hey, it looks like 77-and-sunny tomorrow.
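If you wanted to wire that up, a minimal sketch might look like this (the table just hard-codes the made-up numbers above; `transitions` and `forecast` are my names, not anything real):

```javascript
// Observed frequencies: what tomorrow looked like after days like today.
const transitions = {
  "75° Sunny, Fall-time": [
    { next: "77° Sunny",  probability: 0.45 },
    { next: "73° Sunny",  probability: 0.22 },
    { next: "74° Cloudy", probability: 0.33 },
  ],
};

// Pick tomorrow's weather by a weighted random draw over those rows.
function forecast(today) {
  const options = transitions[today];
  let roll = Math.random(); // uniform in [0, 1)
  for (const { next, probability } of options) {
    roll -= probability;
    if (roll < 0) return next;
  }
  return options[options.length - 1].next; // guard against rounding error
}

console.log(forecast("75° Sunny, Fall-time")); // "77° Sunny", most of the time
```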

But this time with words

Consider this sentence:

I am who I say I am.

If I ask you what comes next in this pattern, it’s hard to intuit the correct answer. But a Markov chain can guess, using rules it learns from the sentence itself:

Every time I see the word I, there’s a 66% chance of the next word being am (and a 33% chance of it being say).

Every time I see the word am, it’s either followed by nothing or who.

Every time I see the word who, it’s followed by I.

Every time I see the word say, it’s followed by I.

So we ask the computer: what comes next?

I am who I say I _

It reviews its rules, and answers am.

(It’s worth noting that if you gave it a word it had never seen before, the program would have to guess what to respond with.)
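Here’s a minimal sketch of that whole process in plain JavaScript (a 1-back chain for brevity, where the app itself is 2-back; `buildChain` and `nextWord` are names I’m making up for illustration):

```javascript
// Build a 1-back Markov chain: for each word, record every word that followed it.
function buildChain(text) {
  const words = text.toLowerCase().replace(/[.,!?]/g, "").split(/\s+/);
  const chain = {};
  for (let i = 0; i < words.length - 1; i++) {
    const current = words[i];
    chain[current] = chain[current] || [];
    chain[current].push(words[i + 1]);
  }
  return chain;
}

// Predict the next word by sampling uniformly from the recorded followers.
// Repeats act as weights: "am" appears twice in the list after "i".
function nextWord(chain, word) {
  const followers = chain[word.toLowerCase()];
  if (!followers) return undefined; // a word we've never seen: nothing to go on
  return followers[Math.floor(Math.random() * followers.length)];
}

const chain = buildChain("I am who I say I am.");
console.log(nextWord(chain, "I")); // "am" about 66% of the time, "say" otherwise
```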

Code

Take a look at the source code (all client-side JavaScript, so it runs in the browser with no server needed) on GitHub. Edit the enormous ‘sentences’ array with your own text (right now it’s some OpenCV documentation, because that was the first free, open-source text I found).
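If you do swap in your own text, the edit might look something like this (the exact shape of the array in the repo may differ; this is just an illustration):

```javascript
// Hypothetical: replace the contents of the 'sentences' array with the
// text you want the chain to imitate, one chunk per entry.
var sentences = [
  "Your first sentence of training text goes here.",
  "Then your second sentence, and so on.",
];
```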

Written on September 26, 2015
Comments? Let's chat on Mastodon (or on Twitter if you absolutely must...)