Markov chain generators, which go back to the early 1990s, did a decent job of outputting nonsense in a style similar to a given writer's. The sentences almost, but not quite, made sense. Markov chains have no memory; they simply predict from a window of the most recent words, a window that slides along as the chain writes. It's a simple algorithm. But combine the two: the no-memory Markov chain that can generate nonsense in a particular style, and the $20 billion databases that summarize really well (they have extensive memory) but can't do styles well (style requires real-time agility). Whoever pulls that off will have the magic key. It might not even be possible.
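A minimal sketch of such a generator, to make the "sliding window" idea concrete (the function names and parameters here are illustrative, not from any particular implementation):

```python
import random

def build_chain(text, n=2):
    """Map each n-word window to the words observed to follow it."""
    words = text.split()
    chain = {}
    for i in range(len(words) - n):
        key = tuple(words[i:i + n])
        chain.setdefault(key, []).append(words[i + n])
    return chain

def generate(chain, n=2, length=20, seed=None):
    """Slide the window forward, sampling each next word.
    The last n words are the model's only 'memory'."""
    rng = random.Random(seed)
    out = list(rng.choice(list(chain)))
    for _ in range(length):
        followers = chain.get(tuple(out[-n:]))
        if not followers:
            break  # dead end: this window never appeared mid-text
        out.append(rng.choice(followers))
    return " ".join(out)
```

Trained on enough of one author's text, this reproduces local turns of phrase convincingly, but because nothing outside the window survives, the output drifts and never holds a thought, which is exactly the "almost but not quite made sense" effect described above.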


