STOCHASTIC COMPUTING:
Cool – something *else* I’ve been looking for. [I get this a lot: I *know* something exists but I don’t know its name or how to describe it properly because there’s a vague mental ‘template’ of what will fit it when I find it, but until I find it, I don’t know what that template looks like nor what will fit it, just that the template is there].

Anyway, this. This interface between analog and digital is a field hardly anyone has heard of, but it's hiding inside some digital signal processing [gamma correction, for example: a better gamma-correction circuit will use stochastic logic rather than plain binary logic, and the result is noticeably higher quality].

Basically? It's "how to logic" with randomness. Well, pseudo-randomness [true randomness produces what look like patterns, and just as we get fooled by patterns that aren't really there, so does stochastic logic].
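To make that concrete: the core trick is that a number between 0 and 1 is encoded as the probability of seeing a 1 in a random bit-stream, and then an ordinary AND gate multiplies two numbers. A minimal Python sketch of the idea (the stream length and seed here are my own choices, just for illustration):

```python
import random

def bernoulli_stream(p, n, rng):
    """Encode the value p (between 0 and 1) as a stream of n random bits,
    where each bit is 1 with probability p."""
    return [1 if rng.random() < p else 0 for _ in range(n)]

def decode(stream):
    """Recover the encoded value as the fraction of 1s in the stream."""
    return sum(stream) / len(stream)

rng = random.Random(42)  # seeded so the run is repeatable
n = 100_000

a = bernoulli_stream(0.5, n, rng)  # encodes the value 0.5
b = bernoulli_stream(0.8, n, rng)  # encodes the value 0.8

# A bitwise AND multiplies the encoded values, because for independent
# streams P(a_i AND b_i) = P(a_i) * P(b_i).
product = decode([x & y for x, y in zip(a, b)])

print(round(product, 2))  # close to 0.5 * 0.8 = 0.4
```

That single AND gate replacing a whole binary multiplier is where the "very low power, low cost" part comes from.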

It requires very low-power, low-cost hardware and allows for massively parallel processing, since it operates on 'streams'. But there's a cost:

Precision.

There are ways to build perfectly precise stochastic logic circuits, some of them discovered only a few years ago, but generally, the more precision you want (say you go from 4-bit to 8-bit precision), the number of bits and circuits required becomes massive, as you're parallel processing parallel processes, so to speak.
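The scaling works roughly like this: the error of a stream estimate falls as 1/sqrt(L) for stream length L, so each extra bit of precision needs about 4x the stream length. A back-of-envelope sketch (this 4x-per-bit rule is the usual first-order estimate, not an exact law):

```python
def stream_length_for_bits(bits):
    """Rough stream length needed for 'bits' of precision.
    Estimate error shrinks like 1/sqrt(L), so halving the error
    (one more bit) takes 4x the stream length: L ~ 4**bits."""
    return 4 ** bits

len_4bit = stream_length_for_bits(4)  # 256 bits
len_8bit = stream_length_for_bits(8)  # 65,536 bits

# Going from 4-bit to 8-bit precision blows the stream up 256x:
print(len_8bit // len_4bit)  # 256
```

So the hardware per gate stays cheap, but the time (or parallel copies) needed grows exponentially with precision, which is exactly the trade-off described above.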

There's definitely a relationship to quantum computing, which faces many of the same kinds of issues as stochastic computing, although the two differ in many other ways.

Still though, this is what “fills that gap” I was looking for, or one of them at least.

How I got here was a trip in itself: I was trying to find the *first* instance of a "Leaky Integrator" – a concept common in neural networking, especially in biological models simulating brain spikes – and did a trip through history in Google Books and Google Scholar.

I came across this reference in 1967:

This tiny snippet, from a book Google Books otherwise won't let me read, says that leaky integrators are the digital equivalent of the "addie". Addies apparently form the basis of stochastic computers, a concept first devised by von Neumann – yes, same guy.

I don’t know where I’ll end up with this, but I know I’m going in the right direction for it.
