
Me: “Information is physical,” said Landauer in 1961, and he established a lower limit on the thermodynamic cost of information processing: “information is not free,” basically. The bound has been confirmed experimentally numerous times, despite many attempts to break it with quantum erasure, although the cost can be paid in angular momentum instead of heat.
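To put a number on that lower limit: Landauer's bound says erasing one bit must dissipate at least k_B·T·ln 2 of energy. A minimal sketch of the arithmetic, assuming room temperature (300 K) as an illustrative value:

```python
import math

# Landauer's bound: erasing one bit dissipates at least k_B * T * ln(2).
k_B = 1.380649e-23  # Boltzmann constant in J/K (exact, 2019 SI redefinition)
T = 300.0           # temperature in kelvin (assumed example value: room temp)

E_min = k_B * T * math.log(2)  # minimum energy per erased bit, in joules
print(f"Landauer limit at {T:.0f} K: {E_min:.3e} J per bit")
```

At room temperature this works out to roughly 3×10⁻²¹ joules per bit, which is why the cost is imperceptible to a being like a human, yet it is strictly nonzero.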
 
Still, computation can be made LOGICALLY reversible “for free,” even if erasing information cannot be made PHYSICALLY reversible for free. And the physical costs can be made so small that a being like a human won’t notice them. Ignorance is bliss.
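The logical-versus-physical distinction can be made concrete with gates. An AND gate is many-to-one, so its inputs cannot be recovered from its output; a Toffoli (controlled-controlled-NOT) gate is a bijection on its inputs, so it can be run backwards with no logical information loss. A minimal sketch (the function names are my own, chosen for illustration):

```python
def and_gate(a, b):
    # Many-to-one: (0,0), (0,1), (1,0) all map to 0, so inputs are lost.
    return a & b

def toffoli(a, b, c):
    # Flips c iff a and b are both 1. With c=0, the third output is a AND b,
    # but the full input triple is always recoverable: the map is a bijection.
    return (a, b, c ^ (a & b))

# The Toffoli gate is its own inverse, which is one way to see reversibility:
for bits in [(a, b, c) for a in (0, 1) for b in (0, 1) for c in (0, 1)]:
    assert toffoli(*toffoli(*bits)) == bits
```

This is the sense in which a reversible circuit avoids Landauer's erasure cost at the logical level: no gate in it ever merges two input states into one output state.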
 
The abstract for the paper:
——————
Stochastic thermodynamics of computation – David H. Wolpert
 
One of the major resource requirements of computers – ranging from biological cells to human brains to high-performance (engineered) computers – is the energy used to run them. Those costs of performing a computation have long been a focus of research in physics, going back to the early work of Landauer. One of the most prominent aspects of computers is that they are inherently nonequilibrium systems. However, the early research was done when nonequilibrium statistical physics was in its infancy, which meant the work was formulated in terms of equilibrium statistical physics. Since then there have been major breakthroughs in nonequilibrium statistical physics, which are allowing us to investigate the myriad aspects of the relationship between statistical physics and computation, extending well beyond the issue of how much work is required to erase a bit. In this paper I review some of this recent work on the ‘stochastic thermodynamics of computation’. After reviewing the salient parts of information theory, computer science theory, and stochastic thermodynamics, I summarize what has been learned about the entropic costs of performing a broad range of computations, extending from bit erasure to loop-free circuits to logically reversible circuits to information ratchets to Turing machines. These results reveal new, challenging engineering problems for how to design computers to have minimal thermodynamic costs. They also allow us to start to combine computer science theory and stochastic thermodynamics at a foundational level, thereby expanding both.
