My strongest understanding of entropy comes from information theory and communications.

Being very deaf, I'm very aware of the problems that arise with noisy channels: one has to predict what one cannot hear, and it can be a headache predicting in real time during a face-to-face conversation.

Claude Shannon was revolutionary to me.

I know information-theoretic entropy isn't the same as thermodynamic entropy, but what they have in common is the difficulty that ensues when coherence is lost.

“A fan diagram shows how channel noise affects the number of possible outputs given a single input, and vice versa. If the noise η in the channel output has entropy H(η) = H(Y|X), then each input value could yield one of 2^H(Y|X) equally probable output values. Similarly, if the noise in the channel input has entropy H(X|Y), then each output value could have been caused by one of 2^H(X|Y) equally probable input values.”
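To make that fan-diagram picture concrete, here is a minimal sketch in Python. It assumes a binary symmetric channel with a crossover probability I've picked arbitrarily for illustration (the quote doesn't name a specific channel): the noise entropy H(Y|X) is then just the binary entropy of the crossover probability, and 2^H(Y|X) gives the effective number of equally probable outputs each input "fans out" to.

```python
import math

def binary_entropy(p: float) -> float:
    """Entropy in bits of a Bernoulli(p) source."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Binary symmetric channel (assumed for illustration): each input
# bit is flipped with probability p, so the noise entropy H(Y|X)
# equals the binary entropy of p.
p = 0.11  # hypothetical crossover probability
H_noise = binary_entropy(p)  # H(Y|X) in bits

# Per the fan-diagram picture, each input fans out to roughly
# 2^H(Y|X) equally probable output values.
print(f"H(Y|X) = {H_noise:.3f} bits")
print(f"Each input fans out to about 2^{H_noise:.3f} = {2 ** H_noise:.3f} outputs")
```

With p = 0.11 this gives H(Y|X) of about 0.5 bits, so each input fans out to roughly 2^0.5 ≈ 1.4 equally probable outputs per channel use; a noiseless channel (p = 0) would fan out to exactly one.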
