I’m more familiar with entropy from the information-theory or communication-theory view – information loss; that is, loss of structure, or rather, discernible patterns becoming fewer.
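To put a rough number on that intuition, here’s a minimal Python sketch (toy distributions of my own choosing, not from any particular source): Shannon entropy H = −Σ p·log₂(p) is low when one pattern dominates and maximal when nothing is more likely than anything else, i.e. when there’s no discernible structure left.

```python
import math

def shannon_entropy(probs):
    """H(X) = -sum(p * log2(p)) in bits, skipping zero-probability outcomes."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A highly structured source: one symbol dominates -> low entropy.
print(shannon_entropy([0.97, 0.01, 0.01, 0.01]))  # ~0.24 bits

# No discernible pattern: all symbols equally likely -> maximum entropy.
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0 bits
```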
You have a sender, a receiver, and noise. The higher the noise, the higher the entropy. There are clever ways to use noise to change the original signal in meta-useful ways (that is, following a meta-set of rules of a higher order than the information being encoded – I’m thinking of evolutionary processes), but this diagram is a useful starting point, I think, for Shannon’s entropic view.
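To make the “more noise, more entropy” part concrete, here’s a rough binary-symmetric-channel sketch (my own toy setup, not the diagram’s): a biased sender emits 1 with probability 0.1, and the channel flips each bit with probability `flip`; as `flip` climbs toward 0.5, the received signal’s entropy climbs toward the 1-bit maximum.

```python
import math

def h2(p):
    """Binary entropy in bits: H(p) = -p*log2(p) - (1-p)*log2(1-p)."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

p_one = 0.1  # sender emits 1 with probability 0.1 (a fairly structured source)

for flip in (0.0, 0.1, 0.3, 0.5):
    # After a binary symmetric channel, P(received = 1) mixes source and noise.
    p_recv = p_one * (1 - flip) + (1 - p_one) * flip
    print(f"flip prob {flip:.1f}: received entropy {h2(p_recv):.3f} bits")
```

Running this, the received entropy goes from about 0.47 bits with no noise up to exactly 1 bit when the channel flips half the bits – at that point the output is indistinguishable from a coin toss and the sender’s structure is completely washed out.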