
BACKGROUND: In thermodynamics, entropy $S$ is a measure of disorder and is given by $$S = k_B \log(W),$$ where $k_B$ is Boltzmann's constant and $W$ is the number of microstates.

In information theory, (Shannon) entropy $H$ is a measure of uncertainty and is given by $${\displaystyle H=-\sum _{i}p_{i}\log p_{i}},$$ where $p_i$ is the probability of a given state.
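As a concrete illustration of the formal link (a standard special case, included only as background): if all $W$ microstates are equally probable, so $p_i = 1/W$ for every $i$, the Shannon entropy reduces to $$H = -\sum_{i=1}^{W}\frac{1}{W}\log\frac{1}{W} = \log W,$$ which recovers the Boltzmann expression up to the constant factor, i.e. $S = k_B H$ in that case.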

QUESTION: Given the conceptual similarity (disorder ~ uncertainty) and the mathematical symmetry (both involve the $\log$ of a quantity describing the system's states), are these two entities merely similar, or are they in fact equivalent? A philosophical and mathematical explanation is desired.

As an aside, I would like to point out that philosophical inquiries are within bounds for SE:AI topics.

