
Communication requires energy, and using energy requires communication. According to Shannon, the entropy of a source gives an absolute lower bound on the shortest possible average length of a message that can represent it without losing information during transmission. (https://towardsdatascience.com/entropy-the-pillar-of-both-thermodynamics-and-information-theory-138d6e4872fa)
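To make that bound concrete, here is a minimal sketch (the source probabilities and code lengths below are made up for illustration, not taken from the article): for a source with dyadic probabilities, an optimal prefix code's average length exactly meets the entropy, and no lossless code can do better on average.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: the lower bound on average bits per symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Made-up 4-symbol source with dyadic probabilities (illustration only)
probs = [0.5, 0.25, 0.125, 0.125]
# An optimal prefix (Huffman-style) code for this source has lengths 1, 2, 3, 3
code_lengths = [1, 2, 3, 3]

entropy = shannon_entropy(probs)
avg_length = sum(p * l for p, l in zip(probs, code_lengths))
print(f"entropy             = {entropy:.3f} bits/symbol")     # 1.750
print(f"average code length = {avg_length:.3f} bits/symbol")  # 1.750, meets the bound
```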

I don't know whether neural networks actually deal with information flow in this sense; the notion of information flow here is taken from entropy, and I haven't found any paper or idea applying a conservation law to neural networks. The law of conservation of energy states that energy can neither be created nor destroyed. If a network is creating information (energy), e.g. in the case of a generative model, then some information may be lost while updating the weights. How does a neural network ensure this conservation of energy?

  • It is pretty much evident from the article. If you are using neural nets to model something difficult, they would require more weights (and so more information) than when modelling a simple distribution. Consider the case where class 1 and class 2 are very difficult to separate: the network would then need a more curvaceous decision boundary (and hence more weights and nodes) to model the difference. In information theory, too, the entropy is maximal when the two classes have the same probability (see the short sketch after these comments). This is a rough equivalence; equating information theory with thermodynamics probably requires advanced physics. –  Nov 22 '20 at 19:31
  • BTW, statistical mechanics is the field that deals with this. There is a lecture series by Leonard Susskind on this on YouTube. Also, RBMs (which grew out of Hopfield nets) have this principle of entropy, which was influenced by statistical mechanics; RBMs actually caused quite a stir among physicists due to this property of conserving energy, but the field died down pretty quickly. You can find all of this in Raul Rojas' Neural Networks, in the chapter on auto-associative memories (or something like that). –  Nov 22 '20 at 19:37
  • Please, you should still focus on a more specific question; here you're raising several non-trivial issues. We could start from the question "Is there an information-theoretic view of neural networks?", to which you can find the answer [here](https://ai.stackexchange.com/a/20574/2444). Your question about the "conservation of energy" is apparently based on the assumption that there's information in the NNs, but you don't even say what that means, so your post could require a very long answer. So, again, you should probably start by reading that answer, then formulate your question again. – nbro Nov 22 '20 at 19:47
  • I personally find this to be an interesting question because the cost of computation is extremely important (as is waste heat in processing). – DukeZhou Dec 03 '20 at 01:36
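To make the first comment's point about maximal entropy concrete, here is a minimal sketch (my own illustration, not taken from the comments): the entropy of a two-class distribution peaks when the classes are equally likely, i.e. when the prior alone tells you nothing about which class you will see.

```python
import math

def binary_entropy(p):
    """Entropy in bits of a two-class distribution (p, 1 - p)."""
    if p in (0.0, 1.0):
        return 0.0
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

for p in (0.5, 0.7, 0.9, 0.99):
    print(f"P(class 1) = {p:.2f} -> entropy = {binary_entropy(p):.3f} bits")
# 0.50 -> 1.000 bits (maximum: the prior tells you nothing about the class)
# 0.99 -> 0.081 bits (nearly deterministic, very little uncertainty)
```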

0 Answers