Communication requires energy, and using energy requires communication. According to Shannon's source coding theorem, the entropy of an information source sets an absolute lower bound on the shortest possible average length of a message that can be transmitted without losing information. (https://towardsdatascience.com/entropy-the-pillar-of-both-thermodynamics-and-information-theory-138d6e4872fa)
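For concreteness, here is a minimal sketch (plain Python, with made-up probabilities for a 4-symbol source) showing what that bound means: the average codeword length of a Huffman code is never below the entropy H, and stays within one bit of it:

```python
import heapq
import math

def shannon_entropy(probs):
    """Entropy in bits: H = -sum p * log2(p)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def huffman_code_lengths(probs):
    """Return the codeword length of each symbol under a Huffman code."""
    # Heap items: (probability, unique tiebreaker, symbol indices in this subtree)
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    counter = len(probs)  # tiebreaker keeps tuple comparisons well-defined
    while len(heap) > 1:
        p1, _, syms1 = heapq.heappop(heap)
        p2, _, syms2 = heapq.heappop(heap)
        for s in syms1 + syms2:
            lengths[s] += 1  # each merge adds one bit to every member's codeword
        heapq.heappush(heap, (p1 + p2, counter, syms1 + syms2))
        counter += 1
    return lengths

probs = [0.5, 0.25, 0.15, 0.10]  # hypothetical source distribution
H = shannon_entropy(probs)
avg_len = sum(p * l for p, l in zip(probs, huffman_code_lengths(probs)))
print(f"entropy H              = {H:.3f} bits/symbol")  # ~1.742, the lower bound
print(f"Huffman average length = {avg_len:.3f} bits")   # 1.750, so H <= avg < H + 1
```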
I don't know whether neural networks actually deal with information flow in this sense. The notion of information flow here comes from entropy, and I haven't found any paper or idea that applies a conservation law to neural networks. The law of conservation of energy states that energy can neither be created nor destroyed. If a network appears to create information (treated as energy), for example in a generative model, then presumably some information must be lost elsewhere, e.g. while updating the weights. How do neural networks ensure this kind of conservation?
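To make the question concrete, here is a rough sketch of what I mean by information being conserved or lost at a single layer (a hypothetical setup, not taken from any paper): if a layer is a deterministic function, the entropy of its output distribution can never exceed the entropy of its input distribution.

```python
from collections import defaultdict
import math

def entropy(dist):
    """Shannon entropy in bits of a dict {outcome: probability}."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

def push_forward(dist, f):
    """Distribution of f(X) when X ~ dist, for a deterministic f."""
    out = defaultdict(float)
    for x, p in dist.items():
        out[f(x)] += p
    return dict(out)

# Hypothetical input: 3 bits, uniform, so H(X) = 3 bits.
inputs = {x: 1 / 8 for x in range(8)}

# A toy deterministic "layer": a hard threshold (sign-like nonlinearity).
layer = lambda x: 1 if x >= 4 else 0

outputs = push_forward(inputs, layer)
print(f"H(input)  = {entropy(inputs):.3f} bits")   # 3.000
print(f"H(output) = {entropy(outputs):.3f} bits")  # 1.000: information lost, not created
```

Under this framing a deterministic layer can only destroy information, never create it, so whatever a generative model appears to add must come from somewhere else (e.g. its noise input). If this framing itself is wrong, that is part of what I am asking.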