OK, can someone please just stop the philosophy and give numerical predictions of how entropy helps you predict the future?
The original notion of entropy, and the first one you should study, is the Clausius entropy.
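To give at least one numerical prediction of the kind asked for above (the numbers here are made up purely for illustration): let a quantity of heat $Q = 100\ \text{J}$ flow from a hot reservoir at $T_H = 400\ \text{K}$ to a cold one at $T_C = 300\ \text{K}$. Clausius defines the entropy change of a body absorbing heat reversibly at temperature $T$ as $\Delta S = \int \delta Q_{\text{rev}} / T$, so for the two reservoirs together:
$$\Delta S = \frac{Q}{T_C} - \frac{Q}{T_H} = \frac{100}{300} - \frac{100}{400} \approx +0.083\ \text{J/K}$$
Since $\Delta S > 0$, the second law says this direction of heat flow can happen spontaneously. Run the same numbers with the flow reversed and you get $\Delta S \approx -0.083\ \text{J/K} < 0$, which is the quantitative prediction that heat will never spontaneously flow from the 300 K body to the 400 K one.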
For entropy in chemistry see: entropy of a chemical reaction.
The Biggest Ideas in the Universe | 20. Entropy and Information by Sean Carroll (2020)
Source. In usual Sean Carroll fashion, it glosses over the subject, but this one might be worth watching. It mentions 4 possible definitions of entropy: Boltzmann, Gibbs, Shannon (information theory) and John von Neumann (quantum mechanics); the standard formulas behind those names are sketched below.
- www.quantamagazine.org/what-is-entropy-a-measure-of-just-how-little-we-really-know-20241213/ "What Is Entropy? A Measure of Just How Little We Really Know" on Quanta Magazine attempts to make the point that entropy is observer dependent. TODO details on that.
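For reference, the standard textbook formulas usually attached to those four names (my summary, not necessarily the notation the video uses):
$$S_{\text{Boltzmann}} = k_B \ln \Omega \qquad S_{\text{Gibbs}} = -k_B \sum_i p_i \ln p_i$$
$$H_{\text{Shannon}} = -\sum_i p_i \log_2 p_i \qquad S_{\text{von Neumann}} = -\mathrm{Tr}(\rho \ln \rho)$$
Here $\Omega$ is the number of microstates compatible with the observed macrostate, $p_i$ is the probability assigned to microstate $i$, and $\rho$ is the density matrix. The Gibbs form reduces to the Boltzmann one when all $\Omega$ microstates are equally likely ($p_i = 1/\Omega$), and the Shannon form is the same sum with $k_B$ dropped and a base-2 logarithm. That the $p_i$ encode what the observer knows about the system is exactly the hook for the "entropy is observer dependent" point above.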