= Statistical physics
{wiki}

= Statistical mechanics
{parent=Statistical physics}
{wiki}

= Kinetic theory of gases
{parent=Statistical mechanics}
{wiki}

Theory that gases are made up of a bunch of small billiard balls that interact only through elastic collisions.

This theory attempts to deduce/explain macroscopic properties of matter, such as the ideal gas law, in terms of classical mechanics.

= Sedimentation
{parent=Statistical physics}
{wiki}

= Maxwell-Boltzmann vs Bose-Einstein vs Fermi-Diract statisics
{c}
{parent=Statistical physics}

A good conceptual starting point is the following small example.

Consider a system with 2 particles and 3 states.

Remember that:
* in quantum statistics (both Bose-Einstein statistics and Fermi-Dirac statistics), particles are indistinguishable, therefore we might as well call both of them `A`, as opposed to `A` and `B` from non-quantum statistics
* in Bose-Einstein statistics, two particles may occupy the same state; in Fermi-Dirac statistics, they may not

Therefore, all the possible ways to put those two particles in three states are the following.

Maxwell-Boltzmann statistics: both A and B can go anywhere:

|| State 1
|| State 2
|| State 3

| AB
|
|

|
| AB
|

|
|
| AB

| A
| B
|

| B
| A
|

| A
|
| B

| B
|
| A

|
| A
| B

|
| B
| A

Bose-Einstein statistics: because A and B are indistinguishable, there is now only 1 possibility for the states where A and B would be in different states:

|| State 1
|| State 2
|| State 3

| AA
|
|

|
| AA
|

|
|
| AA

| A
| A
|

| A
|
| A

|
| A
| A

Fermi-Dirac statistics: now states with two particles in the same state are not possible anymore:

|| State 1
|| State 2
|| State 3

| A
| A
|

| A
|
| A

|
| A
| A

Both Bose-Einstein statistics and Fermi-Dirac statistics tend to the Maxwell-Boltzmann distribution in the limit of either:
* high temperature
* low concentrations

TODO: show on formulas. TODO experimental data showing this. Please.....
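The 9 / 6 / 3 counting for the 2-particle, 3-state example can be reproduced by brute-force enumeration, e.g. with this short Python sketch (not from the original text, just an illustration):

```python
from itertools import product

n_states = 3

# Maxwell-Boltzmann: distinguishable particles A and B, any occupancy,
# so a configuration is an ordered pair (state of A, state of B).
mb = list(product(range(n_states), repeat=2))

# Bose-Einstein: indistinguishable particles, any occupancy:
# a configuration is just the sorted multiset of occupied states.
be = {tuple(sorted(c)) for c in mb}

# Fermi-Dirac: indistinguishable particles, at most one per state.
fd = {c for c in be if c[0] != c[1]}

print(len(mb), len(be), len(fd))  # prints: 9 6 3
```

Changing `n_states` or the number of particles generalizes the same counting.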
= Maxwell-Boltzmann distribution
{c}
{parent=Maxwell-Boltzmann vs Bose-Einstein vs Fermi-Diract statisics}
{title2=MB distribution}
{wiki=Maxwell–Boltzmann_distribution}

= Maxwell-Boltzmann statistics
{c}
{parent=Maxwell-Boltzmann distribution}
{wiki}

= Experimental verification of the Maxwell-Boltzmann distribution
{parent=Maxwell-Boltzmann distribution}

Most experiments confirm the theory, but don't give a very direct proof of the shape of its curve. Here we will try to gather some that do.

= Zartman Ko experiment
{c}
{parent=Experimental verification of the Maxwell-Boltzmann distribution}

Measured particle speeds directly with a rotating barrel! OMG, with such early equipment?
* https://bingweb.binghamton.edu/~suzuki/GeneralPhysNote_PDF/LN19v7.pdf
* https://chem.libretexts.org/Bookshelves/Physical_and_Theoretical_Chemistry_Textbook_Maps/Book%3A_Thermodynamics_and_Chemical_Equilibrium_(Ellgen)/04%3A_The_Distribution_of_Gas_Velocities/4.07%3A_Experimental_Test_of_the_Maxwell-Boltzmann_Probability_Density

= Stern-Zartman experiment
{c}
{parent=Zartman Ko experiment}
{title2=1920}

Is it the same as the Zartman Ko experiment? TODO find the relevant papers.
* https://encyclopedia2.thefreedictionary.com/Stern-Zartman+Experiment

= Application of the Maxwell-Boltzmann distribution
{parent=Experimental verification of the Maxwell-Boltzmann distribution}

= Applications of the Maxwell-Boltzmann distribution
{synonym}

https://edisciplinas.usp.br/pluginfile.php/48089/course/section/16461/qsp_chapter7-boltzman.pdf mentions some applications, notably that the distribution calculates how likely it is for particles to overcome the activation energy of a reaction.

= Quantum statistics
{parent=Maxwell-Boltzmann vs Bose-Einstein vs Fermi-Diract statisics}
{{wiki=Particle_statistics#Quantum_statistics}}

= Bose-Einstein statistics
{c}
{parent=Quantum statistics}
{title2=BE statistics}
{wiki=Bose–Einstein_statistics}

Start by looking at: the section on Maxwell-Boltzmann vs Bose-Einstein vs Fermi-Dirac statistics.

= Fermi-Dirac statistics
{c}
{parent=Quantum statistics}
{title2=FD statistics}
{wiki=Fermi–Dirac_statistics}

Start by looking at: the section on Maxwell-Boltzmann vs Bose-Einstein vs Fermi-Dirac statistics.
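A quick numeric sketch of how both quantum statistics approach Maxwell-Boltzmann in the dilute/high-temperature limit (standard occupancy formulas, not specific to this text): the mean occupation numbers differ only by a $\pm 1$ in the denominator, which becomes irrelevant once $e^{(E - \mu)/k_B T} \gg 1$:

```python
import math

def occupancy(x, kind):
    """Mean occupation number as a function of x = (E - mu) / (k_B * T)."""
    if kind == "MB":
        return math.exp(-x)                 # Maxwell-Boltzmann
    if kind == "BE":
        return 1.0 / (math.exp(x) - 1.0)    # Bose-Einstein
    if kind == "FD":
        return 1.0 / (math.exp(x) + 1.0)    # Fermi-Dirac
    raise ValueError(kind)

# As x grows (high temperature / low concentration), BE and FD
# both converge to the MB value:
for x in (0.5, 2.0, 10.0):
    mb, be, fd = (occupancy(x, k) for k in ("MB", "BE", "FD"))
    print(f"x={x:5}: MB={mb:.6f} BE={be:.6f} FD={fd:.6f}")
```

At `x = 10` the three values agree to about one part in ten thousand, while at `x = 0.5` they are wildly different.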
= Quantum statistical mechanics
{parent=Fermi-Dirac statistics}
{wiki}

Bibliography:
* https://stanford.edu/~jeffjar/statmech/lec3.html

= Thermodynamics
{parent=Statistical physics}
{wiki}

= Boltzmann constant
{c}
{parent=Thermodynamics}
{title2=$k_B$, $1.38×10^{-23}$}
{wiki}

This is not a truly "fundamental" constant of nature like say the speed of light or the Planck constant. Rather, it is just a definition of our temperature scale, linking average microscopic energy to our macroscopic temperature scale.

The way to think about that link is, at 1 K, each particle has average energy:
$$\frac{1}{2} k_B T$$
per degree of freedom.

This is why the units of the Boltzmann constant are joules per kelvin.

For an ideal monatomic gas, say helium, there are 3 degrees of freedom, so each helium atom has average energy:
$$\frac{3}{2} k_B T$$

If we have 2 helium atoms at 1 K, they will have average energy $6/2 \; k_B$ J, and so on.

Another conclusion is that this defines temperature as being proportional to the total energy. E.g. if we had 1 helium atom at 2 K then we would have about $6/2 \; k_B$ J of energy, at 3 K $9/2 \; k_B$ J, and so on.

This energy is of course just an average: some particles have more, and others less, following the Maxwell-Boltzmann distribution.

= Equipartition theorem
{parent=Thermodynamics}
{wiki}

= Thermodynamic potential
{parent=Thermodynamics}
{wiki}

https://chemistry.stackexchange.com/questions/7696/how-do-i-distinguish-between-internal-energy-and-enthalpy/7700#7700 has a good insight:
\Q[To summarize, internal energy and enthalpy are used to estimate the thermodynamic potential of the system. There are other such estimates, like the Gibbs free energy G. Which one you choose is determined by the conditions and how easy it is to determine pressure and volume changes.]

= Enthalpy
{parent=Thermodynamic potential}
{title2=$H$}
{wiki}

Adds up the internal energy and the pressure-volume term: $H = U + pV$. Wikipedia mentions however that the pressure-volume term is often small, even for gases.
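Since enthalpy is $H = U + pV$, a rough sanity check on the size of the $pV$ term (illustrative numbers, not from the original text): for one mole of ideal gas at room temperature it is only a few kJ, versus the hundreds of kJ/mol of a typical combustion enthalpy:

```python
R = 8.314    # J/(mol*K), ideal gas constant
T = 298.15   # K, room temperature

# For one mole of ideal gas, pV = nRT = RT:
pV = R * T
print(f"pV term: {pV:.0f} J/mol = {pV / 1000:.2f} kJ/mol")

# Compare with a typical reaction enthalpy, e.g. methane combustion,
# roughly -890 kJ/mol (standard textbook value):
delta_H = -890e3  # J/mol
print(f"|pV| / |delta_H| = {abs(pV / delta_H):.3%}")
```

So the $pV$ contribution is well under 1% of the reaction enthalpy in this case, which is the sense in which it is "often small".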
The sum is of interest when thinking about reactions because chemical reactions can change the number of molecules involved, and therefore the pressure.

To predict if a reaction is spontaneous or not, negative enthalpy is not enough: we must also consider entropy, via the Gibbs free energy.

Bibliography:
* https://chemistry.stackexchange.com/questions/7696/how-do-i-distinguish-between-internal-energy-and-enthalpy

= Gibbs free energy
{c}
{parent=Thermodynamic potential}
{title2=$G$}
{wiki}

TODO understand more intuitively how that determines if a reaction happens or not.
$$\Delta G = \Delta H - T \Delta S$$
At least from the formula we see that:
* the more exothermic, the more likely the reaction is to occur
* if the entropy increases, the higher the temperature, the more likely the reaction is to occur
* otherwise, the lower the temperature, the more likely the reaction is to occur

A prototypical example is combustion: it is exothermic and thermodynamically favorable at ordinary temperatures, and yet it does not start without an initial supply of activation energy, because $\Delta G$ says nothing about reaction rates.

\Video[https://www.youtube.com/watch?v=DKiBA35Nqp4]
{title=Lab 7 - Gibbs Free Energy by MJ Billman (2020)}
{description=Shows the shift of equilibrium due to temperature change with a color change in an HCl + CoCl2 reaction. Unfortunately there are no conclusions because it's a student's homework.}

= Chemical equilibrium
{c}
{parent=Gibbs free energy}
{wiki}

= Reversible reaction
{c}
{parent=Gibbs free energy}
{wiki}

I think these are the ones where $\Delta H \times \Delta S > 0$, i.e. enthalpy and entropy push the reaction in different directions. And so we can use temperature to move the equilibrium back and forward.
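When $\Delta H$ and $\Delta S$ have the same sign, the sign of $\Delta G = \Delta H - T \Delta S$ flips at the crossover temperature $T = \Delta H / \Delta S$. A numeric sketch (standard textbook values for ice melting, not from the original text):

```python
def delta_g(delta_h, delta_s, temperature):
    """Gibbs free energy change, dG = dH - T * dS, in J/mol."""
    return delta_h - temperature * delta_s

# Ice -> water: endothermic and entropy-increasing (textbook values).
dH = 6.01e3  # J/mol
dS = 22.0    # J/(mol*K)

crossover = dH / dS  # temperature where dG = 0
print(f"crossover: {crossover:.0f} K")  # ~273 K, i.e. 0 degrees C

for T in (250.0, 300.0):
    print(f"T={T} K: dG = {delta_g(dH, dS, T):.0f} J/mol")
```

Below the crossover $\Delta G > 0$ (ice does not melt), above it $\Delta G < 0$ (ice melts), which is exactly the "use temperature to move the reaction back and forward" idea.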
\Video[https://www.youtube.com/watch?v=NMIoon-kuQ4]
{title=Demonstration of a Reversible Reaction by Rugby School Chemistry (2020)}
{description=Hydrated copper(II) sulfate.}

= Equation of state
{parent=Thermodynamics}
{wiki}

= Ideal gas law
{parent=Equation of state}
{wiki}

= Monatomic gas
{parent=Ideal gas law}
{wiki}

= Entropy
{parent=Thermodynamics}
{wiki}

OK, can someone please just stop the philosophy and give numerical predictions of how entropy helps you predict the future?

The original notion of entropy, and the first one you should study, is the Clausius entropy.

For entropy in chemistry, see the child section on the subject.

* https://www.youtube.com/watch?v=0-yhZFDxBh8 The Unexpected Side of Entropy by Daan Frenkel (2021)

\Video[https://www.youtube.com/watch?v=rBPPOI5UIe0]
{title=The Biggest Ideas in the Universe | 20. Entropy and Information by Sean Carroll (2020)}
{description=In usual Sean Carroll fashion, it glosses over the subject. This one might be worth watching. It mentions 4 possible definitions of entropy: Boltzmann, Gibbs, Shannon (information theory) and von Neumann (quantum mechanics).}

= Clausius entropy
{c}
{parent=Entropy}
{wiki=Entropy_(classical_thermodynamics)}

= Carnot cycle
{c}
{parent=Clausius entropy}
{wiki}

TODO why it is optimal: https://physics.stackexchange.com/questions/149214/why-is-the-carnot-engine-the-most-efficient

= Second law of thermodynamics
{parent=Entropy}
{wiki}

= Second law
{synonym}

Chapter 4 "Entropy and Probability" mentions well how Boltzmann first thought that the second law was an actual base physical law of the universe while he was calculating numerical stuff for it, including as late as 1872. But then he saw an argument by Loschmidt that given the