Some thermodynamics derivations

This note lays out a tiny bit of thermodynamics foundation.

Here "energy" can stand for any conserved quantity; a definition of temperature follows from considering thermal equilibrium.

NOTE: this is obviously not entirely rigorously defined, and by extension not rigorously derived.

Assume:

  • An ensemble has a set of states it can be in, each with equal probability.
  • Each ensemble has some value E, which can be exchanged between ensembles, but whose sum is constant.
  • When two ensembles are connected, we call it thermal equilibrium when E is distributed between them so that the combined number of states is maximized.

Connect two independent ensembles: there is some total E = E1 + E2, and the two ensembles have numbers of available states N1(E1), N2(E2). Since they are independent, the combination has

N = N1(E1)⋅N2(E2)

And now we can just optimize that under the constraint, simply by filling in E2 = E-E1:

dN/dE1 = N1'(E1)⋅N2(E-E1) - N1(E1)⋅N2'(E-E1) = 0

So N1'(E1)/N1(E1) = N2'(E2)/N2(E2). Let's write N = exp(S); this simplifies things, since N' = exp(S)⋅S', and the equation becomes:

exp(S1(E1))⋅S1'(E1)/exp(S1(E1)) = dS1/dE1 = dS2/dE2 ≡ 1/T

So, as is a known equation, the derivative of entropy with respect to energy is equal among ensembles in thermal equilibrium; conventionally this shared constant is written as 1/T.
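As a sanity check, here is a minimal numerical sketch. The power-law multiplicities N(E) = E^a are an assumption of mine, loosely mimicking ideal gases, and the sizes are arbitrary; it finds the E1 maximizing N1(E1)⋅N2(E-E1) and checks that dS1/dE1 ≈ dS2/dE2 there.

    import numpy as np

    # Assumed toy multiplicities: N1(E) = E**a1, N2(E) = E**a2, so that
    # S(E) = a*log(E) and dS/dE = a/E.
    a1, a2 = 3.0, 5.0
    E_total = 10.0

    # Scan splits E1 + E2 = E_total, maximizing log N = S1(E1) + S2(E2).
    E1 = np.linspace(0.01, E_total - 0.01, 100_000)
    S_combined = a1 * np.log(E1) + a2 * np.log(E_total - E1)
    E1_star = E1[np.argmax(S_combined)]
    E2_star = E_total - E1_star

    # At the maximum the derivatives dS/dE agree: the shared constant.
    print(a1 / E1_star, a2 / E2_star)  # both ≈ 0.8 = (a1 + a2) / E_total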

We could call it (inverse) temperature, but maybe we shouldn't! After all, we have specified nothing about E other than that it is conserved. Each extensive sum-conserved value of an ensemble, like E, has an intensive conjugate constant like 1/T.

For instance, with the number of particles in the role of "E", the chemical potential enters as -μ/T in the role of "1/T". The division by temperature is probably largely incidental; it likely shows up because these constants are derivatives of entropy, and the internal-energy relation dE = T⋅dS + μ⋅dN rearranges to dS = (1/T)⋅dE - (μ/T)⋅dN.
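The same maximization can be run with particle number as the conserved quantity. A small sketch under an assumed toy model (n particles on M lattice sites with multiplicity C(M, n); the model and sizes are mine, not from the note):

    from math import comb, log

    # Two systems exchanging particles; multiplicity of n particles on
    # M sites is C(M, n), so S(n) = log C(M, n).
    M1, M2, N_total = 400, 600, 300

    def S1(n): return log(comb(M1, n))
    def S2(n): return log(comb(M2, n))

    # Split maximizing the combined number of states C(M1,n)*C(M2,N-n).
    n_star = max(range(1, N_total), key=lambda n: S1(n) + S2(N_total - n))

    # Central-difference dS/dN on each side: approximately equal at the
    # maximum; conventionally this shared constant is -mu/T.
    dS1 = (S1(n_star + 1) - S1(n_star - 1)) / 2
    dS2 = (S2(N_total - n_star + 1) - S2(N_total - n_star - 1)) / 2
    print(n_star, dS1, dS2)  # n_star ≈ 120, dS1 ≈ dS2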

Some notes on the assumptions.

Really, the optimization maximizing the number of possibilities is a maximum-likelihood statement. There is thermal noise around the maximum, and there might be awkward S(E) functions that put a lot of probability away from it.

Also, things are discrete, which limits the possibilities for redistributing E. Awkward cases might frustrate the derivative.
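Both caveats can be poked at numerically. A tiny sketch under an assumed discrete toy model (two systems of M two-level spins sharing M energy quanta, multiplicity C(M, e)): it computes the exact distribution over the discrete splits and shows the relative spread shrinking like 1/sqrt(M), so the maximum-likelihood picture improves with system size.

    from math import comb

    for M in (10, 100, 1000):
        E_total = M  # total quanta shared by the two M-spin systems
        weights = [comb(M, e) * comb(M, E_total - e) for e in range(E_total + 1)]
        total = sum(weights)
        probs = [w / total for w in weights]  # exact integer ratios -> floats
        mean = sum(e * p for e, p in enumerate(probs))
        var = sum((e - mean) ** 2 * p for e, p in enumerate(probs))
        print(M, mean, var ** 0.5 / M)  # relative spread ~ 1/sqrt(8M)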