As I mentioned last time, all real, macroscopic processes increase the entropy of the universe, and the way to quantify entropy is to count the number of available arrangements of matter and energy in the system under consideration. We call a particular arrangement of matter and/or energy a “microstate” and the overall matter/energy content of our system a “macrostate.” Macrostates are characterized by average, macroscopic parameters like temperature, pressure, and volume. For a given system, like a fluid in a container, there will be a number of alternative “microscopic snapshots” consistent with a set temperature, pressure, and particle content; these are the microstates. A technical way of stating the second law is that a system spontaneously evolves toward the macrostate with the largest number of available microstates. But first, some mathematical and historical background.

We use the variable “S” to represent entropy, and if we have a case simple enough that we can actually list all the possible microstates of the system, then we can calculate the entropy directly with the equation:
S = k*log(W)
This “microscopic” definition of entropy is due to Ludwig Boltzmann, and the equation is engraved on his tombstone in Vienna. The lowercase k is Boltzmann’s constant, and it gives the right-hand side the same units as entropy on the left: energy divided by temperature. I'm using an asterisk to indicate multiplication throughout this post. The log is the natural logarithm, and W is the number of microstates. The thing to notice about the equation is that since k is a constant, the entropy depends only on the variable W, the number of microstates. That makes it sound easy, but actually counting all of the microstates available to a real system is generally non-trivial. For people who like to hear the jargon, you have to “integrate over the entire phase space.” It can get messy. But before Boltzmann, entropy was only associated with the macroscopic meaning it has in classical thermodynamics.
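For a case where the counting really is easy, here is a small sketch (the toy system of four coins is my own illustration, not from the post): take the macrostate “exactly two heads out of four coins.” The microstates are the distinct head/tail arrangements consistent with that, which is just the binomial coefficient C(4, 2), and then S = k*log(W) gives the entropy directly.

```python
import math

# Boltzmann's constant in joules per kelvin (exact SI value).
K_B = 1.380649e-23

def boltzmann_entropy(num_microstates: int) -> float:
    """S = k*log(W), where log is the natural logarithm."""
    return K_B * math.log(num_microstates)

# Toy macrostate: exactly 2 heads among 4 coins.
# The microstates are the distinct arrangements: C(4, 2) of them.
w = math.comb(4, 2)
print(w)                      # 6 microstates
print(boltzmann_entropy(w))   # about 2.47e-23 J/K
```

Note that a macrostate with more heads-tails balance has more arrangements and hence more entropy: C(4, 2) = 6 beats C(4, 0) = 1, for which S = k*log(1) = 0.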