The 19th century was an exciting century for physics and engineering. For the first time in human history, we could build machines complicated enough to push the boundary of science forward, helping us test whether certain ideas about how things worked were right or wrong.
These scientific ideas could be used to develop new theories, which could then be tested by even more sophisticated equipment. Ever since then, we’ve used this virtuous cycle of innovation and discovery nonstop, creating the modern world around us.
One of the most exciting areas at the juncture of advanced physics and engineering was the growing study of hot things moving around. The invention at the very center of the industrial revolution was at the heart of this focus.
The steam engine relied on tiny, hot molecules of water flying around at very high speeds, banging into pistons (and, later, turbine blades) to generate power that could be used to transport goods or drive machines. Suffice it to say, maximizing the use of this power was all-important.
Unfortunately, steam engines were notoriously inefficient, and given the tremendous investments going into them, this was a very important problem to solve. The amount of money being thrown at steam power was something like today's AI gold rush or 1999's dot-com mania, only it lasted ten times longer.
A name was needed for this new field that studied the movements of hot things, so the usual suspects, Greek and Latin, were explored first. Greek was the winner in this round, as thermodynamics was ultimately chosen.
Now, thermo- comes from a word that ultimately means heat or hot. You might think that dynamics means something like movement, giving me the title of today’s piece.
I’m sad to report two things about this title, Hot Movements. First, this is not about poop. If you were hoping for those sorts of movements, I have you covered here.
Okay, maybe covered isn’t the best choice of words.
Second, dynamis—the Greek word from which dynamics derives—doesn’t really mean movements, exactly. It’s better translated as force or power. Perhaps this is understandable, since steam engines were awe-inspiring machines that could do the work of dozens or hundreds of humans. Certainly in Victorian England, where the study was named, the power of steam engines was on everyone’s mind.
Today, we have a solid grasp of what we call the laws of thermodynamics. These principles are still in common use, largely because it would be unwieldy to try to measure every individual particle in a system, even if you had the technical ability to do that one at a time. Instead, thermodynamics describes the whole system in action.
These laws describe what happens in a closed system, starting with the first law: the fundamental idea that energy can't be created or destroyed. When we're talking about steam engines, we're thinking about energy lost as waste heat. If you measure all the energy in the fuel going in and you get far less useful work out, the difference hasn't vanished; it has leaked away as heat you couldn't put to use.
Imagine investing an enormous sum in a steamship, then discovering that a third of your investment simply burned away for no productive reason, and you have some idea of why this science was so important for so long.
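To make that concrete, here's a quick back-of-the-envelope sketch in Python. The figures are made up purely for illustration, not drawn from any historical engine:

```python
# Back-of-the-envelope sketch of the first law applied to an engine.
# All figures below are made-up, illustrative numbers, not historical data.

fuel_energy_in_mj = 100.0    # chemical energy in the coal burned (MJ)
useful_work_out_mj = 8.0     # mechanical work actually delivered (MJ)

# First law: energy in = useful work out + waste heat out.
# Nothing is created or destroyed, so whatever isn't work must be heat.
waste_heat_mj = fuel_energy_in_mj - useful_work_out_mj

efficiency = useful_work_out_mj / fuel_energy_in_mj

print(f"Efficiency: {efficiency:.0%}")        # -> Efficiency: 8%
print(f"Waste heat: {waste_heat_mj:.1f} MJ")  # -> Waste heat: 92.0 MJ
```

Every megajoule in that waste-heat line is coal someone paid for and got nothing back from.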
The second law describes entropy, a measure of the amount of disorder in a system. The short explanation here is that there are vastly more disordered states than ordered states in nature, and hot movements—sorry, hot dynamics—ultimately cause the proverbial dice to roll over and over again, ending up with more disorder than order.
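If you want a feel for just how lopsided that count is, here's a tiny Python toy of my own, with coins standing in for particles (nothing more rigorous than that):

```python
# Toy illustration of "vastly more disordered states than ordered states."
# Coins stand in for particles; heads/tails stands in for a particle's state.
from math import comb

n_coins = 100

# Exactly one arrangement is "all heads" (a perfectly ordered state).
ordered_arrangements = comb(n_coins, n_coins)   # = 1

# The 50/50 heads-tails split (maximally mixed) has an enormous count.
mixed_arrangements = comb(n_coins, n_coins // 2)

print(f"All heads:   {ordered_arrangements} arrangement")
print(f"50/50 split: {mixed_arrangements:.3e} arrangements")
# Shuffle the coins at random (the dice rolling over and over) and you will
# essentially always land in a mixed-looking state, never the ordered one.
```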
While the deeper explanation of what’s happening with entropy and the arrow of time is what fascinates me today, what fascinated investors nearly two centuries ago was not burning up so much coal.
Interestingly, what’s driving some of the bleeding-edge technological innovations today is that same drive for greater efficiency, only this time it’s silicon wafers instead of steel ships, and it’s information and not coal that we are trying to preserve.
There’s a much deeper story there, about how information is the currency of the day instead of energy. I’ll do my best to share that story with you soon.
Entropy is a favorite topic of SF writers, simply because its central idea of "disorder" applies just as well to non-scientific settings as to scientific ones.
Compare the money once thrown at thermodynamics to the energy now being secured for AI, through small modular reactors and contracts like Microsoft's deal to restart Three Mile Island. Will the information being produced end up magnifying the energy needed to turn all that new data into products?