The hum of ENIAC was a familiar background noise in the vast room, almost comforting to the technicians who worked on this titanic machine.
Then the reassuring hum was interrupted by an abrupt silence, followed by an unmistakable pop.
The technician’s heart sank. A vacuum tube had burned out again. She took a deep breath, knowing that each failure meant hours of meticulous work ahead. There were more than 17,000 of these tubes.
She glanced at the area of the computer where she thought the sound had come from, the multiplier unit. It was one of the more temperamental sections, with a history of tube failures.
Hours seemed to pass as her patience waned, but finally, there it was: a tube with darkened glass. This was the culprit, the tube that had brought the machine's important work to a halt.
A wave of relief washed over her. She carefully removed the damaged tube, its glass still warm to the touch, and replaced it with a new one.
ENIAC Genesis
In the 1940s, the world was in flux. Technological advancement wasn’t just a matter of progress or a mark of pride, but a matter of survival, as the nations of the world tried to kill one another.
World War II was a war of computation and calculation as much as it was of boots on the ground and bravado. Artillery trajectory calculations in particular were essential any time you wanted to launch shells at the enemy, and these took hours—sometimes days—due to the tedious manual nature of the calculations.
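To get a feel for the arithmetic those human "computers" were grinding through, here's a minimal sketch in modern Python of the step-by-step numerical integration a single trajectory involves. Every number and name in it is illustrative (a made-up drag constant, step size, and launch values), not anything from a real firing table:

```python
import math

G = 9.81     # gravity, m/s^2
K = 0.0001   # made-up drag constant, 1/m
DT = 0.01    # time step, seconds

def trajectory_range(speed, angle_deg):
    """Integrate a drag-affected shell path until it returns to the ground."""
    vx = speed * math.cos(math.radians(angle_deg))
    vy = speed * math.sin(math.radians(angle_deg))
    x = y = 0.0
    while y >= 0.0:
        v = math.hypot(vx, vy)
        vx -= K * v * vx * DT          # drag opposes horizontal motion
        vy -= (G + K * v * vy) * DT    # gravity plus drag vertically
        x += vx * DT
        y += vy * DT
    return x  # downrange distance in meters

print(f"Range: {trajectory_range(450.0, 45.0):,.0f} m")
```

A laptop finishes this loop in milliseconds, but each pass through it is a handful of multiplications and additions, and one trajectory needs thousands of passes. Doing that by hand, for every combination of charge, elevation, and weather in a firing table, is why the work took hours or days.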
The United States wanted every edge it could get in the war, and so the physicist John Mauchly and the electrical engineer J. Presper Eckert were tapped at the University of Pennsylvania to create something new to the world: a computer that wasn't mechanical, one that didn't rely on gears turning and physical components moving around in order to give an answer.
This was a leap upward, akin to the shift from carrier pigeons to the telegraph. While the mechanical computers of the day managed only a few calculations per second, ENIAC could do 5,000.
Integrating thousands of vacuum tubes into a massive 1,800-square-foot machine, considerably bigger than my house, was no small feat. Nothing like this had ever been attempted.
Fortunately, after a relentless candle-burning sprint, ENIAC was completed. Unfortunately, the war was over by then.
These Tubes Didn’t Come Out of a Vacuum
ENIAC certainly stood on the shoulders of giants.
The world of 1940s computing was akin to a bustling metropolis, where each invention and innovation was a building block, laying the foundation for the next. I mentioned mechanical computers; the Harvard Mark I stands out. The Mark I was electromechanical, meaning its mechanical components were driven and controlled by electrical ones.
Rotating shafts, gears, and wheels that physically moved during operations were the stock-in-trade of this beast. In order to add, one set of gears might turn another set a specific number of times to represent the numbers being added together. The movement of these mechanical parts would, in turn, produce a result.
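As a loose software analogy (my own illustration, not a description of the Mark I's real engineering), you can picture each counter wheel as a decimal digit that nudges its neighbor whenever it rolls past 9:

```python
def gear_add(wheels, amount):
    """Toy model of decimal counter wheels, least-significant digit first.

    Advancing a wheel past 9 rolls it back to 0 and turns the next
    wheel once, the way a carry ripples through a mechanical register.
    Assumes the list has enough wheels to absorb the final carry.
    """
    for _ in range(amount):
        i = 0
        wheels[i] += 1
        while wheels[i] == 10:   # the wheel rolls over...
            wheels[i] = 0
            i += 1
            wheels[i] += 1       # ...and turns its neighbor one notch
    return wheels

# 0297 + 5 = 0302, with digits stored least-significant first
print(gear_add([7, 9, 2, 0], 5))  # [2, 0, 3, 0]
```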
All this gear turning meant the Mark I managed only about three additions per second, a far cry from ENIAC's 5,000.
Meanwhile, Vannevar Bush was going down a very different path in another part of Massachusetts. Bush had built a fully analog machine, one that computed with rotating shafts and wheel-and-disc integrators rather than with electrical switching. While its scope was limited, its influence was vast. Bush developed his Differential Analyzer during the 1920s, and by 1935 it was ready to rock. It remained arguably the most powerful computing machine in the world until ENIAC surpassed it.
Bush’s own ideas stood on the shoulders of a giant: Charles Babbage. Babbage was far ahead of his time, having designed a mechanical calculating machine in the 1820s that he called the Difference Engine. “Designed” is the key word here, since the manufacturing technology of his day simply couldn't build it… but London's Science Museum did build a Difference Engine in 1991, and it worked.
Clearly, ENIAC was a crowning achievement in this unbroken line, from Babbage to World War II. Its design and capabilities represented a leap from the analog to the digital, from the mechanical to the electronic. And in doing so, it set the stage for the computer revolution that would follow.
Liftoff
With ENIAC as the precursor, the 1950s and '60s witnessed an explosion of computer development. Transistors replaced vacuum tubes, leading to smaller, more reliable, and more efficient machines. Suddenly, there was a commercial market for computers, albeit a small one at first.
The transistor was an incredible leap forward: transistors didn't burn out the way tubes did, and they switched much, much faster. Gradually, they became smaller and smaller. Robert Noyce, Gordon Moore, and six colleagues left William Shockley's company to found Fairchild Semiconductor, and one of the so-called Fairchildren was Intel.
Software advancements kept pace with hardware. The concept of "stored programs" became more refined, leading to the development of early programming languages like FORTRAN and COBOL. The legacy of Ada Lovelace continued as computers became capable of more and more things, gradually transforming them from specialized calculators to general-purpose machines.
As decades passed, personal computers entered homes, democratizing access to technology. The 1980s saw giants like Apple and Microsoft rise, introducing user-friendly interfaces and operating systems. The spread of the internet and the World Wide Web in the 1990s connected these machines globally, setting the stage for Web 2.0 and the rise of social media.
Today, the legacy of ENIAC is everywhere.
Your smartphone is a digital computer, and ENIAC was the first of its kind. Of course, the phone in your pocket is capable of performing billions of calculations per second. Supercomputers today, and the cloud computing networks that link machines together, are vastly more powerful than anything Eckert and Mauchly likely thought possible.
While the machines of today are worlds apart from the room-sized behemoth that was ENIAC, the spirit of innovation, the quest for efficiency, and the dream of expanding human potential remain constants.
In the span of just a few decades, we've journeyed from the hum of vacuum tubes in a vast room to the silent, immense power of cloud computing and artificial intelligence. As far as we’ve come, though, it should be crystal clear that the journey that began with ENIAC is far from over.
Comments

Nice historical contextualization!
When I was in college in 1977 we used to shoot the shit with our freshman chemistry prof during office hours. He'd tell stories of his work with early computers. There were something like 15,000 tubes in them, and the typical lifetime of a tube was about 15,000 hours, so the machine would go down roughly every hour (15,000 tubes ÷ 15,000-hour average lifetime ≈ one failure per hour). A tech would crawl in, replace the tube, the machine would come back up, and then it would crash again a bit later.
There was another very impressive computer called Colossus (the real one, not the sci-fi version) during WW2. It was kept secret for 30 years, so it never played a role in mainstream computer development.
https://en.m.wikipedia.org/wiki/Colossus_computer#