What will the next hundred years look like?
That’s probably something that no one person can answer. Fortunately for you, I’ve brought in a second person to help us think through this most difficult of questions, so we should be all set!
Please welcome Scoot to Goatfury Writes for the day! Scoot writes for Gibberish, and together we have had a great time speculating about future technologies over on Notes. I'll mention a few things, then pass the mic over to him for a bit, and then I'll try to draw some conclusions based on what we both think.
Here goes!
My own framework for the future is largely based on the concept that technology builds on technology, and in the grand cycle of innovation, everyone stands on the shoulders of giants. This means that new technology uses the lessons learned from the past, broadly speaking, so it doesn’t need to take as long to make the next thing in line.
Going back far enough, you can see a trend where each subsequent innovation paradigm (I’ll try to explain what that means in a bit) is shorter than the last one. This has held up surprisingly well over the entire history of humanity, going all the way back to the first tool we used for a long, long time: the hand axe.
Here’s the TL;DR: We used hand axes to build other tools, and then we used those tools to build still better ones. The better tools meant we could build new things we couldn’t have built before, and it also meant the time to come up with the next big thing kept getting shorter.
Think about how it took us about 3 million years to get through the Stone Age. Several thousand years ago, much of humanity moved on to the Bronze Age, when most tools were made of metal instead of stone. Naturally, those bronze tools could make better tools, so the Bronze Age lasted only a couple of thousand years, depending on where you look.
The Iron Age that followed produced increasingly complex tools. By the time of the Industrial Revolution, people were using the knowledge of the Scientific Revolution to create new paradigms even faster.
Consider communication: we developed visual art as early as 50,000 years ago. Writing came along more like 5,000 years back. The ability to represent things with art was a necessary precursor for writing.
Writing, in turn, was a necessary precursor for the printing press, which arrived about 500 years back and represented a new paradigm for spreading knowledge. The knowledge that writing preserved helped lead to the printing press itself.
The printing press, in turn, helped to preserve and spread knowledge like never before. Within the last 50 years, the internet has been taking over as the new paradigm for disseminating information. It makes the printing press look like a frozen snail by comparison.
An astute observer will note that the time intervals between these important paradigms kept getting shorter. That’s true today within the microcosm of computing power, most famously expressed through Moore’s Law, the idea that the number of transistors on a chip roughly doubles every two years.
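To make that doubling concrete, here's a minimal sketch of what Moore's Law implies over time. The baseline transistor count and time spans here are illustrative assumptions, not real chip data:

```python
# Illustrative only: project transistor counts under Moore's Law,
# assuming a clean doubling every two years from a hypothetical baseline.
def moores_law(base_count: int, years: float, doubling_period: float = 2.0) -> int:
    """Return the projected transistor count after `years` years."""
    return int(base_count * 2 ** (years / doubling_period))

# A hypothetical chip with 1 million transistors would be projected to
# pass 1 billion transistors after 20 years (ten doublings).
print(moores_law(1_000_000, 20))  # 1024000000
```

Ten doublings is a factor of 1,024, which is why exponential trends like this feel slow at first and then overwhelming.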
All of this is to say that the next hundred years will probably see more change, technologically speaking, than the previous thousand years put together.
And now, I’m gonna pass the mic to Scoot, who can do anything he likes.
Within reason.
Vladimir Lenin once said: “There are decades where nothing happens, and there are weeks where decades happen.” This is my guiding principle for technological change. Andrew is absolutely correct that technology builds on itself with compounding interest, but how does that change happen? Did humanity spend 5,000 years trying to figure out the perfect hand axe? Was there someone diligently working on the printing press for the 4,500 years between the advent of writing and the publication of the Gutenberg Bible? As Hemingway said of bankruptcy, these things happened gradually, and then suddenly.
I think the term to describe this kind of technological advancement is a “phase change.” In chemistry, a phase change happens when some stimulus causes a substance (solution, chemical, substrate, what have you; I’m not a chemist) to change phases, and it stays in that new phase until some new stimulus arrives and it changes again.
What kinds of things stimulate human technological achievement? A dominant force has been raw necessity, though how that necessity is defined has probably slackened across the millennia. Humanity first hewed the hand axe from stone because a man’s gotta eat. The printing press arose out of political and religious necessity: education had reached a critical point where people could read and people WANTED to read, and books, for the first time in history, needed to be produced at scale. And then there’s war: the 20th century saw more technological advancement than any previous century because of wartime necessity, which, at a human scale, is dubiously comparable to the necessity that drove the first hand axe.
So what drives necessity now?
We don’t fight wars the way we did in the 20th century; it’s all proxy wars, economic pressure, and multinational alliances. Our social circumstances are unlike anything before in human history. Our cultural circumstances are dramatically different too.
Necessity will reveal itself gradually, and then suddenly. I believe technological advances in the next hundred years will be defined by marginal gains in areas that give us some minute advantage over our neighbors. An economic innovation, a communication innovation, a transportation innovation: these will happen slowly.
When some phase change happens, and a few are possible on the hundred-year horizon, all scripts go out the window, and we will have weeks where decades happen. Economic crises will necessitate economic innovation; demographic collapse will necessitate socio-cultural innovation; political crises or even wars will necessitate political or military innovation. Technology will suddenly be developed that restores some global equilibrium and resolves whatever necessity gave rise to it.
But because our current “phase” of technology is sufficiently advanced, each new phase change accelerates new technologies faster and farther than ever. Andrew is right that the next hundred years could see more technological change than the last thousand years combined. The big question is this: what will cause that change?
This was a fun thought experiment and it was great to work with you on this! Thanks Andrew!
Who says all the innovation is going to be the kind of thing normally associated with technology, anyway? I keep unironically coming back to the question of what a human even is, because I almost see all this emphasis on technological change as an attempt to push back against a change coming from nature. I think the change coming from nature is a good thing, and the proposed technological attempts to cancel it are a bad thing. They are always, always, always paired with "humanity this, humanity that." It reminds me of an article I read here about a Justice League show that said, "bad guys save the world, good guys save people."
I'm not remotely against technology, though. I just think technology can be used for different ends, and people are trying to use it to prevent natural growth, like a lich filling a coffer with souls to maintain a state of undeath. That's not the only use for fantasy technology/magic (which, in plenty of older fiction, is literally just technology whose mechanics people don't understand). Forget phase changes; what about punctuated equilibrium? It's not as positivist a metaphor, but that's where my mind is. I also find it hard to forget that the environmental conditions for humanity have never been like they are now, in multiple ways. The obvious one is climate change, but the others are more interesting and relevant, in my opinion.
Technology isn't limited to mechanical and chemical things, but I think the ongoing changes to the natural environment are more interesting than the technology being made. That's despite, and perhaps because of, the fact that I'm trying to work on some technology myself. Next-word prediction algorithms aren't very interesting, and it's completely unsurprising when they melt down, especially when you're slowly trying to build an AI that uses generative grammar models and then run measurements based on biophysics hypotheses to see how well it lines up with human cognition, or with hypothetical other types of cognition. I guess I see technology as more of a byproduct than a driver of other changes. If that byproduct isn't there, you're badly off, but the real origin of progress is internal and mental, in my view. It's basically what I said about computing: AI is a buzzword because all computing is about intelligence; that's what distinguishes a computer from a machine like an engine, whose purpose is locomotion.