The word emergence describes when something comes to light that wasn’t known before, like the Mind Flayer in Stranger Things:
Something emerges from mist or fog, or some other substrate, and then it’s suddenly there. Maybe it was there all along.
This word—emergence—makes a great metaphor to describe any time when some unexpected complexity crops up. The key thing that defines emergence is that the component parts don’t seem to add up to all this complexity.
You can see this in the world of biology, where many of the planet’s roughly 10³⁰ individual life forms have organized into larger structures that seem to think for themselves.
Bee hives and ant colonies are more akin to individual creatures with objectives and minds than to collections of thousands of individuals. Each individual worker follows a really simple plan based on nature’s algorithm: if you see some food, let the others know; follow the one in front of you; and so on. And yet, the hive itself has objectives like building a collective defense against predators, foraging systematically, and generally operating with the efficiency of a Fortune 500 company.
The simple rules of the workers and soldiers come together to create some surprisingly complex behavior. And don’t get the idea that the queen is up there giving orders or anything like that—quite the contrary: each individual actor already knows exactly what to do, and the rules it follows are incredibly simple.
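Those simple rules really are enough. Here’s a toy simulation to illustrate the point, loosely inspired by the classic “double bridge” ant experiments (the model, names, and parameters here are my own illustrative choices, not taken from any real study): each simulated ant picks a path in proportion to the pheromone already on it, and shorter trips lay down pheromone faster. No individual ant ever compares the two paths.

```python
import random

def forage(n_ants=1000, evaporation=0.01, seed=0):
    """Each ant picks a path with probability proportional to its pheromone,
    then deposits pheromone at a rate inversely proportional to path length.
    No individual ever compares the two paths."""
    rng = random.Random(seed)
    pheromone = {"short": 1.0, "long": 1.0}   # the colony starts undecided
    length = {"short": 1.0, "long": 2.0}      # the long path takes twice as long
    for _ in range(n_ants):
        p_short = pheromone["short"] / (pheromone["short"] + pheromone["long"])
        path = "short" if rng.random() < p_short else "long"
        pheromone[path] += 1.0 / length[path]  # shorter trip => more pheromone per unit time
        for p in pheromone:                    # evaporation keeps old trails from dominating
            pheromone[p] *= 1.0 - evaporation
    return pheromone["short"] / (pheromone["short"] + pheromone["long"])

print(f"colony preference for the short path: {forage():.2f}")
```

The colony-level “decision” appears nowhere in the rules; it emerges from positive feedback across many simple individual choices.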
Biology provides lots of examples of this kind of emergence, but nowhere is it clearer than in the original jump from single-celled life to multicellular. For the first three billion years of life on Earth, all creatures consisted of a single cell. Everything was processed inside of one cellular membrane, and all the little organelles were contained in there too.
At some point less than a billion years ago, life forms with more than one cell emerged. Ever since then, there has been a growing contingent of multicellular life, including us. Single-celled life somehow organized itself into multicellular life, which then further organized itself into things like colonies and tribes and Fortune 500 companies.
The world of human behavior offers a few great examples of emergent properties. Nation-states are a relatively recent example, but people have been congregating in groups meant to represent their identity for tens of thousands of years. Tribes evolved into collectives, which eventually evolved into cities and kingdoms.
Does the individual American represent the ideals of America, for example? Hardly! We argue about everything here, and only begrudgingly concede anything to the other side, ever. And yet, here the United States is, projecting power out into the world with what appears to be one single voice (albeit occasionally schizophrenic).
We humans are now so used to organizing ourselves into new groups that we hardly even think about how new properties emerge. If you’ve ever played a sport on a team, you probably already understand that the team dynamic is considerably bigger than any one individual, and that the team can have different objectives than you (as an individual) had going in.
While we humans are subject to biological evolution and emergence, there’s a non-biological sort of emergence that’s beginning to grab headlines. Specifically, the idea that a large language model can have a conversation with you that (mostly) makes sense isn’t something anyone had on their bingo card for late 2022, but that’s what happened.
In spite of having only the explicit goal of predicting the next word of every sentence, something akin to useful intelligence has emerged from generative AI programs. Now, please note that I’m careful to say “something akin to useful intelligence.” I don’t want to muddy the waters by implying that I have a great definition of intelligence in the first place, but I will emphasize the useful portion here.
By using an ultra-simple rule (predictive text), you can now find information vastly faster than ever before. Sure, you can have the LLM tell you jokes and compose sonnets and all that, but the real value for me is getting to the bottom of things quickly, so I spend half as much time to learn twice as much stuff.
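To make “predict the next word” concrete, here’s the rule in its most primitive form—a bigram frequency table I wrote purely for illustration (real LLMs use neural networks over tokens, not lookup tables): given a word, just emit whichever word most often followed it in the training text.

```python
from collections import Counter, defaultdict

def train(text):
    """Count which word follows which: the entire 'model' is a frequency table."""
    model = defaultdict(Counter)
    words = text.split()
    for prev, nxt in zip(words, words[1:]):
        model[prev][nxt] += 1
    return model

def predict(model, word):
    """Greedy next-word prediction: return the most common follower, if any."""
    followers = model.get(word)
    return followers.most_common(1)[0][0] if followers else None

corpus = "the hive acts as one mind and the hive forages as one body"
model = train(corpus)
print(predict(model, "the"))  # → hive (it followed "the" twice in the toy corpus)
```

Scale that basic idea up by many orders of magnitude—richer context, vastly more text, learned representations instead of raw counts—and conversation falls out the other end. That gap between rule and result is the emergence.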
There are a lot of intriguing things that could be on the emergence menu. While defining intelligence is difficult, intelligence itself seems to be one of those emergent properties, arising from a set of rules that is far less complex than itself.
Similarly, consciousness isn’t something that’s well understood by anyone, anywhere. Is it enough to say that you’re aware of yourself? If so, any system that can describe or observe itself can be said to be conscious. Whatever this feeling is, though, it’s an excellent candidate for something that emerges from simpler systems.
There are physicists out there who would go so far as to argue that all of the laws of physics are emergent properties of some initial state. It’s a tantalizing idea, but I’m not fully prepared to defend such a bold claim today. What does seem to be clear, though, is that complex phenomena can arise from far simpler rules, and that this happens all the time in nature.
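Conway’s Game of Life is the canonical toy version of that claim: two rules about counting neighbors, and gliders, oscillators, and even self-replicating machines fall out. A minimal sketch (my own implementation of the standard rules):

```python
from collections import Counter

def step(cells):
    """One generation of Conway's Game of Life over a set of live (x, y) cells.
    Rules: a live cell survives with 2 or 3 live neighbors;
    a dead cell becomes alive with exactly 3."""
    counts = Counter((x + dx, y + dy)
                     for (x, y) in cells
                     for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                     if (dx, dy) != (0, 0))
    return {c for c, n in counts.items() if n == 3 or (n == 2 and c in cells)}

# A glider: five cells whose "behavior" (flying diagonally forever)
# appears nowhere in the two rules above.
glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
cells = glider
for _ in range(4):
    cells = step(cells)
print(cells == {(x + 1, y + 1) for (x, y) in glider})  # → True: same shape, shifted diagonally
```

Nothing in the neighbor-counting rules mentions motion, yet the glider travels. That, in miniature, is the relationship between simple rules and the complex phenomena they produce.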
Is emergence already on your radar? What types of emergence have you observed, and what do you think is likely to emerge in the future?
Three days ago I conducted my annual celebration of emergence.
Me, every time I start reading an Andrew article: "Ah, I am going to mention [relevant thing] about this topic in my comment."
Andrew: *mentions [relevant thing] in the article itself*
But yeah, emergence in LLMs. Good stuff.
Also, Tim Urban has a concept he calls "The Human Colossus" to refer to the emergent qualities of humanity as a whole, arising from all of us acting collectively within and across societies.