I have spent a lot of my career, both in Silicon Valley and beyond, insisting that all our technologies have histories and even pre-histories, and that far from being neat and tidy, those stories are in fact messy, contested, and conflicted, with competing narrators and meanings.
The metaverse, which graduated from a niche term to a household name in less than a year, is an excellent case in point. Its metamorphosis began in July 2021, when Facebook announced that it would dedicate the next decade to bringing the metaverse to life. In the company’s presentation of the concept, the metaverse was a thing of wonder: an immersive, rich digital world combining aspects of social media, online gaming, and augmented and virtual reality. “The defining quality of the metaverse will be a feeling of presence—like you are right there with another person or in another place,” Facebook founder Mark Zuckerberg wrote, envisioning a creation that would “reach a billion people, host hundreds of billions of dollars of digital commerce, and support jobs for millions of creators and developers.” By December 2021, a range of other large American technology companies, including Microsoft, Intel, and Qualcomm, had all articulated metaverse plans of their own. And by the time the Consumer Electronics Show rolled around in January, everyone seemed to have a metaverse angle, no matter how improbable or banal: haptic vests, including one with an air conditioner to create your own localized climate; avatar beauty makeovers; virtual delivery vans for your virtual home.
There has been plenty of discussion about the involvement of Meta (née Facebook) and its current complicated position as a social media platform with considerable purchase on our daily lives. There have also been broader conversations about what form the metaverse could or should take, in terms of technical capabilities, user experiences, business models, access, and regulation, and—more quietly—about what purpose it would serve and what needs it would fulfill.
These are good conversations to have. But we would be remiss if we didn’t take a step back to ask, not what the metaverse is or who will make it, but where it comes from—both in a literal sense and also in the ideas it embodies. Who invented it, if it was indeed invented? And what about earlier constructed, imagined, augmented, or virtual worlds? What can they tell us about how to enact the metaverse now, about its perils and its possibilities?
There is an easy seductiveness to stories that cast a technology as brand-new, or at the very least that don’t belabor long, complicated histories. Seen this way, the future is a space of reinvention and possibility, rather than something intimately connected to our present and our past. But histories are more than just backstories. They are backbones and blueprints and maps to territories that have already been traversed. Knowing the history of a technology, or the ideas it embodies, can provide better questions, reveal potential pitfalls and lessons already learned, and open a window onto the lives of those who learned them. The metaverse—which is not nearly as new as it looks—is no exception.
So where does the metaverse come from? A common answer—the clear and tidy one—is that it comes from Neal Stephenson’s 1992 science fiction novel Snow Crash, which describes a computer-generated virtual world made possible by software and a worldwide fiber-optic network. In the book’s 21st-century Los Angeles, the world is messy, replete with social inequities, sexism, racism, gated communities, surveillance, hypercapitalism, febrile megacorporations, and corrupt policing. Of course, the novel’s Metaverse is messy too. It too heaves with social inequities and hypercapitalism. Not everyone finds their way there. For those who do, the quality of their experience is determined by the caliber of their kit and their ability to afford bandwidth, electricity, and computational horsepower. Those with means can have elaborately personalized digital renderings. Others must make do with simple flat sketches, purchased off the shelf—the “Brandy” and “Clint” packages. Perhaps we shouldn’t be surprised that many who read the book saw it not just as cutting-edge science fiction but as a critique of end-stage capitalism and techno-utopian visions.
In the three decades that have passed since Snow Crash was published, many of the underpinnings of Stephenson’s virtual world, such as social networks and artificial intelligence, have materialized. And the metaverse, like other ideas foreshadowed in the cyberpunk tradition, has persistently found its way into broader conversation. It has featured in recent movies such as Ready Player One and Free Guy. And it has shaped much of the digital landscape in which we now find ourselves. However, I think there might be more to the metaverse than just Snow Crash and its current re-instantiation.
In fact, today’s conversations around the metaverse remind me a lot of the conversations we were having nearly 20 years ago about Second Life, which Philip Rosedale’s Linden Lab launched in 2003. Rosedale is very clear about the ways in which he was inspired by Snow Crash. He is also clear, however, that a trip to Burning Man in the late 1990s forever framed his thinking about virtual worlds, their inhabitants, and their ethos. Second Life was to be “a 3D online world created and owned by its users.” It was hugely successful, dominating news headlines and conversations. Companies and brands fought to establish themselves in this new domain; we had conferences and concerts in Second Life, and even church. In the early 2000s, millions of people flocked to the platform and created lives there. Anthropologists studied them; policy makers and politicians debated them. And the realities of a fully fledged virtual world quickly collided with regulation: concerns about fiat currencies, money laundering, and prostitution all surfaced.
However, I think there are even earlier histories that could inform our thinking. Before Second Life. Before virtual and augmented reality. Before the web and the internet. Before mobile phones and personal computers. Before television, and radio, and movies. Before any of that, an enormous iron and glass building arose in London’s Hyde Park. It was the summer of 1851, and the future was on display.
Arc lights and hydraulic presses (powered by a hidden steam engine), electric telegrams, a prototype fax machine, mechanical birds in artificial trees, a submarine, guns, the first life-size and lifelike sculptures of dinosaurs, Goodyear’s vulcanized rubber, Matthew Brady’s daguerreotypes, even Britain’s first flushing public toilets. There were three stories’ worth of alcoves with red bunting and signs proclaiming each display’s country of origin, spread out over 92,000 square meters of gleaming glass enclosures—the Crystal Palace, as one satirical magazine dubbed it.
The Great Exhibition of the Works of Industry of All Nations, as the extraordinary event was formally known, was the brainchild of Prince Albert, Queen Victoria’s beloved consort. It would showcase more than 100,000 exhibits from all over the world. The queen herself would attend at least 30 times. In her opening speech, she made clear her agenda: “It is my anxious desire to promote among nations the cultivation of all those arts which are fostered by peace and which in their turn contribute to maintain the peace of the world.” The age of empire may already have been in decline, but the Great Exhibition was all about asserting power and a vision for Britain’s future. And what a modern, industrialized future it would be, even if colonies all over the world would be needed to make it happen.
Of course, London was a city already full of expositions and displays, places where you could visit the wondrous and strange. Charles Babbage was partial to Merlin’s Mechanical Museum, with its many automata. Others favored dioramas of the Holy Land and Paris. The Great Exhibition was different because it had scale, and the power of empire behind it. It wasn’t just a spectacle; it was a whole world dedicated to the future: a world in which almost anyone could be immersed, educated, challenged, inspired, titillated, or provoked. It was not little bits and pieces, but one large, imposing, unavoidable statement.
In its day, the Great Exhibition had many critics. Some worried about the ancient elm trees in Hyde Park that found themselves contained in the enormous structure. Others worried about the tensile strength of all that glass. In the press, there were months of ridicule, with one politician describing it as “one of the greatest humbugs, frauds, and absurdities ever known.” In the Houses of Parliament, some questioned Prince Albert’s motives, citing his status as a foreign prince and suggesting that the Great Exhibition was just a publicity exercise to encourage and perhaps mask the rise of immigration in Britain. Still others suggested that the Great Exhibition would attract pickpockets, prostitutes, and spies, and called for 1,000 extra police to be on duty.
Unsurprisingly, the dire warnings were overblown, and for a sunny summer, people from all over Britain—taking advantage of the rapidly expanding railway network—flocked to the massive glass house in the park. The organizers set entrance fees at a shilling, which made it accessible to the British working classes. “See the world for a shilling” was a common refrain that summer.
A surprising fraction of the literary and scientific community of the day found its way to the Crystal Palace. That roll call includes Charles Dickens, Charles Dodgson (who would become Lewis Carroll), Charles Darwin, Karl Marx, Michael Faraday, Samuel Colt, Charlotte Brontë, Charles Babbage, and George Eliot. Dickens hated it: it was all just too much rampant materialism, and his most recent biographer claims that his experiences there shaped all his work thereafter. Brontë, by contrast, wrote, “It seems as if only magic could have gathered this mass of wealth from all the ends of the earth—as if none but supernatural hands could have arranged it thus, with such a blaze and contrast of colours and marvelous power of effect.” Dodgson was similarly transported when he entered the Crystal Palace. He wrote, “The impression when you get inside is of bewilderment. It looks like a sort of fairyland.”
And then, just like that, the Great Exhibition closed its doors on the 15th of October, 1851. Over its five-and-a-half-month run, an estimated 6 million people visited the Crystal Palace (at the time, the total population of Britain was only 24 million). In its short life in Hyde Park, the Great Exhibition also turned a remarkable profit of some £186,437 (more than $35 million today). Some of it went to the purchase of land in South Kensington to create London’s current museum district. Another portion underwrote an educational trust that still provides scholarships for scientific research. The Crystal Palace was disassembled in the winter of 1851 and transported to a new site, where it would continue to showcase all manner of wonders until a cataclysmic fire in 1936 reduced it to a smoldering iron skeleton. And if the fancy takes you, you can still visit the Great Exhibition today, via a virtual tour hosted on the website of the Royal Parks.
The Great Exhibition kicked off more than a century of world’s fairs—spaces of spectacle and wonder that, in turn, would shape the world around them. In America, these world-making activities included the World’s Columbian Exposition of 1893, also known as the Chicago World’s Fair—a whole city with more than 200 purpose-built structures, whitewashed and gleaming, showcasing technologies as varied as a fully electrical kitchen with dishwasher, an electric chicken incubator, a seismograph, Thomas Edison’s kinetoscope, searchlights, Morse code telegraphy, multiphase power generators, moving walkways, and the world’s first Ferris wheel. In less than six months, more than a quarter of all Americans would attend the fair.