The metaverse is a new word for an old idea

I have spent a lot of my career, both in Silicon Valley and beyond, insisting that all our technologies have histories and even pre-histories, and that far from being neat and tidy, those stories are in fact messy, contested, and conflicted, with competing narrators and meanings. 

The metaverse, which graduated from a niche term to a household name in less than a year, is an excellent case in point. Its metamorphosis began in July 2021, when Facebook announced that it would dedicate the next decade to bringing the metaverse to life. In the company’s presentation of the concept, the metaverse was a thing of wonder: an immersive, rich digital world combining aspects of social media, online gaming, and augmented and virtual reality. “The defining quality of the metaverse will be a feeling of presence—like you are right there with another person or in another place,” Facebook founder Mark Zuckerberg wrote, envisioning a creation that would “reach a billion people, host hundreds of billions of dollars of digital commerce, and support jobs for millions of creators and developers.” By December 2021, a range of other large American technology companies, including Microsoft, Intel, and Qualcomm, had all articulated metaverse plans of their own. And by the time the Consumer Electronics Show rolled around in January, everyone seemed to have a metaverse angle, no matter how improbable or banal: haptic vests, including one with an air conditioner to create your own localized climate; avatar beauty makeovers; virtual delivery vans for your virtual home. 

There has been plenty of discussion about the involvement of Meta (née Facebook) and its current complicated position as a social media platform with considerable purchase on our daily lives. There have also been broader conversations about what form the metaverse could or should take, in terms of technical capabilities, user experiences, business models, access, and regulation, and—more quietly—about what purpose it would serve and what needs it would fulfill.

These are good conversations to have. But we would be remiss if we didn’t take a step back to ask, not what the metaverse is or who will make it, but where it comes from—both in a literal sense and also in the ideas it embodies. Who invented it, if it was indeed invented? And what about earlier constructed, imagined, augmented, or virtual worlds? What can they tell us about how to enact the metaverse now, about its perils and its possibilities? 

There is an easy seductiveness to stories that cast a technology as brand-new, or at the very least that don’t belabor long, complicated histories. Seen this way, the future is a space of reinvention and possibility, rather than something intimately connected to our present and our past. But histories are more than just backstories. They are backbones and blueprints and maps to territories that have already been traversed. Knowing the history of a technology, or the ideas it embodies, can provide better questions, reveal potential pitfalls and lessons already learned, and open a window onto the lives of those who learned them. The metaverse—which is not nearly as new as it looks—is no exception. 

So where does the metaverse come from? A common answer—the clear and tidy one—is that it comes from Neal Stephenson’s 1992 science fiction novel Snow Crash, which describes a computer-generated virtual world made possible by software and a worldwide fiber-optic network. In the book’s 21st-century Los Angeles, the world is messy, replete with social inequities, sexism, racism, gated communities, surveillance, hypercapitalism, febrile megacorporations, and corrupt policing. Of course, the novel’s Metaverse is messy too. It too heaves with social inequities and hypercapitalism. Not everyone finds their way there. For those who do, the quality of their experience is determined by the caliber of their kit and their ability to afford bandwidth, electricity, and computational horsepower. Those with means can have elaborately personalized digital renderings. Others must make do with simple flat sketches, purchased off the shelf—the “Brandy” and “Clint” packages. Perhaps we shouldn’t be surprised that many who read the book saw it not just as cutting-edge science fiction but as a critique of end-stage capitalism and techno-utopian visions.

In the three decades that have passed since Snow Crash was published, many of the underpinnings of Stephenson’s virtual world, such as social networks and artificial intelligence, have materialized. And the metaverse, like other ideas foreshadowed in the cyberpunk tradition, has persistently found its way into broader conversation. It has featured in recent movies such as Ready Player One and Free Guy. And it has shaped much of the digital landscape in which we now find ourselves. However, I think there might be more to the metaverse than just Snow Crash and its current re-instantiation.

In fact, today’s conversations around the metaverse remind me a lot of the conversations we were having nearly 20 years ago about Second Life, which Philip Rosedale’s Linden Lab launched in 2003. Rosedale is very clear about the ways in which he was inspired by Snow Crash. He is also clear, however, that a trip to Burning Man in the late 1990s forever framed his thinking about virtual worlds, their inhabitants, and their ethos. Second Life was to be “a 3D online world created and owned by its users.” It was hugely successful—it dominated news headlines and conversations. Companies and brands fought to establish themselves in this new domain; we had conferences and concerts in Second Life, and even church. In the early 2000s, millions of people flocked to the platform and created lives there. Anthropologists studied them*; policy makers and politicians debated them. And the realities of a fully fledged virtual world quickly collided with regulators: concerns about fiat currencies, money laundering, and prostitution all surfaced.

However, I think there are even earlier histories that could inform our thinking. Before Second Life. Before virtual and augmented reality. Before the web and the internet. Before mobile phones and personal computers. Before television, and radio, and movies. Before any of that, an enormous iron and glass building arose in London’s Hyde Park. It was the summer of 1851, and the future was on display. 

Arc lights and hydraulic presses (powered by a hidden steam engine), electric telegraphs, a prototype fax machine, mechanical birds in artificial trees, a submarine, guns, the first life-size and lifelike sculptures of dinosaurs, Goodyear’s vulcanized rubber, Mathew Brady’s daguerreotypes, even Britain’s first flushing public toilets. There were three stories’ worth of alcoves with red bunting and signs proclaiming each display’s country of origin, spread out over 92,000 square meters of gleaming glass enclosures—the Crystal Palace, as one satirical magazine dubbed it.

The Great Exhibition of the Works of Industry of All Nations, as the extraordinary event was formally known, was the brainchild of Prince Albert, Queen Victoria’s beloved consort. It would showcase more than 100,000 exhibits from all over the world. The queen herself would attend at least 30 times. In her opening speech, she made clear her agenda: “It is my anxious desire to promote among nations the cultivation of all those arts which are fostered by peace and which in their turn contribute to maintain the peace of the world.” The age of empire may already have been in decline, but the Great Exhibition was all about asserting power and a vision for Britain’s future. And what a modern, industrialized future it would be, even if colonies all over the world would be needed to make it happen. 

Of course, London was a city already full of expositions and displays, places where you could visit the wondrous and strange. Charles Babbage was partial to Merlin’s Mechanical Museum, with its many automata. Others favored dioramas of the Holy Land and Paris. The Great Exhibition was different because it had scale, and the power of empire behind it. It wasn’t just a spectacle; it was a whole world dedicated to the future: a world in which almost anyone could be immersed, educated, challenged, inspired, titillated, or provoked. It was not little bits and pieces, but one large, imposing, unavoidable statement. 

In its day, the Great Exhibition had many critics. Some worried about the ancient elm trees in Hyde Park that found themselves contained in the enormous structure. Others worried about the tensile strength of all that glass. In the press, there were months of ridicule, with one politician describing it as “one of the greatest humbugs, frauds, and absurdities ever known.” In the Houses of Parliament, some questioned Prince Albert’s motives, citing his status as a foreign prince and suggesting that the Great Exhibition was just a publicity exercise to encourage and perhaps mask the rise of immigration in Britain. Still others suggested that the Great Exhibition would attract pickpockets, prostitutes, and spies, and called for 1,000 extra police to be on duty. 

Unsurprisingly, the dire warnings were overblown, and for a sunny summer, people from all over Britain—taking advantage of the rapidly expanding railway network—flocked to the massive glass house in the park. The organizers set entrance fees at a shilling, which made it accessible to the British working classes. “See the world for a shilling” was a common refrain that summer. 

A surprising fraction of the literary and scientific community of the day found its way to the Crystal Palace. That roll call included Charles Dickens, Charles Dodgson (who would become Lewis Carroll), Charles Darwin, Karl Marx, Michael Faraday, Samuel Colt, Charlotte Brontë, Charles Babbage, and George Eliot. Dickens hated it: it was just all too much rampant materialism, and his most recent biographer claims that his experiences there shaped all his work thereafter. Brontë, by contrast, wrote, “It seems as if only magic could have gathered this mass of wealth from all the ends of the earth—as if none but supernatural hands could have arranged it thus, with such a blaze and contrast of colours and marvelous power of effect.” Dodgson was similarly awestruck when he entered the Crystal Palace. He wrote, “The impression when you get inside is of bewilderment. It looks like a sort of fairyland.”

And then, just like that, the Great Exhibition closed its doors on the 15th of October, 1851. Over its five-and-a-half-month run, an estimated 6 million people visited the Crystal Palace (at the time, the total population of Britain was only 24 million). In its short life in Hyde Park, the Great Exhibition also turned a remarkable profit of some £186,437 (more than $35 million today). Some of it went to the purchase of land in South Kensington to create London’s current museum district. Another portion underwrote an educational trust that still provides scholarships for scientific research. The Crystal Palace was disassembled in the winter of 1851 and transported to a new site, where it would continue to showcase all manner of wonders until a cataclysmic fire in 1936 reduced it to a smoldering iron skeleton. And if the fancy takes you, you can still visit the Great Exhibition today, via a virtual tour hosted on the website of the Royal Parks.

The Great Exhibition kicked off more than a century of world’s fairs—spaces of spectacle and wonder that, in turn, would shape the world around them. In America, these world-making activities included the World’s Columbian Exposition of 1893, also known as the Chicago World’s Fair—a whole city with more than 200 purpose-built structures, whitewashed and gleaming, showcasing technologies as varied as a fully electrical kitchen with dishwasher, an electric chicken incubator, a seismograph, Thomas Edison’s kinetoscope, searchlights, Morse code telegraphy, multiphase power generators, moving walkways, and the world’s first Ferris wheel. Over one quarter of Americans would attend the World’s Fair in less than six months.

If the Great Exhibition had celebrated the power of steam, this so-called White City was all about electricity. It was also a branded landscape, supported and then aggressively promoted by American industry, with soon-to-be-familiar names like General Electric, Western Electric, and Westinghouse showcasing their technologies and their visions for the future—American democracy and American capitalism. Complicated conversations about gender and racial equality, and mythologizing of American exceptionalism and individualism, were everywhere on display. There was, for example, a building dedicated to the lives and times of American women, but not one for African-Americans, a point fiercely argued by Ida B. Wells and Frederick Douglass, who saw an opportunity to celebrate African-American accomplishments since the Emancipation Proclamation.

The White City also ushered in a new type of spectacle. At the Midway Plaisance, a mile-long stretch of park on the edge of the exposition site, you could see people on display in living dioramas, intermixed with dedicated sideshow activities, amusements, concessions, and food stalls. It was a violent, exciting mess of orientalism, exclusion, appropriation, and celebration. And it was far and away the most popular destination in the White City, generating a significant profit—$4 million in 1893 dollars, or well over $100 million today. 

The Midway would in turn inspire the creation of Coney Island in New York, and ultimately California’s Disneyland—a wholly different brand of imagined world. The influence of these kinds of events on our imaginations should not be underestimated. Just as there is a straight line from the Midway to Coney Island to Disneyland, there is a straight line from the White City to the 1939 New York World’s Fair to the Consumer Electronics Show. We can also draw a line between the Great Exhibition and today’s metaverse. Like the virtual world that the metaverse’s promoters promise, the Great Exhibition was a world within the world, full of the splendors of its day and promises about the future. But even as it opened up new spaces of possibility—and profit—it also amplified and reproduced existing power structures through its choices of exhibits and exhibitors, its reliance on the Royal Society for curation, and its constant erasure of colonial reality. All this helped ensure that the future would look remarkably British. The exhibition harnessed the power of steam and telegraphy to bring visitors to a space of new experiences, while masking the impact of such technological might; engines and pipes were hidden underground out of plain sight. It was a deliberate sleight of hand. If Brontë saw magic—not power, xenophobia, and nationalism—that was what she was intended to see.

I think our history with proto-­metaverses should make us more skeptical about any claims for the emancipatory power of technology and technology platforms. After all, each of them both encountered and reproduced various kinds of social inequities, even as they strove not to, and many created problems that their designers did not foresee. Yet this history should also let us be alive to the possibilities of wondrous, unexpected invention and innovation, and it should remind us that there will not be a singular experience of the metaverse. It will mean different things to different people, and may give rise to new ideas and ideologies. The Great Exhibition generated anxiety and wonder, and it alternately haunted and shaped a generation of thinkers and doers. I like to wonder who will author this metaverse’s Bleak House or Alice in Wonderland in response to what they encounter there. 

The Great Exhibition and its array of descendants speak to the long and complicated human history of world-making. Exploring these many histories and pre-­histories can be generative and revelatory. The metaverse will never be an end in itself. Rather, it will be many things: a space of exploration, a gateway, an inspiration, or even a refuge. Whatever it becomes, it will always be in dialogue with the world that has built it. The architects of the metaverse will need to have an eye to the world beyond the virtual. And in the 21st century, this will surely mean more than worrying about ancient elm trees and the tensile strength of glass. It will mean thinking deeply about our potential and our limitations as makers of new worlds.

Genevieve Bell is director of the School of Cybernetics at the Australian National University in Canberra.

* Two lovely ethnographic accounts of Second Life grace my shelves: Tom Boellstorff’s Coming of Age in Second Life: An Anthropologist Explores the Virtually Human (2008) and Thomas Malaby’s Making Virtual Worlds: Linden Lab and Second Life (2009). The former is an excellent account of the early years of Second Life and the ways in which people loved and loathed that virtual world; the latter focuses on the technologists who built Second Life. Both give insight into the utopian visions that underpinned Second Life, and how they were experienced by participants and builders alike.