HPC Matters is a joint blog consisting of contributors from the Tabor Communications team on their observations and insights into HPC matters.
August 28, 2008
What happens when the physical world meets the virtual world and they fall so helplessly in love with each other that they sweep into each other's arms, get married, and meld into something entirely new and different? Aside from having angry Republicans knocking down the door about the whole definition of marriage thing, what happens is the Metaverse.
As you might guess, my metaphor needs some fine tuning (probably my sense of humor too), but bad humor and analogies aside, the concept of the Metaverse is worth your attention. The Metaverse concept originated in the science fiction novel "Snow Crash" by Neal Stephenson. In the book, the Metaverse was essentially the virtual-reality realization of the Internet: a place where people would gather in a virtual reality environment and interact socially and, indeed, on a whole new cultural level.
A non-profit group called the Acceleration Studies Foundation (otherwise known as the ASF) has taken the Stephenson concept of the Metaverse and built on it based on key trends that they see emerging in technology today. As such, they see the Metaverse as "the convergence of 1) virtually-enhanced physical reality, and 2) physically persistent virtual space." In other words, the Metaverse isn't just an online phenomenon – it goes both ways. It's the bleeding of each environment (physical and virtual) into each other to create a whole new thing – what the ASF calls "the junction or nexus of our physical and virtual worlds."
To give tactility to the concept of the Metaverse, the ASF highlights two major paradigms in which people currently (and for the foreseeable future) participate, which serve as its base components. These two major paradigms are 1) External vs. Intimate, and 2) Augmentation vs. Simulation.
Within a matrix of these paradigms (as noted above), the ASF highlights the four key components that make up the early Metaverse. They include 1) Virtual Worlds (Second Life, Millions of Us, World of Warcraft), 2) Mirror Worlds (Google Earth, GPS, and other "location-aware" technologies), 3) Augmented Reality (iPhones, wearable computers, RFID, etc.), and 4) what is called Lifelogging (which starts with blogging, Facebook and MySpace, but progresses into some incredibly invasive forms of documenting one's life, including Twitter, and moves toward automatic GPS tracking, wearable surveillance cameras, and more).
The so-called Metaverse exists in the space where these four components overlap to create a mutually reinforcing user experience, if not a completely new cultural paradigm and way of existing. This should get your attention. As the ASF puts it:
"If these technologies become as commonplace and important as we believe they will, people who choose not to participate may end up as left out of commercial and civic discourse as Web-ignorant people are today."
If that doesn't make the hair stand up on the back of your neck, I don't know what will. Not that I'm insisting everyone should be afraid of this technological baptism we appear poised to dive headlong into (emerging as something wholly different), but fear would be a natural reaction.
So what does this have to do with HPC? Well, aside from the obvious, which should require no explanation on this Web site (this stuff is going to require a whole lot of computing power, storage, etc.), I noticed something that I found rather troubling when I was reading the overview of the Metaverse Roadmap: there is virtually no representation in this emerging space from the HPC community at large, despite the fact that HPC will have to be a major component of the vertebral column that enables the entire thing.
After reading the Metaverse Roadmap overview, and reflecting that much of this information is stuff that I've intuitively guessed at since first marveling at the Internet, it dawns on me that we may be much closer to this hybrid reality than I had ever imagined. Perhaps the time to pay closer attention to developments in this emerging Metaverse context has finally come.
What say you?
Posted by Isaac Lopez - August 27, 2008 @ 9:00 PM, Pacific Daylight Time
Isaac Lopez is the Marketing Director for Tabor Communications.