March 06, 2013
MARSHALLTOWN, Iowa, March 6, 2013 — Mechdyne Corporation announced that it has licensed the CAVE2 hybrid reality environment developed by the Electronic Visualization Laboratory (EVL) at the University of Illinois at Chicago. The licensing agreement, signed in January 2013, continues the strong working relationship that began in 1994, when Mechdyne licensed the EVL-designed original CAVE technology.
The term CAVE is an acronym for CAVE Automatic Virtual Environment and also a reference to "The Simile of the Cave" in Plato's Republic, in which the philosopher explored the concepts of perception, reality and illusion.
The CAVE2 system provides a near-seamless, 320-degree, panoramic 2D/3D environment that supports information-rich analysis with stunning immersive visuals. “Although the CAVE2 advanced virtual reality technology is a next-generation system, many of the legacy CAVE applications can be integrated into the CAVE2 system,” said Kurt Hoffmeister, VP of Engineering and Product Development for Mechdyne.
“With Mechdyne as an integrator of the CAVE2 design, we can point our collaborators who are interested in incorporating the CAVE2 technology to Mechdyne,” said Jason Leigh, Ph.D., Director of the EVL and Professor of Computer Science at the University of Illinois at Chicago. “This enables our colleagues to receive a top-notch product with full technical support from Mechdyne.”
According to Hoffmeister, “One of the biggest benefits offered by the CAVE2 system is its versatility. The system can be integrated to provide the resolution and clarity that matches human visual acuity, for an entirely new level of immersive and collaborative experience.”
A key feature of the CAVE2 design is the use of 3D capable LCD flat panel screens rather than rear projection, which enables more efficient use of facility space, compared to previous generation CAVE technology, Hoffmeister explained. “The CAVE2 design also accommodates off-axis vertical viewing angles, eliminating any 3D ghosting at the top or bottom of the large viewing area,” he added.
The development of the CAVE2 technology was funded by grants from the National Science Foundation and the Department of Energy. A portion of the royalties from licensing the CAVE2 system will help fund future research at EVL, Leigh explained.
“Every day we’re hearing more and more about ‘big data’,” Leigh commented. “Advanced visualization systems like the CAVE2 provide a lens to bring big data into focus — for deeper understanding and greater insight.”
Mechdyne expects major users of the CAVE2 system will include universities, scientific research organizations, energy companies, and manufacturing and design organizations worldwide.
About Mechdyne Corporation
Mechdyne is one of the world’s leading providers of innovative visual information technologies. The company bends technology in ways that transform complex data into insights and ideas. To ensure customers succeed, Mechdyne provides comprehensive, customized solutions that include consulting, software, technical services and hardware integration. Mechdyne, with offices around the world, serves a global customer base. Customers include: leading government laboratories, energy companies, universities, manufacturing and design firms, U.S. armed forces, and other users of visual information technologies.
About Electronic Visualization Laboratory (EVL) at University of Illinois at Chicago (UIC)
EVL is an internationally renowned interdisciplinary research laboratory specializing in the design and development of high-performance visualization, virtual-reality and collaboration display systems utilizing advanced networking. Established in 1973, EVL is a joint effort of UIC’s College of Engineering and School of Art and Design, and represents the oldest formal collaboration between engineering and art in the U.S. EVL's mission is to enable scientific discovery through interdisciplinary collaboration, in which computer scientists work with domain scientists and artists to create useful and usable visualization and virtual-reality tools and techniques to help solve real-world problems.