Tag: distributed computing
What if you could combine the benefits of virtualization, grid and cloud computing to accelerate Windows-based applications? An Israeli company, Xoreax, is doing just that. We spoke with Xoreax at SC12 in Salt Lake City, Utah, last week to learn more about their offering.
After a successful five-year run, Sony is ending its participation in Stanford University’s Folding@home project.
Portland-based CPUsage operates much like a volunteer computing grid, except the startup pays users for their “extra” compute power.
As participants from around the world make their way to Prague for the EGI Technical Forum, grid-enabled tools continue to facilitate global collaboration. Grid computing provides the backbone for a wide range of research, all the way from basic science to once-in-a-lifetime breakthroughs, like the recent achievements surrounding the elusive Higgs boson particle.
The possibility of resource-constrained HPC workloads tapping into virtually limitless cloud-based resources is enticing to say the least, but in practice the two haven’t been a perfect match. Where HPC workloads run best on customer-tailored infrastructure, most clouds are general-purpose. A partnership between iSuperGrid and Mirantis aims to address this mismatch by marrying the flexibility of cloud computing with the demands of HPC.
The latest volunteer supercomputing grid, Charity Engine, has a new twist on how to make the world a better place.
Calit2’s Larry Smarr examines the implications of an increasingly networked world.
For decades, the distributed computing architecture now known as cloud suffered a branding problem. Will the cloud label suffer the same fate?
Pulling together massive amounts of data has created profound opportunities for a wide array of analytics projects, but it has also complicated life for those who want to extract actionable intelligence from it. While the “big data” movement is still unfolding, a number of companies have emerged to simplify access and use, especially of unstructured information. HPC stalwart Platform Computing has entered the race, aiming to refine the handling of vast datasets and the management behind such operations to stake its claim in this emerging space.
CERN’s Worldwide LHC Computing Grid is the superhighway for particle physics data.