Tag: utility computing
Last year Cornell University and Purdue University received funding from the National Science Foundation to undertake their MATLAB on the TeraGrid project. Since its inception, a number of researchers have made use of the resource, and Cornell's Center for Advanced Computing is demonstrating that it might earn a permanent place in the TeraGrid resource provider collection.
HPC in the Cloud talks at length with Eucalyptus Systems CEO Marten Mickos about the company's recent $20 million funding injection and what it means for the future of the open source and enterprise private cloud vendor. From vision to roadmap to HPC philosophy, Mickos shares his view of the past, present and future of the software firm.
A special report from Bio-IT World looks at cloud adoption in biotech.
High-end, public cloud computing offerings represent a convergence of grid and Internet technologies, potentially enabling workable new business models. Smaller, private clouds are a technical evolution that expands the ease of use and deployment of grids in more organizations.
Researchers at UC Berkeley have released a white paper that provides an in-depth analysis of the emerging cloud computing model. We asked two of the paper’s authors, David Patterson and Armando Fox, to elaborate on the findings.
Author says cloud approach is reliable enough for many apps, but warns about vendor lock-in.
Taking advantage of unused datacenter capacity with its partner institutions and providers, Parabon gives customers access to high-performance grid computing resources on demand. Its offerings combine elements of both cloud computing and traditional utility computing, but the company says it really offers grid software as a service.
The advent of cloud computing has drastically affected the product offerings and solutions of grid computing veterans. Everything is about flexibility, mobility, virtualization and, overall, being on-demand. However, having seen how quickly a nebulous term can lose favor among the user community, vendors are betting on the delivery model but not necessarily the terminology.
A somewhat neglected aspect of the current financial crisis is the huge spike in trading volumes in recent days. In some cases, they have risen to more than double the average of the months (and years) that preceded the crisis. Systems, of course, don't care whether stocks are going up or down; they just need to handle the transactions. Cloud computing can help ensure they are up to the task.
Cloud computing gives users access to massive computing and storage resources without their having to know where those resources are or how they’re configured, and, as you might expect, Linux plays a huge role.