October 22, 2009
Here is a collection of highlights from this week's news stream as reported by HPCwire.
Mass. Governor Unveils Plan to Build HPC Center by 2011
Georgia Tech Wins NSF Award for Next-Gen Supercomputing
Francine Berman Wins Ken Kennedy Award
NVIDIA RealityServer Propels 3D Cloud Computing Using GPUs
Appro Deploys Supercomputing Cluster to LLNL
Voltaire, IBM Expand OEM Partnership
DOE Awards $3.3M Contract to DICE Program
UK Science to Benefit from Further Investment in HPC
U of Houston Chooses Bright Cluster Manager
T-Platforms Joins the HPC Advisory Council
RIT Scientists Use Supercomputers to 'See' Black Holes
First Asia Top500 List Announced
Quanta and Tilera Partner on Cloud Computing
NSF's Cyber-Network Expands Across the Northern Hemisphere
IDC Sees Clouds as Stop-Gap Measure
Computerworld covers an IDC report arguing that cloud computing is viable only as a stop-gap measure, to be used until companies have the time and money to develop in-house computational resources. According to the IDC survey, large companies found that running software-as-a-service was no longer cost-effective past the three-year mark.
"Cloud costs need to come down much further to be a realistic long term option," said Matthew McCormack, IDC analyst, at the company's recent Cloud Computing Summit in London. "It could be useful in the short term financially for companies with severe cost overruns."
"Your datacentre would have to be really poorly run for it to be more expensive than cloud in the long run," he added.
As cloud costs come down, the situation will likely change, with cloud computing becoming more economically feasible. Still, it's important to be aware of the main concerns: vendor lock-in, security issues, service levels (availability), and bandwidth limitations.
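The "three-year mark" claim is essentially a break-even calculation between a flat recurring cloud fee and an in-house buildout with upfront capital expense plus lower monthly operating costs. The sketch below illustrates the shape of that comparison; all dollar figures, function names, and the flat-fee pricing model are illustrative assumptions, not numbers from the IDC report.

```python
# Illustrative break-even sketch (all figures hypothetical): cumulative cost of
# a pay-as-you-go cloud service vs. an in-house datacenter with an upfront
# capital expense plus a lower monthly operating cost.

def cumulative_cloud(months: int, monthly_fee: float) -> float:
    """Total spend on a flat-fee cloud service after `months`."""
    return monthly_fee * months

def cumulative_inhouse(months: int, capex: float, monthly_opex: float) -> float:
    """Total spend in-house: one-time capital outlay plus monthly operations."""
    return capex + monthly_opex * months

def break_even_month(monthly_fee: float, capex: float,
                     monthly_opex: float, horizon: int = 120):
    """First month at which in-house becomes cheaper than cloud, or None
    if it never does within the horizon (e.g. when opex exceeds the fee)."""
    for m in range(1, horizon + 1):
        if cumulative_inhouse(m, capex, monthly_opex) < cumulative_cloud(m, monthly_fee):
            return m
    return None

# Hypothetical figures: $10k/month cloud vs. $250k capex + $2k/month in-house.
# Break-even lands around month 32 -- i.e., just under the three-year mark.
print(break_even_month(10_000, 250_000, 2_000))  # → 32
```

The interesting lever is the gap between the cloud fee and in-house opex: as cloud prices fall, the break-even point pushes further out, which is exactly the dynamic the analyst describes.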
Crystals for Supercomputer Storage
The BBC reports on research at the University of Edinburgh into using salt crystals to build computers with massive storage capacity. Dr. Alexander of the university's school of chemistry, who developed the technique, said:
"This research builds on a discovery that was made by accident many years ago, when it was found that light can be used to trigger crystal formation. We have refined this technique and now we can create crystals on demand. There is much work to be done before these crystals can be used in practical applications such as optical storage, but we believe they have significant potential."
The process of creating crystals from a salt solution is difficult to control and has historically been regarded as a near-impossible task. The researchers overcame this difficulty by focusing two low-energy lasers on the solution, providing exactly the right amount of energy to trigger the chemical process.
The crystals, being 3D structures, would improve upon traditional flat-surface storage media such as CDs. The development, which could allow users to store a terabyte of data in a space the size of a sugar cube, may be market-ready in 10 years.
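A quick back-of-the-envelope check shows why volumetric storage is attractive: packing one terabyte into a roughly 1 cm³ sugar cube only requires a bit spacing of about half a micron, which is coarse compared with the feature sizes of modern optical media. The arithmetic below is my own illustration, not a figure from the article, and it assumes a simple cubic grid of one bit per voxel with no overhead for addressing or error correction.

```python
# Back-of-the-envelope check (illustrative, not from the article): what voxel
# pitch would a 1 cm^3 "sugar cube" need in order to hold one terabyte?

def required_voxel_pitch_m(volume_m3: float, capacity_bits: float) -> float:
    """Side length of one bit-voxel if bits tile the volume as a cubic grid."""
    return (volume_m3 / capacity_bits) ** (1 / 3)

volume = 0.01 ** 3      # a 1 cm cube, expressed in cubic meters
bits = 8 * 10 ** 12     # 1 TB = 8e12 bits (decimal terabyte)

pitch_nm = required_voxel_pitch_m(volume, bits) * 1e9
print(f"{pitch_nm:.0f} nm per bit")  # → 500 nm per bit
```

A 500 nm pitch is on the order of the wavelength of visible light, which is consistent with the idea of writing bits optically, though it also hints at why diffraction-limited addressing of individual voxels deep inside a crystal is one of the practical hurdles the researchers mention.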