November 03, 2010
One of the key roles of HPC in the biological sciences is the creation of realistic "virtual" models and prototypes that allow experiments to be conducted without using actual organic tissue. It should come as no surprise that this degree of fidelity requires enormous amounts of computational power and storage space.
Enter Dr. Ion Moraru, who has the perfect background to help usher in the age of computational biology. Trained as a medical doctor, Moraru transitioned into biology research and then into computer science. His first job at the University of Connecticut focused on experimental research into intracellular signaling mechanisms. That work in cell biology revealed a need for mathematical modeling of intracellular behavior, he says, which led to a more general interest in the theoretical modeling of cell biological phenomena.
Soon Moraru was collaborating with Dr. Leslie M. Loew, in the Department of Cell Biology, on a project called the "National Resource for Cell Analysis and Modeling." Loew was at the forefront of the nascent fields of computational cell biology and systems biology, and the influence of computers was about to transform the science of biology. Moraru and Loew, together with James Schaff, an inventive computer engineer, began working on a general-purpose program that could create realistic simulations of intracellular processes.
From the article:
Called the "Virtual Cell" (VCell), their notion soon blossomed into a remarkable computational tool enabling scientists to model and simulate cell biology through a platform that includes sophisticated distributed software and hundreds of servers: some that compute, some that store information, and some with software that can handle the massive calculations necessary to model and simulate cellular processes. Within a few years, VCell became so powerful that it could be used for everything from evaluating scientific hypotheses and interpreting experimental data to the creation of multi-layered models in which scientists evaluate the behavior of complex systems.
VCell developed into one of the leading platforms for kinetic modeling and simulation of molecular processes. Its Web-based, client-server design lets biologists run powerful simulations without owning expensive computing hardware or acquiring specialized software expertise.
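To make "kinetic modeling" concrete, the sketch below shows the kind of calculation such a simulator performs for the simplest possible case: a reversible reaction A ⇌ B under mass-action kinetics. The rate constants, initial concentrations, and use of SciPy are illustrative assumptions; this is not VCell's actual model format or solver API.

```python
# A minimal, hypothetical sketch of mass-action kinetics -- the sort of
# ODE system a cell simulator integrates. Rate constants and initial
# concentrations are made up for illustration; this is not VCell code.
from scipy.integrate import solve_ivp

kf, kr = 0.5, 0.1  # assumed forward/reverse rate constants (1/s)

def rates(t, y):
    """Right-hand side for the reversible reaction A <-> B."""
    a, b = y
    flux = kf * a - kr * b  # net mass-action flux from A to B
    return [-flux, flux]

# Start with 10 uM of A and no B, then integrate for 20 seconds.
sol = solve_ivp(rates, (0.0, 20.0), [10.0, 0.0])
print("final [A], [B]:", sol.y[0, -1], sol.y[1, -1])
```

A full cell model couples many such equations, often with spatial diffusion as well, which is why the solves are farmed out to dedicated server infrastructure like that described below.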
Moraru was selected to lead the development of a dedicated computational infrastructure that would enable VCell to support global research. First established in 1999, the system has grown from three racks of processors, roughly the size of a short row of office file cabinets, to a multi-million dollar hardware portfolio. Now, the resource that Moraru has created supports over 2,000 scientists from around the world.
The project recently received $500,000 worth of new equipment, mainly blade servers, purchased with an NIH grant. The blade architecture leaves room for essentially unlimited future expansion, which Moraru says is a good thing, since they expect demand for computational power to keep growing.
Full story at the University of Connecticut