October 11, 2012
SAN DIEGO, Oct. 11 — The San Diego Supercomputer Center (SDSC) at the University of California, San Diego, will house the data repository for a new project funded by the National Institutes of Health (NIH) aimed at accelerating the study of metabolomics, an emerging field of biomedical research that studies the body's chemical processes and could help more clearly define the mechanisms underlying diseases such as diabetes and obesity, as well as suggest new strategies for treatment.
SDSC joins other UC San Diego research units and organizations that were collectively awarded $6 million over five years as part of a larger NIH metabolomics program investment. Shankar Subramaniam, a Distinguished Scientist with SDSC and chair of the Department of Bioengineering at the UC San Diego Jacobs School of Engineering, is principal investigator (PI) for the project. Subramaniam is also an associate director of the UC San Diego Institute of Engineering in Medicine.
"I'm very excited about the prospect of collaborating with researchers in the Jacobs School, the School of Medicine, the San Diego Supercomputer Center, and others across the campus and the country," said Subramaniam. "This work will lead to a systematic understanding of human physiology at the molecular level."
Specifically, metabolomics is the study of small molecules called metabolites, found within cells and biological systems. Metabolites are produced or consumed in the chemical reactions that take place in the body to sustain life. The sum of all metabolites at any given moment – the metabolome – is a form of chemical readout of the state of health of the cell or body.
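To make the idea of a chemical readout concrete, a metabolome snapshot can be pictured as a simple mapping from metabolite names to measured concentrations. The short Python sketch below is purely illustrative; the metabolite names, values, and units are assumptions for the example, not data from the project.

    # A hypothetical metabolome snapshot: metabolite name -> concentration.
    # Names and values (in micromolar) are invented for illustration.
    metabolome = {
        "glucose": 5000.0,      # a sugar
        "alanine": 350.0,       # an amino acid
        "cholesterol": 4800.0,  # a lipid
        "cortisol": 0.3,        # a hormone
    }

    # The "sum of all metabolites at a given moment" is this mapping,
    # read out at one point in time for a cell, tissue, or person.
    print(len(metabolome), "metabolites in this readout")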
"This project builds on SDSC's long-standing collaboration with Professor Subramaniam and our world-class expertise in scientific data management," said SDSC Director Michael Norman, co-PI on the project. "That we will become a national data hub for other NIH-funded metabolomics projects makes this exciting for us."
The metabolome project will provide insights into the millions of microorganisms living within us. The human body contains many more bacterial cells than human cells, and the project will provide new opportunities for researchers to understand the role that microorganisms living within the body play in human health, according to Subramaniam, who has extensive experience integrating "omics" data as well as experience coordinating other large-scale projects.
One of the expected outcomes of the NIH project is the ability to "metabo-type" individuals to get a detailed picture of their current metabolite profile, and recognize problems, such as insulin resistance. The effects of interventions, such as changes in diet and exercise as well as pharmaceuticals, could then be seen in updated metabo-type readings.
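As a rough illustration of what comparing metabo-type readings might involve, the Python sketch below flags metabolites whose levels shift markedly between two readings, such as before and after a change in diet. The function, the threshold, and the example values are hypothetical assumptions, not the project's actual methods.

    def compare_metabotypes(before, after, threshold=0.25):
        """Flag metabolites whose concentration changed by more than
        `threshold` (as a fraction) between two readings; the 25%
        cutoff is an arbitrary, illustrative choice."""
        changed = {}
        for name, old in before.items():
            new = after.get(name)
            if new is None or old == 0:
                continue  # skip metabolites missing from one reading
            change = (new - old) / old
            if abs(change) > threshold:
                changed[name] = change
        return changed

    # Fictional readings before and after an intervention.
    baseline = {"glucose": 5000.0, "alanine": 350.0, "cortisol": 0.3}
    followup = {"glucose": 3500.0, "alanine": 360.0, "cortisol": 0.3}
    print(compare_metabotypes(baseline, followup))  # {'glucose': -0.3}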
Through the SDSC data repository, bioengineers and other researchers at UC San Diego will organize and present all data from the three metabolome core centers across the country, as well as from other metabolomics efforts. The data repository, together with a coordination center, will serve as the hub that allows the awardees to function as a consortium.
This metabolomics project at UC San Diego is an extension of the successful Lipid Maps project. Lipids are just one class of metabolites, and the metabolomics work will extend researchers' view beyond lipids to others, such as sugars, nucleic acids, amino acids, and hormones.
The $6 million in funding is part of an overall $51.4 million investment by the NIH in metabolomics. The awards are supported by the NIH Common Fund, which is taking a comprehensive approach to increasing the research capacity in metabolomics by funding a variety of initiatives in this area, including training, technology development, standards synthesis, and data sharing capability for this new field.
"We are excited about the potential advances in technology that will enable metabolomics analysis to be conducted in basic and clinical settings, resulting in the discovery of new diagnostic tools and yielding important clues about disease mechanisms," said James M. Anderson, director of the NIH Division of Program Coordination, Planning and Strategic Initiatives, which oversees trans-NIH program areas, including those supported through the NIH Common Fund. "The new cross-cutting metabolomics initiatives will allow for better data sharing and coordination of metabolomics efforts both nationally and internationally."
Regional Comprehensive Metabolomics Resource Cores
In addition to the Data Repository and Coordination Center award to UC San Diego, the NIH has awarded three Regional Comprehensive Metabolomics Resource Cores, aimed at increasing the national capacity to provide metabolomics profiling and data analysis services to investigators.
For more information, see http://commonfund.nih.gov/Metabolomics/fundedresearch.aspx. More information about the Metabolomics Program is at http://commonfund.nih.gov/Metabolomics/.
About the National Institutes of Health
NIH, the nation's medical research agency, includes 27 Institutes and Centers and is a component of the U.S. Department of Health and Human Services. NIH is the primary federal agency conducting and supporting basic, clinical, and translational medical research, and is investigating the causes, treatments, and cures for both common and rare diseases.
As an Organized Research Unit of UC San Diego, SDSC is considered a leader in data-intensive computing and all aspects of 'big data', which includes data integration, performance modeling, data mining, software development, workflow automation, and more. SDSC supports hundreds of multidisciplinary programs spanning a wide variety of domains, from earth sciences and biology to astrophysics, bioinformatics, and health IT. With its two newest supercomputer systems, Trestles and Gordon, SDSC is a partner in XSEDE (Extreme Science and Engineering Discovery Environment), the most advanced collection of integrated digital resources and services in the world.