Tag: Oak Ridge National Laboratory

New Genomics Pipeline Combines AWS, Local HPC, and Supercomputing

Sep 22, 2016

Declining DNA sequencing costs and the rush to do whole genome sequencing (WGS) of large cohort populations – think 5,000 subjects now, but many thousands more soon – present a formidable computational challenge to researchers attempting to make sense of large cohort datasets. No single architecture is best. This month researchers report developing a hybrid Read more…

ORNL Researchers Create Framework for Easier, Effective FPGA Programming

May 24, 2016

Programmability and portability problems have long inhibited broader use of FPGA technology. FPGAs are already widely and effectively used in many dedicated applications (accelerated packet processing, for example), but generally not in situations that require ‘reconfiguring’ the FPGA to accommodate different applications. A group of researchers from Oak Ridge National Laboratory is hoping to change that.

How To Kill A Supercomputer – Tips from An Expert

Feb 24, 2016

Think your day is going badly? Let this IEEE Spectrum article by Al Geist on the many ways that supercomputers can crash lift your spirits. Geist, chief technologist for the computer science and mathematics division at Oak Ridge National Laboratory, has written a lively account of gremlins with a nasty tendency to gum up Read more…

ORNL Shows Off Titan on Periscope

Nov 5, 2015

If you need a reason to join the Periscope craze, how about getting an inside look at the fastest supercomputer in the United States? Today, staff at the Oak Ridge Leadership Computing Facility (OLCF), the DOE Office of Science user facility at Oak Ridge National Laboratory (ORNL), took the Periscope audience on a tour of Titan, Read more…

Researchers Model Birth of Universe in One of the Largest Cosmological Simulations Ever Run

Oct 30, 2015

Researchers are sifting through an avalanche of data produced by one of the largest cosmological simulations ever performed, led by scientists at the U.S. Department of Energy’s (DOE’s) Argonne National Laboratory. The simulation, run on the Titan supercomputer at DOE’s Oak Ridge National Laboratory, modeled the evolution of the universe from just 50 million years Read more…

Jack Dongarra et al. on Numerical Algorithms and Libraries at Exascale

Oct 19, 2015

The HPC software research community greeted this summer’s announcement of the President’s National Strategic Computing Initiative (NSCI) with tremendous enthusiasm. This response is easy to understand. More than twenty-five years have passed since a US administration last proposed such a coordinated, long-term, multiagency strategy to improve the nation’s economic competitiveness and scientific research prowess Read more…

ORNL Demonstrates Road to Supercapacitors from Scrap Tires

Sep 25, 2015

Some of the 300 million tires discarded each year in the United States alone could be used in supercapacitors for vehicles and the electric grid using a technology developed at the Department of Energy’s Oak Ridge National Laboratory and Drexel University, according to an article posted at ORNL. By employing proprietary pretreatment and processing, a Read more…

Application Readiness at the DOE, Part I: Oak Ridge Advances Toward Summit

Apr 16, 2015

At the 56th HPC User Forum, hosted by IDC in Norfolk, Va., this week, three panelists from major government labs discussed how they are getting science applications ready for the coming crop of Department of Energy (DOE) supercomputers, which, in addition to being five-to-seven times faster than today’s fastest big iron machines, constitute significant architectural changes. Titled “The Who-What-When of Getting Applications Ready to Read more…

Summit Puts 13 Code Projects Into Readiness Program

Apr 15, 2015

When the Oak Ridge National Laboratory’s Summit supercomputer powers up in 2018, it will provide the Department of Energy (DOE) research community with 150 to 300 peak petaflops of computational performance. To extract the highest benefit from this multi-million dollar machine that will be five to ten times the capability of the current fastest US Read more…

Health Care Catches Data Fever

Oct 30, 2014

The United States is arguably in the midst of a health care crisis, but there is hope on the horizon and it involves learning how to make sense of big data. Over at Communications of the ACM, Oak Ridge National Laboratory (ORNL) shares how it is helping the health care industry benefit from patient data using the power of Read more…