Accelerate Drug Discovery with Machine Learning on Big Medical Data

Sep 24, 2015 |

Pharmaceutical companies spend billions testing prospective drugs by conducting “wet lab” experiments that can take years to complete. But what if the same results could be obtained in a matter of minutes by running computer model simulations instead? A Silicon Valley startup says it has created a novel machine learning algorithm that does just that. Read more…

AI’s Forward March: Machine Teaches Itself to Play Chess in 72 Hours

Sep 23, 2015 |

The field of artificial intelligence has had a rocky history with numerous setbacks, but there have been high points too, like when IBM’s Deep Blue beat reigning chess champion Garry Kasparov in 1997, or when another IBM machine, Watson, proved its mettle on the popular quiz show Jeopardy! in 2011. Now machine learning, and its Read more…

ALCF’s Paul Messina on the Code Optimization Path to Exascale

Sep 14, 2015 |

To borrow a phrase from paleontology, the HPC community has historically evolved in punctuated equilibrium. In the 1970s we transitioned from serial to vector architectures. In the 1980s parallel architectures blossomed, and in the 1990s MIMD systems became the norm for most supercomputer architectures. From the 1990s until today we have been in a period Read more…

U of Michigan Project Combines Modeling and Machine Learning

Sep 10, 2015 |

Although we’ve yet to settle on a term for it, the convergence of HPC and a new generation of big data technologies is set to transform science. The compute-plus-data mantra reaches all the way to the White House with President Obama’s National Strategic Computing Initiative calling for useful exascale computing and sophisticated data capabilities that serve the Read more…

Cycle Computing Orchestrates Cancer Research on Google Cloud

Sep 10, 2015 |

This week HPC cloud software specialist Cycle Computing announced that its full suite of products can now be used to spin up clusters on Google’s cloud platform. As testament to the new partnership, Cycle leveraged Google Compute Engine (GCE) to run a 50,000-core cancer gene analysis workload for the Broad Institute. As Cycle Computing explains, Broad’s Cancer Group approached them Read more…

Knowm Snaps in Final Piece of Memristor Puzzle

Sep 9, 2015 |

Adaptive computing company Knowm Inc. says it has cleared a major technological hurdle in the pursuit of practical artificial neural network (ANN) chips that has stymied deeper-pocketed competitors, including IBM and HP. As the first to develop memristors capable of bi-directional incremental learning, the Santa Fe, NM-based startup can move forward with its ANN chip architecture, the intended substrate Read more…

The Era of Personalized Medicine: An Interview with Peter Coveney

Sep 3, 2015 |

What opportunities and challenges come with personalized medicine? What amount of computing is necessary to get meaningful results? Are we able to handle substantial quantities of patient data and to use them to perform predictive, mechanistic modelling and simulation? Based on this, will we be able to deliver therapies and to enhance clinical decision making? Read more…

Intel Haswell-EX Server Sets STAC-A2 Performance Record

Sep 2, 2015 |

Intel has reasserted its prominence on a subset of financial benchmarks designed to evaluate platforms for pricing and market risk analytics. More powerful Xeons — “Haswell-EX” E7-8890 v3 processors — combined with changes to the software stack enabled Intel to set a new speed record on the STAC-A2 benchmark for both warm and cold Read more…
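
STAC-A2 itself specifies Heston-model option pricing and Greeks computations; as a rough illustration of the workload class only, here is a far simpler Monte Carlo pricer for a European call under Black-Scholes dynamics. All parameters below are hypothetical and nothing here reflects the actual benchmark code:

```python
import math
import random

random.seed(42)

# Toy Monte Carlo pricer for a European call -- a stand-in for the
# much heavier Heston-model paths that STAC-A2 actually measures.
S0, K, r, sigma, T = 100.0, 100.0, 0.05, 0.2, 1.0   # hypothetical contract
n_paths = 200_000

drift = (r - 0.5 * sigma * sigma) * T
vol = sigma * math.sqrt(T)

payoff_sum = 0.0
for _ in range(n_paths):
    z = random.gauss(0.0, 1.0)
    st = S0 * math.exp(drift + vol * z)    # terminal asset price
    payoff_sum += max(st - K, 0.0)         # call payoff at expiry

price = math.exp(-r * T) * payoff_sum / n_paths
print(f"MC estimate: {price:.2f}")   # analytic Black-Scholes value is ~10.45
```

Runs like this are embarrassingly parallel across paths, which is why wide-vector, many-core parts such as Haswell-EX do well on them.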

Exploring Large Data for Scientific Discovery

Aug 27, 2015 |

A curse of dealing with mounds of data so massive that they require special tools, said computer scientist Valerio Pascucci, is that if you look for something, you will probably find it, thus injecting bias into the analysis.
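
Pascucci’s warning can be made concrete: search a large enough pile of pure noise and some “signals” will clear any fixed threshold. A minimal sketch, with hypothetical variable counts and an arbitrary correlation cutoff:

```python
import random

random.seed(0)

n_variables = 200   # hypothetical: 200 unrelated noise variables
n_samples = 30
threshold = 0.35    # flag |correlation| above this as a "finding"

def corr(xs, ys):
    """Pearson correlation of two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

target = [random.gauss(0, 1) for _ in range(n_samples)]
noise = [[random.gauss(0, 1) for _ in range(n_samples)]
         for _ in range(n_variables)]

# Search the pile of noise for variables that "explain" the target.
hits = [i for i, v in enumerate(noise) if abs(corr(target, v)) > threshold]
print(f"spurious 'findings': {len(hits)} of {n_variables}")
```

Every variable here is independent noise, yet several still correlate with the target by chance — exactly the bias that exploratory analysis of massive data must guard against.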

Argonne Team Tackles Uncertainties in Engine Simulation

Aug 27, 2015 |

As we head deeper into the digital age, computers appropriate an ever greater share of the work of designing and testing physical systems, spanning the gamut from nuclear components to personal care products. Engine design is another important, but very complex, application that is benefiting from computational modeling. To optimize the performance of combustion engines, chemical models are incorporated into computer simulations. Read more…
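
Uncertainty in those chemical models can be propagated by sampling. As a toy illustration (not the Argonne team’s workflow), consider an Arrhenius reaction rate k = A·exp(−Ea/(R·T)) whose activation energy Ea is uncertain; all numbers below are hypothetical:

```python
import math
import random

random.seed(1)

# Toy Monte Carlo uncertainty propagation through an Arrhenius rate law.
R = 8.314          # gas constant, J/(mol K)
T = 900.0          # temperature, K (hypothetical operating point)
A = 1.0e10         # pre-exponential factor (hypothetical)
Ea_mean = 1.2e5    # mean activation energy, J/mol (hypothetical)
Ea_sd = 5.0e3      # its 1-sigma uncertainty (hypothetical)

samples = []
for _ in range(20_000):
    ea = random.gauss(Ea_mean, Ea_sd)          # sample an uncertain input
    samples.append(A * math.exp(-ea / (R * T)))  # push it through the model

mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
print(f"rate mean ~ {mean:.3e}, relative spread ~ {var**0.5 / mean:.0%}")
```

Note how a few-percent uncertainty in Ea becomes a much larger relative spread in the rate, because Ea sits inside an exponential — one reason quantifying these uncertainties matters for engine simulation.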

COSMOS Team Achieves 100x Speedup on Cosmology Code

Aug 24, 2015 |

One of the most popular sessions at the Intel Developer Forum last week in San Francisco, and certainly one of the most exciting from an HPC perspective, brought together two of the world’s foremost experts in parallel programming to discuss current state-of-the-art methods for leveraging parallelism on processors and coprocessors. The speakers, Intel’s Jim Jeffers and Read more…
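
A 100x speedup only falls out of many-core hardware when nearly all of the runtime is parallelized, as Amdahl’s law makes plain. An illustrative sketch — the 240-thread count and serial fractions below are hypothetical, not the COSMOS team’s numbers:

```python
# Amdahl's law: speedup when a fraction p of the runtime is
# parallelized across n execution units, the rest staying serial.

def amdahl_speedup(p, n):
    """Overall speedup for parallel fraction p on n units."""
    return 1.0 / ((1.0 - p) + p / n)

for p in (0.90, 0.99, 0.999):
    print(f"p={p:.3f}: {amdahl_speedup(p, 240):6.1f}x on 240 threads")
```

Even with 240 hardware threads, a 1% serial residue caps the speedup near 70x — which is why reaching 100x demands the kind of end-to-end parallelization the session described.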

XSEDE Resources Fuel Investigation of Model Human Origins

Aug 20, 2015 |

A mix of data- and computation-intensive XSEDE resources are enabling researchers to scale up their climate, vegetation and agent-based human behavior models to tackle fundamental questions of how Homo sapiens came to dominate the planet, according to archeologist Colin Wren of Arizona State University (ASU). The international collaboration, led by Curtis Marean of ASU, will Read more…

Amazon Web Services Spotlights HPC Options

Aug 11, 2015 |

Despite high initial interest, HPC in the cloud never achieved significant adoption levels, being mainly relegated to low-hanging “pleasingly parallel” fruit and testing or experimental use. However, there are signs of growth on the horizon, according to HPC analysts. “As public clouds acquire stronger HPC capabilities and perform better on a larger set of HPC workloads, they Read more…

NERSC’s ‘Shifter’ Makes Container-based HPC a Breeze

Aug 7, 2015 |

The explosive growth in data coming out of experiments in cosmology, particle physics, bioinformatics and nuclear physics is pushing computational scientists to design novel software tools that will help users better access, manage and analyze that data on current and next-generation HPC architectures. One increasingly popular approach is container-based computing, designed to support flexible, scalable Read more…

Reading List: Fault Tolerance Techniques for HPC

Aug 6, 2015 |

Among the chief challenges of deploying useful exascale machines, resilience looms large. Today’s error rates combined with tomorrow’s node counts cannot sustain a productive workflow without intervention. The significance of this issue has not gone unnoticed. A comprehensive collection of fault tolerance techniques is presented in one volume, called “Fault Tolerance Techniques for High-Performance Computing,” by editors Thomas Herault and Yves Read more…
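
The arithmetic behind that warning is simple: with independent node failures, the whole machine’s mean time between failures (MTBF) shrinks roughly as 1/N. A back-of-the-envelope sketch with hypothetical numbers, not figures from the book:

```python
# Resilience back-of-the-envelope: system MTBF ~ node MTBF / node count,
# assuming independent, identically distributed node failures.

node_mtbf_years = 5.0            # hypothetical per-node MTBF
hours_per_year = 24 * 365

for nodes in (1_000, 100_000, 1_000_000):
    system_mtbf_hours = node_mtbf_years * hours_per_year / nodes
    print(f"{nodes:>9} nodes -> system MTBF ~ {system_mtbf_hours:8.3f} hours")
```

At a million nodes the machine fails every few minutes — less than the time a naive global checkpoint would take, which is exactly why the techniques the book surveys are needed.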

New National HPC Strategy Is Bold, Important and More Daunting than US Moonshot

Aug 6, 2015 |

“In order to maximize the benefits of HPC for economic competitiveness and scientific discovery, the United States Government must create a coordinated Federal strategy in HPC research, development, and deployment.” With these words, the President of the United States established the National Strategic Computing Initiative (NSCI) through Executive Order to implement this whole-of-government strategy in Read more…

Japan Takes Top Three Spots on Green500 List

Aug 4, 2015 |

Japan, the island nation renowned for its energy and space-saving design prowess, just nabbed the top three spots on the latest Green500 list and claimed eight of the 20 top spots. The Shoubu supercomputer from RIKEN took top honors on the 17th edition of the twice-yearly listing, becoming the first TOP500-level supercomputer to surpass the Read more…
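
The Green500 ranks machines by energy efficiency: sustained Linpack performance divided by power draw, reported in gigaflops per watt. A minimal sketch of the metric using hypothetical numbers, not Shoubu’s actual figures:

```python
# Green500 efficiency metric: sustained Linpack FLOPS per watt.

rmax_tflops = 400.0   # hypothetical sustained Linpack, teraflops
power_kw = 60.0       # hypothetical measured power draw, kilowatts

gflops_per_watt = (rmax_tflops * 1000) / (power_kw * 1000)
print(f"efficiency: {gflops_per_watt:.2f} GFLOPS/W")
```

The metric rewards exactly the space- and power-conscious engineering the listing highlights: shaving watts moves a machine up the Green500 even if its raw TOP500 rank is unchanged.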

Script for Bioinformatics App Nixes HPC File-System Tangle

Jul 31, 2015 |

Staff in NSF’s XSEDE network have created a script that avoids file-system tangles seen when scaling some common scientific applications for use on HPC systems, according to Antonio Gomez Iglesias of the Texas Advanced Computing Center (TACC). The fix can be useful to many users employing similar applications, he said Tuesday in a presentation at Read more…

XSEDE Panel Highlights Diversity of NSF Computing Resources

Jul 31, 2015 |

A plenary panel at the XSEDE15 conference, which took place this week in St. Louis, Mo., highlighted the broad spectrum of computing resources provided by the National Science Foundation, including several new and testbed projects and an effort to help more people use cyberinfrastructure to advance their research. “I don’t think there has been a Read more…

White House Launches National HPC Strategy

Jul 30, 2015 |

Yesterday’s executive order by President Barack Obama creating a National Strategic Computing Initiative (NSCI) is not only powerful acknowledgment of the vital role HPC plays in modern society but is also indicative of government’s mounting worry that failure to coordinate and nourish HPC development on a broader scale would put the nation at risk. Not Read more…