Two years ago the Department of Energy established the Center for Advanced Technology Evaluation (CENATE) at Pacific Northwest National Laboratory (PNNL). CENATE’s ambitious mission was to be a …
Deep learning is a powerful tool that identifies patterns, extracts meaning from large, diverse datasets, and solves complex problems. However, integrating neural networks into existing compute …
Sponsored Content by Intel
Cardiac arrhythmia can be an undesirable and potentially lethal side effect of drugs. During this condition, the electrical activity of the heart turns chaotic, decimating its pumping function, t …
The Department of Energy’s Exascale Computing Project (ECP) has named Doug Kothe as its new director effective October 1. He replaces Paul Messina, who is stepping down after two years to ret …
As genomic data becomes ubiquitous, infrastructure bottlenecks are tightening for life sciences organizations. But speedy analysis and real-time decision making don't have to remain out of reach: modern end-to-end systems are emerging as flexible solutions that deliver a competitive edge.
As data processing grows increasingly specialized, effective storage strategies are more important than ever. For IT professionals reevaluating their storage needs, software-defined and object-based storage are gaining ground by automating storage management, a trend that is only set to continue.
High-performance workloads, big data, and analytics are increasingly important in extracting real value from today's applications and data. Before deploying applications and mining data for mission and business insights, organizations need a high-performance, rapidly scalable, resilient infrastructure foundation that can access data from all relevant sources accurately, securely, and quickly. Red Hat offers technology that supports high-performance workloads on a scale-out foundation, integrating multiple data sources and transitioning workloads across on-premises and cloud boundaries.
HPC may once have been the sole province of huge corporations and national labs, but with hardware and cloud resources becoming more affordable, even small and mid-sized companies are taking advantage.