Today's Top Feature

Data-Hungry Algorithms and the Thirst for AI

At Tabor Communications’ Leverage Big Data + EnterpriseHPC Summit in Florida last week, esteemed

By Tiffany Trader

Center Stage

Bill Gropp – Pursuing the Next Big Thing at NCSA

About eight months ago Bill Gropp was elevated to acting director of the National Center for Supercomputing Applications (NCSA).

By John Russell

HPC Compiler Company PathScale Seeks Life Raft

HPCwire has learned that HPC compiler company PathScale has fallen on difficult times.

By Tiffany Trader

Quantum Bits: D-Wave and VW; Google Quantum Lab; IBM Expands Access

For a technology that’s usually characterized as far off and in a distant galaxy,

By John Russell

People to Watch 2017

With 2017 underway, we’re looking to the future of high performance computing and the milestones that are growing ever closer.


Researchers Recreate ‘El Reno’ Tornado on Blue Waters Supercomputer

March 16, 2017

The United States experiences more tornadoes than any other country. About 1,200 tornadoes touch down each year in the U.S. Read more…

By Tiffany Trader

CPU-based Visualization Positions for Exascale Supercomputing

March 16, 2017

In this contributed perspective piece, Intel’s Jim Jeffers makes the case that CPU-based visualization is now widely adopted and, as such, is no longer a contrarian view but rather an exascale requirement. Read more…

By Jim Jeffers, Principal Engineer and Engineering Leader, Intel

New File System from PSC Tackles Image Processing on the Fly

July 25, 2016

Processing the high-volume datasets, particularly image data, generated by modern scientific instruments is a huge challenge. Read more…

By John Russell

Inside the Fire: TACC Image of Rapidly Spinning Star

December 11, 2015

A computer-generated image of visualized variables from a star simulation dataset produced with the Anelastic Spherical Harmonic code on the Ranger supercomputer at the Texas Advanced Computing Center at The University of Texas at Austin. Read more…

Contrary View: CPUs Sometimes Best for Big Data Visualization

December 1, 2015

Contrary to conventional thinking, GPUs are often not the best vehicles for big data visualization. Read more…

By Jim Jeffers, Intel

Big Data Reveals Glorious Animation of Antarctic Bottom Water

November 30, 2015

A remarkably detailed animation of the movement of the densest and coldest water in the world around Antarctica has been produced using data generated on Australia’s most powerful supercomputer, Raijin. Read more…

NSF-Funded CADENS Project Seeking Data and Visualizations

November 24, 2015

The NSF-funded Centrality of Advanced Digitally ENabled Science (CADENS) project is looking for scientific data to visualize or existing data visualizations to weave into larger documentary narratives in a series of fulldome digital films and TV programs aimed at broad public audiences. Read more…

Mira is First Supercomputer to Simulate Large Hadron Collider Experiments

November 4, 2015

Argonne physicists are using Mira to perform simulations of Large Hadron Collider (LHC) experiments with a leadership-class supercomputer for the first time, shedding light on a path forward for interpreting future LHC data. Read more…

By Jim Collins


Whitepaper:

Sorting Fact from Fiction: HPC-enabled Engineering Simulations, On-premises or in the Cloud

HPC may once have been the sole province of huge corporations and national labs, but with hardware and cloud resources becoming more affordable, even small and mid-sized companies are taking advantage.

Download this Report

Sponsored by ANSYS

Webinar:

Enabling Open Source High Performance Workloads with Red Hat

High-performance workloads, big data, and analytics are increasingly important for finding real value in today's applications and data. Before deploying applications and mining data for mission and business insights, organizations need a high-performance, rapidly scalable, resilient infrastructure foundation that can access data accurately, securely, and quickly from all relevant sources. Red Hat offers technology that supports high-performance workloads on a scale-out foundation that integrates multiple data sources and can move workloads across on-premises and cloud boundaries.

Register to attend this LIVE webinar

Sponsored by Red Hat
