Since 1987 - Covering the Fastest Computers in the World and the People Who Run Them

Sectors » Life Sciences

Heading into ISC16, OpenHPC Releases Latest Stack with 60-plus Packages

Jun 16, 2016 |

SC15 was sort of a muted launch party for OpenHPC – the nascent effort to develop a ‘plug-and-play’ software framework for HPC. There seemed to be widespread agreement the idea had merit, not a lot of knowledge of details, and some wariness because Intel was a founding member and vocal advocate. Next week, ISC16 will mark the next milestone for OpenHPC, which has since grown into a full-fledged Linux Foundation Collaborative Project and today released version 1.0.1 of OpenHPC (build and test tools).

Intel Xeon E7 Balloons In-memory Capacity, Targets Real-Time Analytics

Jun 8, 2016 |

Who crunches more data faster, wins. It’s this drive that cuts through and clarifies the essence of the evolutionary spirit in the computer industry, the dual desire to get to real time with bigger and bigger chunks of data. The locomotive: HPC technologies adapted to enterprise mission-critical data analytics. With its memory capacity of up Read more…

HPC and Big Data Convergence Takes Life at PSC’s Bridges

Jun 1, 2016 |

Roughly three months into early operations, the Bridges computing resource being deployed at the Pittsburgh Supercomputing Center is bearing fruit. Designed to accommodate both traditional HPC and big data analytics, Bridges had supported 245 projects as of May 26. This ramp-up of the NSF-funded ($9.6M) Bridges project is an important step in delivering practical Read more…

GPU-based Deep Learning Enhances Drug Discovery Says Startup

May 26, 2016 |

Sifting the avalanche of life sciences (LS) data for insight is an interesting and important challenge. Many approaches are used with varying success. Recently, improved hardware – primarily GPU-based – and better neural networking schemes are bringing deep learning to the fore. Two recent papers report that deep neural networks are superior to typical machine learning (support vector machine models) in sieving LS data for drug discovery and personalized medicine purposes.

TACC Wrangler Supports Zika Hackathon

May 19, 2016 |

By now most people have heard of Zika, the mosquito-borne disease that can cause fever and birth defects and that threatens to spread to the United States from Latin America. Earlier this week more than 50 data scientists, engineers, and University of Texas students gathered for the “Austin Zika Hackathon” at big data Read more…

Machine Learning Advances Fight Against Cancer

May 18, 2016 |

Developing effective tools against cancer has been a long, complicated endeavor with successes and disappointments. Despite all, cancer remains the leading cause of death worldwide. Now, machine learning and data analytics are being recruited as tools in the effort to fight the disease, and they show significant promise according to two recent papers. In one paper – An Analytics Approach to Designing Combination Chemotherapy Regimens for Cancer – researchers from MIT and Stanford “propose models that use machine learning and optimization to suggest regimens to be tested in phase II and phase III trials.”

Edico Genome to Make DRAGEN Available on New IBM Power Systems for HPC

May 12, 2016 |

SAN DIEGO, Calif., May 12 — Edico Genome, creator of the world’s first bio-IT processor designed to analyze next-generation sequencing data, today announced a collaboration with IBM to make DRAGEN available on the new IBM Power Systems S822LC for high performance computing (HPC). In addition, Edico is now offering the Broad Institute’s Genome Analysis Toolkit (GATK), a software package for analysis of Read more…

Barcelona Supercomputing Center Develops New Bioinformatics Tool Against HIV

May 11, 2016 |

Viruses’ natural mutational agility has long been problematic for established therapies. Determining a therapeutic compound’s effectiveness against a mutated viral pathogen mostly entails empirically screening the mutated virus against candidate compounds. This week researchers from the Barcelona Supercomputing Center and IrsiCaixa, the Catalan AIDS Research Institute, reported developing a bioinformatics method to Read more…

TGAC Installs Largest SGI UV 300 Supercomputer for Life Sciences

May 11, 2016 |

Two weeks ago, The Genome Analysis Centre (TGAC) based in the U.K. turned on the first of two new SGI UV 300 computers. Next week, or thereabouts, TGAC will bring a second identical system online. Combined with its existing SGI UV2000, the new machines will give TGAC the largest SGI system dedicated to life sciences in the world. The upgrade will allow TGAC to significantly shorten the time required to assemble wheat genomes, a core activity in TGAC efforts to enhance worldwide food security.

Technology Test Drive: PNNL Offers Exploratory Licenses

May 10, 2016 |

Signing a two-page agreement and paying just $1,000 can get U.S. companies an opportunity to test drive promising technologies through a new, user-friendly commercialization option being offered at the Department of Energy’s Pacific Northwest National Laboratory. PNNL is the only DOE lab to offer this option, called an exploratory license, which gives companies six months Read more…

Healthcare Professionals Get Cognitive Sooner with Watson Health Financed by IBM

May 9, 2016 |

Perhaps more than any other, the healthcare industry is undergoing dramatic upheaval. Changes in reimbursement models, continued merger and acquisition activity, regulatory requirements, and the increased focus on quality of patient care are forcing many hospitals and healthcare providers to reinvent themselves. Patients and customers are driving change, and healthcare providers struggle to follow their Read more…

ITIF Report Aims to Sway Congress, Promote National HPC Agenda

Apr 28, 2016 |

The Information Technology and Innovation Foundation (ITIF), a Washington D.C. think tank with close ties to the Office of Science and Technology Policy and government broadly, today released an expansive report – The Vital Importance of High-Performance Computing to U.S. Competitiveness – and also held a panel to discuss the report’s recommendations. Notably, many of the panelists are familiar names in the HPC community.

IBM Expands All-Flash Storage; Takes Aim at Cognitive Computing and Cloud

Apr 27, 2016 |

Calling it the foundation of its cloud and cognitive computing strategy, IBM has significantly expanded its all-flash storage platform portfolio with the announcement today of three new array products aimed at managing massive amounts of data associated with cloud-based, high-performance workloads. Targeting hyperscale cloud data centers and cloud service providers (CSPs), the systems utilize IBM’s Read more…

France to Boost Industrial Innovation with New Petascale Supercomputer

Apr 20, 2016 |

This summer, the French Alternative Energies and Atomic Energy Commission (CEA) and 13 leading French industrial companies will boost their research capabilities when they take possession of a new shared supercomputer called COBALT, designed…

Titan Helps Shed Light on Membrane Lipids’ Multiple Roles

Apr 4, 2016 |

Lipid molecules are schizophrenic. One end likes to hang out with a charged crowd (think water); the other prefers neutral neighbors (think fats). Most of us remember those funky illustrations of the bilayer lipid membrane structure that encloses animal cells from high school biology. Recently, the Titan supercomputer was used to show cell membranes may be Read more…

EnterpriseHPC Summit: How and Why HPC Is Burgeoning in the Commercial Sphere

Mar 28, 2016 |

The annual EnterpriseHPC Summit, produced by EnterpriseTech and HPCwire and held in San Diego this week, featured presentations and participation from some of the major thought leaders at the forefront of bringing advanced scale computing into commercial environments. With delegates from leading vendors (Dell, Intel, DDN and EMC, among others) and from end user organizations such Read more…

Making Sense of HPC in the Age of Democratization

Mar 8, 2016 |

These are exciting times for HPC. High-performance computing and its cousin high-productivity computing are expanding such that the previous definitions of HPC as a moving performance target or as the purview of modeling and simulation are breaking down. The democratization of HPC has spurred a lot of focus on the impact that HPC-hatched technologies are Read more…

‘Biomolecular Motor-based’ Computer Promises Speed and Reduced Power

Mar 2, 2016 |

Combinatorial tasks are among the hardest for traditional computers. A good example is finding the optimum path through a large, complicated network. Every possible path must be evaluated, and as datasets grow, the computing time grows exponentially, making some tasks infeasible. One practical example is verification of VLSI (very-large-scale integration) semiconductor circuit design. Read more…
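The combinatorial blow-up described above can be made concrete with a toy sketch (an illustrative example, not drawn from the article): brute-force enumeration of every simple path between two nodes of a fully connected network. The path count grows factorially with network size, which is why exhaustive evaluation quickly becomes infeasible.

```python
from itertools import permutations

def count_simple_paths(n: int) -> int:
    """Count all simple paths from node 0 to node n-1 in a complete
    graph on n nodes by brute-force enumeration (illustrative only).

    Each path is node 0, some ordering of a subset of the n-2
    intermediate nodes, then node n-1, so we enumerate every
    ordered subset of the intermediates.
    """
    intermediates = list(range(1, n - 1))
    count = 0
    for r in range(len(intermediates) + 1):
        for _perm in permutations(intermediates, r):
            count += 1  # each ordering is a distinct simple path
    return count

if __name__ == "__main__":
    # Even for tiny networks the count explodes:
    # n=4 -> 5 paths, n=5 -> 16, n=6 -> 65, n=7 -> 326, ...
    for n in range(2, 8):
        print(n, count_simple_paths(n))
```

Replacing brute force with heuristics or, as the article suggests, massively parallel physical substrates is the usual escape from this growth.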

Registration Now Open for the NCSA 2016 Private Sector Partner Annual Meeting

Feb 24, 2016 |

Feb. 24 — NCSA’s Private Sector Program brings together private, university, and government organizations with the shared goal of increasing competitiveness and economic impact through high-performance computing and data analytics.

When: May 3-5, 2016 (Tuesday – Thursday)
Where: University of Illinois at Urbana-Champaign, NCSA Building, 1205 W. Clark Street, Urbana, IL 61801

Agenda Summary
May 2, Monday evening: PSP partner-only dinner (by specific Read more…

Final Obama Budget Lauds Innovation, Unlocks NSCI Funding

Feb 10, 2016 |

On Tuesday, President Obama released his fiscal year 2017 budget, the final budget of his administration. Broadly focused on America’s economic prosperity and national security goals, the $4.1 trillion spending plan for fiscal year 2017 emphasizes a great number of grand challenges that require robust investments in research and development (R&D); innovation; and STEM education. The science, Read more…