What You Missed at Last Week’s Leverage Big Data Event

By Nicole Hemsoth

May 27, 2014

If there’s one term that finds its way into nearly every conversation we have with vendors and end users in both the research and commercial spheres, it is leverage. Although it’s a general term, it’s being applied to everything from data, systems, and software to, of course, talent.

Accordingly, our inaugural event, held at the Park Hyatt Aviara last week, revolved around this one word—and all that it implies. We purposefully chucked the standard fare of “what is big data” and, further, “what does big data mean to HPC” to explore in detail just how emerging and existing technologies at the top tier of computing are offering leverage to researchers and enterprise end users.

It’s amazing what happens when one kicks off a conference with a general rule to “avoid the general.” As soon as the hand-picked group was on the same page that we wouldn’t bother with questions around definitions or delineations, the real work began. What emerged, both organically and through subtle direction, was the theme that the cross-industry challenges of ever more complex, harder-to-move, harder-to-manage data offered far more in the way of a common platform for discussion than one might expect. We purposefully mixed our audience and speakers—from life sciences, manufacturing, research computing, financial services, government and beyond—to dig into how leveraging data in new, creative, and more productive ways is a common goal, and how similar tools and approaches can be adopted across wider industry contexts than any of us might have originally thought.

Dr. Jack Collins at the opening keynote.

We had a number of notable special guests, speakers, and panel participants who offered diverse perspectives to tie the themes together, which meant that new topics emerged as we explored the cross-industry hook that pulled everything together. Dr. Jack Collins, Director of the Advanced Biomedical Computing Center at the National Cancer Institute, revealed how new sources of complex data are pushing his team closer to cures at his home institution while creating new challenges along the way, particularly in terms of finding adequate talent.

While his group’s mission is sweeping—a definite set of grand science challenges—the primary barrier at the National Cancer Institute isn’t compute, storage, or network limitations; it’s a shortage of critical thinkers, problem solvers, and people with a passion for lifelong learning and creative approaches to analysis. While Collins did touch on how some of the hardware and software challenges affect his team’s ability to forge ahead, the top problem is quite simply people.

Interestingly, this “people problem” inadvertently became the subject of the second half of a panel session that included Dr. Ari Berman of BioTeam; Merle Giles, who directs the Economic Impact and Private Sector Programs division at the National Center for Supercomputing Applications; Steve Yatko, former head of worldwide IT R&D at Credit Suisse and CEO of Oktay Technologies; Jack Levis, Senior Director of Process Management at UPS; and Chetan Gadgil of GE’s Software and Analytics division. Each of the panelists shared how progress toward data-driven goals was hindered by a lack of people with more than just pure coding skills. There appears to be a shortage of people who have the skills and the mindset—a much more nebulous thing to define—to extract real meaning from their data and their hardware and software tools. The general consensus of the group, voiced best by Jack Levis of UPS, was that “big data” as a concept doesn’t matter. It’s all just data; the problem revolves around the side issues of getting the tooling and the people in place.

Dr. Kirk Borne, Professor of Astrophysics and Computational Scientist at George Mason University, made similar observations about the talent shortage in his session, “From Astrophysics to Airlines,” but the takeaway from that session, which had everyone’s full attention given its scope, was about possibility. Borne showed the direct scientific and commercial benefits of large-scale data analysis through telescope and astrophysics examples before broadening out to share how these same approaches translate directly into manufacturing, transportation and beyond.

On the tools front, Datanami editor Alex Woodie asked some tough questions of San Diego Supercomputer Center researchers Dr. Glenn Lockwood and Predictive Analytics Center of Excellence head Dr. Natasha Balac. While the hype versus reality of Hadoop was at the heart of the panel (showing, by the way, that Hadoop has a long way to go to live up to its promises in real-world large-scale settings), the duo set the groundwork for the commercial computing case studies and sessions that followed. In other words, the event steered clear of the “Hadoop will save the world” mentality that so often eclipses actual use cases—of which there are only a relative handful in large-scale, full-bore production.

Presentations from others, including SAP’s Colin Dover, hit on similar points, and the case study presentations from SAP, Adaptive Computing, SGI, Bright Computing, IBM, Quanta, Penguin, and DDN offered great value for attendees, many of them end users in search of ways to meet the data deluge.

Take a look at more photos from the event at https://www.facebook.com/media/set/?set=a.592789734161420.1073741828.124760547631010&type=3&uploaded=353 — we’ll see you in September as we roll into the topic of Enterprise HPC. Very exciting…

Thanks again to our many sponsors and nGage events for making this such a success!
