At SC19: Who’s Accountable ‘When Technology Kills’?

By Doug Black

December 2, 2019

“One of the things we are witnessing is the compute requirement for (AI) training jobs is doubling every three-and-a-half months. So we were very impressed with Moore’s Law doubling every 18 months, right? This thing is doubling every three and a half months. Obviously, it’s unsustainable. If we keep at that rate for sustained periods of time, we will consume every piece of energy the world has just to do this.”

– Dario Gil, director, IBM Research
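Gil’s numbers reward a moment of arithmetic. A quantity that doubles every 3.5 months grows as 2^(t/3.5) after t months, so the gap with Moore’s Law’s 18-month cadence compounds quickly. A minimal back-of-the-envelope sketch in Python (our illustration, not anything presented at SC19):

    # Growth factor after t months with doubling period d (months): 2 ** (t / d).
    for months in (12, 24, 36):
        ai_compute = 2 ** (months / 3.5)   # AI training compute, per Gil's figure
        moores_law = 2 ** (months / 18)    # classic Moore's Law cadence
        print(f"{months:2d} months: AI compute x{ai_compute:,.0f} "
              f"vs. Moore's Law x{moores_law:.1f}")

After three years the 3.5-month curve has multiplied compute more than a thousandfold while the Moore’s Law curve has merely quadrupled, which is the sense in which Gil calls the trend unsustainable.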

The relationship between humans and machines is inarguably one of the great issues of this century. Machine intelligence is rising at supersonic speed (see above); human intelligence is, at best, rising slowly. Machines, though bereft of EQ, can apply unimaginably, inhumanly high (and getting higher) IQ to their assigned tasks; human EQ can be nurtured, but our ability to comprehend and synthesize large amounts of complex information has limits. Machines are capable of handling ever larger spheres of our work and personal lives; people are ceding more personal and work tasks to machines. These trend lines are firmly in place and – as things stand now – will only accelerate.

“When Technology Kills,” the conference plenary at last week’s SC19 in Denver, brought together three technology intellectuals to take up thorny AI-related questions: how much autonomy ML, AI, IoT and smart data should be given; what regulations should govern software development and data privacy; what additional training people should receive to manage new software systems; and ultimately, who is responsible when software fails, violates our legal rights, or causes property damage, injury or loss of life.

One would assume these questions weigh heavily on the minds of the HPC-AI community, many of whose most capable and influential technologists were gathered at SC19. But the plenary session attracted a relatively light turnout – tending to confirm the perception that the AI industry is far more focused on developing systems with superhuman capabilities than on the social and ethical implications of those systems.

To be fair, technologists above all are tasked with building powerful technology that disrupts and drives their organizations to first-mover status. It’s also true that philosophical discussions of AI ethics can be somewhat ethereal, light on actionable insight. But a growing chorus of voices in industry analyst, academic, political and journalistic circles is calling for control over the ways in which humans and machines will co-exist in the decades to come. Their warning to the technology industry: ignore issues of AI ethics and accountability at your peril.

In fact, the common thread running through the plenary discussion centered more on people than on machines, and on the need to balance the legitimate though often conflicting interests of technology’s constituent groups.

SC19 Plenary panel, from left: Keri Savoca, freelance technical writer; Eric Hunter, Bradford & Barthel; Erin Kenneally, Elchemy; Ben Rothke, Tapad

As Eric Hunter, futurist and director of knowledge, innovation and technology strategies at consulting firm Bradford & Barthel LLP, said, “Oftentimes, people will focus on the technology itself. (But) it really comes down to the individuals involved, the humans involved. That can sound profoundly redundant, but I’m saying that because you can’t divorce human behavior and technology. And it’s about the individuals that are adapting to these technologies, creating them, interacting with them. And then when something fails, what were the decisions that were made? And what can be learned from the individuals involved?”

Early in the plenary, Erin Kenneally, CEO of advisory group Elchemy, offered a framing thought for the session:

“I think, when you use the terms responsibility and blame, implicit in that is this notion of lack of trust. And I think one of the major problems that we’re facing now is this gap between our technology and our laws. I (call it) a gap between our expectations and our capabilities… Imagine a graph and it’s… got two lines and the upper line is very steeply sloping, approximating Moore’s Law. And that represents the rate of change of technology capabilities… At the bottom, you’ve got, not quite flatlining but slightly up-sloping, a line that represents our laws, and that is our expectations. So there’s this gap… and this is where we have conflicts of rights and interests.

“Let’s take deploying AI edge devices to monitor your behavior. The issue is my rights. My rights and my interest in privacy may conflict with my employer’s right and interest in the security of their enterprise or their commercial free speech rights. And that may conflict with my fellow citizens’ interest in their own security and privacy, which may conflict with the government’s interests in securing critical infrastructure. We see these instances all over the place.

“Like I mentioned with AI, we’ve got recommender systems, scoring systems, classification systems that are all trying to build predictive models about us. At the end of the day, I think it’s important to realize that technology is no longer just providing affordances. It’s not just spell-checking our documents. It’s actually making decisions and taking actions by and for and with us, and those impact our rights and interests. And oftentimes, those decisions and those actions are being done in a very asymmetrical, opaque manner. And they have impact, and sometimes we’re not certain of those impacts. So we’re dealing with this widening delta between our capabilities and our expectations. And I think that’s what we have to worry about.

“I think there are at least three consequences… You get increasing tensions between legitimate stakeholders. It’s easy when it’s good guy versus bad guy, when we know someone’s in the right and someone’s in the wrong; but when you’ve got good guy versus good guy versus good guy, how do you resolve those issues? You’ve also got an inefficient avoidance of risk, if organizations don’t know whether what they’re doing is violating the law. And then finally, you’ve got an undermining of ordering forces: people either receding from the marketplace and not trusting technology, or going rogue and extrajudicial and taking the law into their own hands. And we don’t want any of those situations to happen.

“There are at least five ‘trust mechanisms’ that we can rely on.

“Number one, we need incentives to build secure software, not just race to meet time-to-market pressures. We need to rely on responsible research and development. I think that’s critical.

“Secondly, we need to do a better job of getting real-world, longitudinal, large-scale data in front of researchers and developers to test and evaluate their technologies. …You can build the greatest algorithm since sliced cheese, but if you don’t have good data going in, you’re going to get bad results.

“We need to do a better job from a smart governance perspective with regard to the collection, use and disclosure of data, and we need to be more innovative from a legal perspective with regard to liability and holding people responsible.

“We need to have better convening and coordination among academia, research and the private sector around standards and best practices.

“And then finally, I think ethics underpins all of those… I like to provide people with ‘framing thoughts.’ If you think of ethics, think of it in terms of a three-legged stool: principles, application of those principles, and then enforcement. We’re doing okay on the first leg, a little bit better on the second, and we need to do a lot of work on the third.”
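Kenneally’s two-line graph is easy to reproduce. As a rough sketch in Python (the slopes below are invented for illustration; she gave no numbers), let technology capability compound on a Moore’s-Law-style curve while legal “expectations” creep up linearly, and watch the delta widen:

    # Illustrative only: both slopes are hypothetical, chosen to mimic the
    # graph Kenneally describes, not taken from any dataset.
    for years in range(0, 11, 2):
        capability = 2 ** (years * 12 / 18)   # exponential, Moore's Law pacing
        expectations = 1.0 + 0.05 * years     # slow, nearly flat linear growth
        print(f"year {years:2d}: capability {capability:6.1f}x, "
              f"expectations {expectations:.1f}x, "
              f"gap {capability - expectations:6.1f}")

However the parameters are chosen, the exponential term dominates within a few years – the widening delta between capabilities and expectations she warns about.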

We’ll end this brief review of the hour-long discussion with a comment from Ben Rothke, author and senior information security specialist at marketing consulting firm Tapad, who placed matters of ethics and security squarely at the feet of senior vendor management.

“When you look at it from a corporate governance perspective, when a large organization has record profits, we see senior management often reaps the (financial) benefits,” he said. “It needs to trickle down also that they’re responsible for the safety, the information security. Often with a CEO in the aviation sector, it’s a matter of: does his organization have a safety culture? Where there are a lot of accidents and a lot of incidents, it’s because management didn’t develop a safety culture. And so too that works in information security, privacy and everything. If management takes it seriously, if it’s an imperative to them, then it will trickle down. So really, at the end of the day, everything starts at the top. You have to build that culture, and where that exists, it will trickle down. If they take it seriously, the rest of the organization will take it seriously.”
