At SC19: Who’s Accountable ‘When Technology Kills’?

By Doug Black

December 2, 2019

“One of the things we are witnessing is the compute requirement for (AI) training jobs is doubling every three-and-a-half months. So we were very impressed with Moore’s Law doubling every 18 months, right? This thing is doubling every three and a half months. Obviously, it’s unsustainable. If we keep at that rate for sustained periods of time, we will consume every piece of energy the world has just to do this.”

– Dario Gil, director, IBM Research
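To put those figures in perspective, here is a minimal back-of-the-envelope sketch (a Python illustration assuming the two doubling periods Gil quotes, 3.5 months and 18 months, are exact) of what each rate implies over a single year:

```python
# Back-of-the-envelope comparison of the growth rates Gil cites.
# Assumed doubling periods: 3.5 months (AI training compute) and
# 18 months (the Moore's Law cadence he references).

def annual_growth(doubling_months: float) -> float:
    """Multiplicative growth over 12 months for a given doubling period."""
    return 2 ** (12 / doubling_months)

print(f"AI training compute: ~{annual_growth(3.5):.1f}x per year")  # ~10.8x
print(f"Moore's Law pace:    ~{annual_growth(18):.2f}x per year")   # ~1.59x
```

At those assumed rates, training compute grows roughly ten-fold per year against less than two-fold for the traditional Moore’s Law cadence, which is the arithmetic behind Gil’s “unsustainable” warning.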

The relationship between humans and machines is inarguably one of the great issues of this century. Machine intelligence is rising at supersonic speed (see above); human intelligence is, at best, rising slowly. Machines, though bereft of EQ, can apply unimaginably, inhumanly high (and still climbing) IQ to their assigned tasks; human EQ can be nurtured, but our ability to comprehend and synthesize large amounts of complex information has limits. Machines are capable of handling ever larger spheres of our work and personal lives, and people are ceding more of both to them. These trend lines are firmly in place and, as things stand now, will only accelerate.

“When Technology Kills,” the conference plenary at last week’s SC19 in Denver, brought together three technology intellectuals to take up thorny AI-related questions: How much autonomy should ML, AI, IoT and smart data be given? What regulations should govern software development and data privacy? What additional training should people receive to manage new software systems? And ultimately, who is responsible when software fails, violates our legal rights, or causes property damage, injury or loss of life?

One would assume these questions weigh heavily on the minds of the HPC-AI community, many of whose most capable and influential technologists were gathered at SC19. But the plenary session attracted a relatively light turnout – tending to confirm the perception that the AI industry is far more focused on developing systems with superhuman capabilities than on the social and ethical implications those systems pose.

To be fair, technologists are tasked above all with building powerful technology that disrupts and drives their organizations to first-mover status. It’s also true that philosophical discussions of AI ethics can be somewhat ethereal and light on actionable insight. But a growing chorus of voices in industry analyst, academic, political and journalistic circles is calling for deliberate control over the ways in which humans and machines will co-exist in the decades to come. Their warning to the technology industry: avoid issues of AI ethics and accountability at your peril.

In fact, the common current running through the plenary discussion centered more on people than machines, and on the need to balance the legitimate though often conflicting interests of technology’s constituent groups.

SC19 Plenary panel, from left: Keri Savoca, freelance technical writer; Eric Hunter, Bradford & Barthel; Erin Kenneally, Elchemy; Ben Rothke, Tapad

As Eric Hunter, futurist and director of knowledge, innovation and technology strategies at consulting firm Bradford & Barthel LLP, said, “Oftentimes, people will focus on the technology itself. (But) it really comes down to the individuals involved, the humans involved. That can sound profoundly redundant, but I’m saying that because you can’t divorce human behavior and technology. And it’s about the individuals that are adapting to these technologies, creating them, interacting with them. And then when something fails, what were the decisions that were made? And what can be learned from the individuals involved?”

Early in the plenary, Erin Kenneally, CEO of advisory group Elchemy, offered what could serve as a coda for the session:

“I think, when you use the terms responsibility and blame, implicit in that is this notion of lack of trust. And I think one of the major problems that we’re facing now is this gap between our technology and our laws. I (call it) a gap between our expectations and our capabilities… Imagine a graph and it’s… got two lines and the upper line is very steeply sloping, approximating Moore’s Law. And that represents the rate of change of technology capabilities… At the bottom, you’ve got, not quite flatlining but slightly up-sloping, a line that represents our laws, and that is our expectations. So there’s this gap … and this is where we have conflicts of rights and interests.

“Let’s take deploying AI edge devices to monitor your behavior. The issue is my rights. My rights and my interest in privacy may conflict with my employer’s right and interest in the security of their enterprise or their commercial free speech rights. And that may conflict with my fellow citizens’ interest in their own security and privacy, which may conflict with the government’s interests in securing critical infrastructure. We see these instances all over the place.

“Like I mentioned with AI, we’ve got recommender systems, scoring systems, classification systems that are all trying to build predictive models about us. At the end of the day, I think it’s important to realize that technology is no longer just providing affordances. It’s not just spell checking our documents. It’s actually making decisions and taking actions by and for and with us, and those impact our rights and interests. And oftentimes, those decisions and those actions are being done in a very asymmetrical, opaque manner. And they have impact and we’re not certain of those impacts, sometimes. So we’re dealing with this widening delta between our capabilities and our expectations. And I think that’s what we have to worry about.

“I think there are at least three consequences… You get increasing tensions between legitimate stakeholders. It’s easy when it’s good guy versus bad guy; we know someone’s in the right and someone’s in the wrong. But when you’ve got good guy versus good guy versus good guy, how do you resolve those issues? You’ve also got inefficient avoidance of risk if organizations don’t know whether what they’re doing is violating the law. And then finally, you’ve got an undermining of ordering forces: people either receding from the marketplace and not trusting technology, or going rogue and extrajudicial and taking the law into their own hands. We don’t want any of those situations to happen.

“There are at least five ‘trust mechanisms’ that we can rely on.

“Number one, we need incentives to build secure software and not just race to meet time-to-market pressures. We need to rely on responsible research and development. I think that’s critical.

“Secondly, we need to do a better job of getting real-world, longitudinal, large-scale data in front of researchers and developers to test and evaluate their technologies… You can build the greatest algorithm since sliced cheese, but if you don’t have good data going in, you’re going to get bad results.

“We need to do a better job from a smart governance perspective with regard to the collection, use and disclosure of data, and we need to be more innovative from a legal perspective with regard to liability and holding people responsible.

“We need to have better convening and coordinating between academia, research and the private sector around standards and best practices.

“And then finally, I think ethics underpins all of those… I like to provide people with ‘framing thoughts.’ If you think of ethics, think of it in terms of a three-legged stool: principles, application of those principles, and then enforcement. We’re doing okay on the first leg, a little bit better on the second, and we need to do a lot of work on the third.”

We’ll end this brief review of the hour-long discussion with a comment from Ben Rothke, author and senior information security specialist at marketing consulting firm Tapad, who placed matters of ethics and security squarely at the feet of senior vendor management.

“When you look at it from a corporate governance perspective, when a large organization has record profits, we see senior management often reaps the (financial) benefits there,” he said. “Then it needs to trickle down also that they’re responsible for the safety, the information security. And often, for the CEO in the aviation sector, it’s a matter of: does his organization have a safety culture? Where there are a lot of accidents and a lot of incidents, it’s because management didn’t develop a safety culture. And so too that works in information security, privacy and everything. If management takes it seriously, if it’s an imperative to them, then it will trickle down. So really, at the end of the day, everything starts at the top; you have to build that culture. And where that exists, it will trickle down. If they take it seriously, the rest of the organization will take it seriously.”
