At SC19: Who’s Accountable ‘When Technology Kills’?

By Doug Black

December 2, 2019

“One of the things we are witnessing is the compute requirement for (AI) training jobs is doubling every three-and-a-half months. So we were very impressed with Moore’s Law doubling every 18 months, right? This thing is doubling every three and a half months. Obviously, it’s unsustainable. If we keep at that rate for sustained periods of time, we will consume every piece of energy the world has just to do this.”

– Dario Gil, director, IBM Research

The relationship between humans and machines is inarguably one of the great issues of this century. Machine intelligence is rising at supersonic speed (see above); human intelligence is, at best, rising slowly. Machines, though bereft of EQ, can apply unimaginably, inhumanly high (and getting higher) IQ to their assigned tasks; human EQ can be nurtured, but our ability to comprehend and synthesize large amounts of complex information has limits. Machines are capable of handling larger spheres of our work and personal lives; people are ceding more personal and work tasks to machines. These trend lines are firmly in place and – as things stand now – will only accelerate.
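
To put Gil's figures in perspective, here is a minimal back-of-the-envelope sketch in Python. The doubling periods are taken from his quote above; the helper function and names are our own illustration, not anything presented at the session:

def annual_factor(doubling_period_months: float) -> float:
    """Yearly multiplication factor implied by a given doubling period."""
    return 2 ** (12 / doubling_period_months)

ai_training = annual_factor(3.5)  # figure assumed from Gil's quote: ~10.8x per year
moores_law = annual_factor(18)    # figure assumed from Gil's quote: ~1.6x per year

print(f"AI training compute: ~{ai_training:.1f}x per year")
print(f"Moore's Law pace:    ~{moores_law:.1f}x per year")

Compounded over five years, the faster rate multiplies compute requirements by roughly 145,000x, versus roughly 10x at the Moore's Law pace, which is why Gil calls the trajectory unsustainable.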

“When Technology Kills,” the conference plenary at last week’s SC19 in Denver, brought together three technology intellectuals to take up thorny AI-related questions: how much autonomy ML, AI, IoT and smart data should be given, what regulations should govern software development and data privacy, what additional training people should receive to manage new software systems, and, ultimately, who is responsible when software fails, violates our legal rights, or causes property damage, injury or loss of life.

One would assume these questions weigh heavily on the minds of the HPC-AI community, many of whose most capable and influential technologists were gathered at SC19. But the plenary session attracted a relatively light turnout – tending to confirm the perception that the AI industry is far more focused on developing systems with superhuman capabilities than on the social and ethical implications those systems pose.

To be fair, technologists above all are tasked with building powerful technology that disrupts and drives their organizations to first-mover status. It’s also true that philosophical discussions of AI ethics can be somewhat ethereal, light on actionable insight. But a growing chorus of voices is rising in industry analyst, academic, political and journalistic circles about controlling the ways in which humans and machines will co-exist in the decades to come. Their warning to the technology industry: avoid issues of AI ethics and accountability at your peril.

In fact, the common current running through the plenary discussion centered more on people than machines, and on the need to balance the legitimate though often conflicting interests of technology’s constituent groups.

SC19 Plenary panel, from left: Keri Savoca, freelance technical writer; Eric Hunter, Bradford & Barthel; Erin Kenneally, Elchemy; Ben Rothke, Tapad

As Eric Hunter, futurist and director of knowledge, innovation and technology strategies at consulting firm Bradford & Barthel LLP, said, “Oftentimes, people will focus on the technology itself. (But) it really comes down to the individuals involved, the humans involved. That can sound profoundly redundant, but I’m saying that because you can’t divorce human behavior and technology. And it’s about the individuals that are adapting to these technologies, creating them, interacting with them. And then when something fails, what were the decisions that were made? And what can be learned from the individuals involved?”

Early in the plenary, Erin Kenneally, CEO of advisory group Elchemy, offered what became the theme of the session:

“I think, when you use the terms responsibility and blame, implicit in that is this notion of lack of trust. And I think one of the major problems that we’re facing now is this gap between our technology and our laws. I (call it) a gap between our expectations and our capabilities… Imagine a graph and it’s… got two lines and the upper line is very steeply sloping, approximating Moore’s Law. And that represents the rate of change of technology capabilities… At the bottom, you’ve got, not quite flatlining but slightly up-slope, a line that represents our laws, and that is our expectations. So there’s this gap… and this is where we have conflicts of rights and interests.

“Let’s take deploying AI edge devices to monitor your behavior. The issue is my rights. My rights and my interest in privacy may conflict with my employer’s right and interest in the security of their enterprise or their commercial free speech rights. And that may conflict with my fellow citizens’ interest in their own security and privacy, which may conflict with the government’s interests in securing critical infrastructure. We see these instances all over the place.

“Like I mentioned with AI, we’ve got recommender systems, scoring systems, classification systems that are all trying to build predictive models about us. At the end of the day, I think it’s important to realize that technology is no longer just providing affordances. It’s not just spell checking our documents. It’s actually making decisions and taking actions by and for and with us, and those impact our rights and interests. And oftentimes, those decisions and those actions are being done in a very asymmetrical, opaque manner. And they have impact and we’re not certain of those impacts, sometimes. So we’re dealing with this widening delta between our capabilities and our expectations. And I think that’s what we have to worry about.

“I think there are at least three consequences… You get increasing tensions between legitimate stakeholders. It’s easy when it’s good guy versus bad guy, where we know someone’s in the right and someone’s in the wrong; but when you’ve got good guy versus good guy versus good guy, how do you resolve those issues? You’ve also got inefficient avoidance of risk if organizations don’t know whether what they’re doing violates the law. And then finally, you’ve got an undermining of ordering forces: people either receding from the marketplace and not trusting technology, or going rogue and extrajudicial and taking the law into their own hands. And we don’t want any of those situations to happen.

“There are at least five ‘trust mechanisms’ that we can rely on.

“Number one, we need incentives to build secure software and not just race against time-to-market pressures. We need to rely on responsible research and development. I think that’s critical.

“Secondly, we need to do a better job of getting real-world, longitudinal, large-scale data in front of researchers and developers to test and evaluate their technologies. …You can build the greatest algorithm since sliced cheese, but if you don’t have good data going in, you’re going to get bad results.

“We need to do a better job from a smart governance perspective with regard to the collection, use and disclosure of data, and we need to be more innovative from a legal perspective with regard to liability and holding people responsible.

“We need better convening and coordination among academia, research and the private sector around standards and best practices.

“And then finally, I think ethics underpins all of those… I like to provide people with ‘framing thoughts.’ If you think of ethics, think of it in terms of a three-legged stool: principles, application of those principles and then enforcement. We’re doing okay on the first leg, a little bit better on the second, and we need to do a lot of work on the third.”

We’ll end this brief review of the hour-long discussion with a comment from Ben Rothke, author and senior information security specialist at marketing consulting firm Tapad, who placed matters of ethics and security squarely at the feet of senior vendor management.

“When you look at it from a corporate governance perspective, when a large organization has record profits, we see senior management is often quick to reap the (financial) benefits there,” he said. “But it also needs to trickle down that they’re responsible for safety and information security. For a CEO in the aviation sector, it’s often a matter of whether his organization has a safety culture. Where there are a lot of accidents and a lot of incidents, it’s because management didn’t develop a safety culture. And the same goes for information security, privacy and everything else. If management takes it seriously, if it’s an imperative to them, then it will trickle down. So really, at the end of the day, everything starts at the top; you have to build that culture. And where that exists, it will trickle down. If they take it seriously, the rest of the organization will take it seriously.”
