By Sid Karin, NPACI Director
Washington, D.C. — Bill Joy is worried about the future. Joy, the co-founder and chief scientist of Sun Microsystems, who also co-chaired the President’s Information Technology Advisory Committee, is worried about whether there will be any future at all for humanity, in view of the destructive possibilities that seem to him to be contained within new technologies – specifically, genetics, nanotechnology, and robotics. He expressed his worries in a long feature article in the April 2000 issue of Wired magazine, titled “Why the Future Doesn’t Need Us.” His article is getting a lot of buzz. He has voiced the same concerns in a number of forums sponsored by the Aspen Institute, and in a recent CNN segment in which he was interviewed on the subject. His newly discovered anguish is clearly striking a chord.
Let me say at the outset that I disagree with most of his analyses and with many of the pundits to whom he appeals for support in his arguments. Yet I believe that he has begun an invaluable discussion, one to which all of us in science and engineering have a responsibility to contribute. I do not believe that the avatars of technology are inevitably creating monsters, whether Frankenfoods, tiny self-replicating destroyers, or a race of antagonistic and intelligent robots. Joy asserts that genetic, nano-, and robo-technologies are more dangerous than nuclear, biological, and chemical warfare technologies. While the latter also threaten humanity, the former are supposedly even worse, because they are “more accessible” and “self-replicating.”
Why name only these? The very technology to which both Bill Joy and I are closest – computer technology – has already produced complex moral and ethical dilemmas. The issue of privacy is only one of these, and it takes us very quickly from one hard question to the next. For example, veterinarians now regularly implant ID chips in people’s pets. Fancier chips are being implanted in various fauna for ecological research purposes – chips that allow an animal to be located via the Global Positioning System. Should we continue to implant chips in animals? What about implanting them in small children? What about being able to locate everyone, all the time?
What is right or wrong here? What is acceptable and what is unacceptable, and to whom? More important, how should we who develop these technologies view our own responsibilities? Who should decide what the role of developers is in the process of adopting and adapting new technologies? Are we, as developers, not obligated to define our roles, or at least to comment? I believe we are, and I think that is what is so important in Bill Joy’s speaking out.
Those of us who develop new technology have a professional responsibility to consider the social implications of what we do. Joy notes that our technologies also contain “untold promise.” We should be telling that promise, making the benefits explicit, as well as worrying about the downside.
Yes, we should worry about misuse and abuse of genetics, nanotechnology, and robotics. Moreover, we should still worry about nuclear proliferation and chemical and biological weapons, however difficult these technologies may be to control. While we’re at it, we should ask whether an unwillingness to confront promise as well as peril is keeping us from making the best use of technologies we already have in hand. I am thinking here, for example, of nuclear power. Such questions might well be revisited in light of the planet’s ever more acute energy concerns.
If we scientists and engineers leave the future of humanity to professional ethicists, politicians, pundits, patent attorneys, and a phalanx of popular poseurs, all pronouncing upon the evils we may have done, we will be guilty of shirking a basic social and professional responsibility. In that case, we will deserve what we get. Bill Joy is right to raise his concerns. Let us join the debate!
For more information on NPACI (National Partnership for Advanced Computational Infrastructure), see http://www.npaci.edu