By Luan Schooler, Director of New Play Development & Dramaturgy, Artists Rep
As the radius of knowledge expands, the circumference of ignorance increases. – Japanese saying
As our technological abilities expand ever more rapidly, we have entered entirely new ethical and moral territory. Human progress has always been driven by technological advancement, but only recently have we gained the ability to create beings that are not only like us but may indeed surpass us in intelligence, strength and many other capabilities. It is likely that revolutions now occurring in nanotechnology, genetic science, information technology and the cognitive sciences will converge to alter the fundamental nature of being human, as well as our concept of what it is to be human.
We have been improving human abilities with technology for centuries. For example, first we had spectacles, then contact lenses, then laser surgery, all to improve vision; now, with augmented reality eyewear, we have expanded what vision itself can be. How long will it be before something can be implanted in the body for this purpose?
How many implants, genetic modifications or biomedical enhancements can we have before we are something other than human?
On the other side of the issue are the machine-beings that we are creating: software programs that successfully mimic humans on the phone; robots that replace humans in many occupations (not only factory workers, but also lawyers, financial advisors, doctors and more); programs that write poetry a majority of people cannot distinguish from poetry written by humans; computers that are smarter and faster than any human. The ability of machines to learn and expand their own capabilities autonomously is growing rapidly, and at some point they may cross into territory we have considered exclusively human.
If part of our human value comes from what we do—our actions, thoughts, judgments, creativity—then how do we relate to machine-beings that can functionally accomplish the same things? If a machine is more intelligent than a human being, does it deserve more respect? Is biological ability necessarily better than technological?
We have not yet created beings that are fully autonomous, that are curious or imaginative, that are guided by their heart, that suffer, that have faith. But we are trying. Perhaps by wrestling with the questions raised by technology’s advance, we will come to understand more deeply what it is to be human.
Ten Ethical Issues Related to the Development of Artificial Life
- Labor – When many jobs can be performed through machine intelligence, employment for humans is reduced. How will unemployed humans earn a living?
- Inequality – Wealth will be concentrated in the hands of the few with ownership of artificial intelligence. How can income be distributed fairly?
- Humanity – We are already interacting with machines as if they were human in a variety of circumstances (e.g., tech support). What is the risk of blurring the lines between humans and non-humans?
- Intelligence Errors – Machine learning is thus far limited by its design and its input. How do we know a system's programming is truly comprehensive?
- Programmed Bias – When human programmers are unaware of their own biases, will they create programs that reinforce unfair policies, ideas, and practices?
- Cybersecurity – Systems are vulnerable to illicit infiltration and results can be manipulated. How can we trust the information?
- Unintended Consequences – If a machine is given a goal to achieve without a complete understanding of the ramifications of each possible outcome, will the machine achieve the goal in an acceptable way?
- Liability – If a machine goes awry, who is responsible? The system designer? The builder? The robot itself?
- Greater Intelligence – If machines become more intelligent than we are, will we be able to control them? And to what degree will it be appropriate to follow their advice rather than our own judgment?
- AI Rights – When we create machines that have greater intelligence than our own, that learn from experience, that have their own reasons to carry on, will it be immoral or unethical to unplug them?
Marjorie Prime runs Feb 7 – Mar 5 at Artists Repertory Theatre