Sept. 24, 2001 -- If you create a machine that is capable of
independent reasoning, have you created life? Do you have a responsibility to
that life or have you merely assembled another piece of clever hardware that
will be rendered obsolete by the next new thing?
In the Steven Spielberg-Stanley Kubrick film AI (as in
artificial intelligence), a robot manufacturer creates David, a
synthetic boy who is programmed to love. His human owner activates a program
that irreversibly fixes the cyberkid's affections on her.
But by designing and building David, the robot maker has
created another Frankenstein's monster. The apparently self-aware
"mecha" (short for "mechanical") aches for love from his human
"mother" and yearns like Pinocchio to be made a "real" boy.
The film raises both intriguing and troubling philosophical
questions about what it means to be human, to have a sense of self, and to be a
unique, independent being worthy of respect and rights under the law.
When David, acting to save himself from the taunts and threats
of flesh-and-blood boys, accidentally injures his owners' son, he is abandoned
in the woods and left to fend for himself. He finds himself in the company of
freakish, broken, half-formed robots that stay "alive" by scavenging
spare parts from a dump.
But just because David cries and pleads to stay with the woman
he calls Mommy, and flees when he is tracked down by bounty hunters, are his
instincts of terror and self-preservation genuine, or are they merely a
brilliant mechanical and electronic simulation of how a real boy would respond?
Does it matter?
I Think, Therefore I Am?
Nick Bostrom, PhD, a lecturer in philosophy at Yale University
in New Haven, Conn., says it does matter.
"I think that as soon as an entity becomes sentient --
capable of experiencing pain or pleasure -- it gets some sort of moral status,
just by virtue of being able to suffer," Bostrom tells WebMD. "Even
though animals don't have human rights -- and most of us think it's acceptable
to use them for medical research -- there are still limits. We don't allow
people to torture animals for no reason whatsoever."
Frank Sudia, JD, has slightly different criteria. He says the
ability to make and act on one or more choices out of multiple options, and the
ability to decide which of thousands of possibilities is the best one to use in
an unforeseen situation, may be a basic, working definition of what it means to
be alive.
"If the machine has the power of self-production -- if it
can seek its own goals or even pick its own goals from some list of goals it
reads about in the newspaper [and decides], 'Oh, I want to look like Madonna,'
-- I think that this ability to choose, guided however it might be, is
indistinguishable from what we consider to be our sense of self," he tells
WebMD.