Are You Actually an Android?

By: Dave Roos
A scene from HBO's "Westworld." John P. Johnson/HBO

The hit HBO series "Westworld" imagines a futuristic Old West theme park inhabited by a cast of strikingly humanoid robots. People visit Westworld to act out their darkest cowboy fantasies, sleeping with seductive robo-harlots and shooting up bearded bad guys — or good guys, if that's your thing.

The trouble begins when a handful of Westworld's android hosts start to "forget" that they're really machines and yearn to break free from the hideous treatment inflicted by the guests. All of which raises an interesting question: If an android is so highly evolved that it thinks like a human, laughs like a human, hurts like a human and even loves like a human, then where is the line between man and machine?

Watching a mind-bending show like "Westworld" or other near-future artificial intelligence (AI) fantasies like "Ex Machina" and "Her," you might even start to wonder, "Could I actually be a robot? How would I even know?"

"In my view, what we think of as consciousness is not unique to humanity," says David Atkinson, senior research scientist with the Florida Institute for Human & Machine Cognition. "Machines will someday behave as if they are conscious. They may even claim, as we do, to be self-aware. How could we prove them wrong? I believe that you are conscious because I believe you are like me, and I believe I am conscious."

Atkinson previously worked for NASA at the Jet Propulsion Laboratory overseeing the space agency's basic research programs in artificial intelligence and robotics. Like many of his AI colleagues, Atkinson sees the human brain as nothing more (or less) than an electrochemical supercomputer: "A very, very sophisticated computer with complexity that we dream of understanding one day," he says.

Most of us believe that self-awareness is proof of our humanity. "I think, therefore I am," as Descartes wrote. But that depends on how you define "thought." Some would argue that our best ideas and deepest desires can't be separated from the flesh-and-blood computer that creates and stores them.

"Your brain is composed of 100 billion neurons," says Jeff Clune, director of the Evolving Artificial Intelligence Lab at the University of Wyoming. "Which neurons are connected with which other neurons determines whether you prefer Shakespeare or USA Today, whether you fall in love, whether you prefer chocolate versus vanilla ice cream. Everything that is 'you' is contained in this fantastically complex tangle of neurons."

Clune believes that it's "inevitable" that we will one day create AI that rivals human intelligence and attains true consciousness. Computer scientists have already designed artificial neural networks that enable machines to autonomously learn in the same way that a child learns, by processing information from the world around them.

Clune's own lab designed a deep neural network that allowed a machine to learn how to recognize random images and then generate its own artistic renderings. Google tapped similar technology for its Deep Dream Generator.
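To make those "virtual neurons" a little more concrete, here is a minimal sketch in Python (using NumPy) of a tiny artificial neural network that learns a simple pattern purely from examples, by repeatedly adjusting the strengths of the connections between its neurons. It is an illustrative toy, not code from Clune's lab or Google's Deep Dream Generator.

```python
# A tiny network of "virtual neurons," sketched with plain NumPy.
# It learns the XOR pattern from examples -- a toy stand-in for the kind of
# learning-from-data described above, not anyone's production system.
import numpy as np

rng = np.random.default_rng(0)

# Training examples: inputs and the target output (XOR).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Two layers of neurons: 2 inputs -> 4 hidden -> 1 output.
W1 = rng.normal(size=(2, 4))
b1 = np.zeros(4)
W2 = rng.normal(size=(4, 1))
b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 2.0  # learning rate
for step in range(10000):
    # Forward pass: each layer weighs its inputs and "fires" through a squashing function.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # Backward pass: measure the error and nudge every connection to reduce it.
    err = out - y                        # how wrong the network is
    d_out = err * out * (1 - out)        # gradient at the output neurons
    d_h = (d_out @ W2.T) * h * (1 - h)   # gradient at the hidden neurons

    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0)

print(np.round(out, 2))  # after training, close to [[0], [1], [1], [0]]
```

Scaled up from a handful of neurons and a toy pattern to millions of neurons and photographs, this same adjust-the-connections loop is what lets deep neural networks learn to recognize images and, run in reverse, generate dreamlike pictures of their own.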

"The idea is, if we get enough of these virtual neurons and we wire them up in the right way, we'll be able to produce true artificial intelligence just like it exists in humans," Clune says. At that point, the differences between brain and computer, thought and computation, mind and machine, will all be semantics.

"Why should a machine forget it is a machine any more than a person forgets they are human?"
David Atkinson, Florida Institute for Human & Machine Cognition

In a scene from a recent episode of "Westworld," an android host named Maeve (played by Thandie Newton) interrogates a human lab tech, Felix (Leonardo Nam), as he tries to explain that everything she says and does has been programmed by the "people upstairs."

"We are the same these days, for the most part," says Felix to Maeve. "One big difference, though. The processing power in here [touching her head] is way beyond what we [humans] have. It's got one drawback, though."

"What's that?" asks Maeve.

"You're under our control."

Incredibly, this is one part of "Westworld" that AI researcher Clune finds fantastical. Not that we could build an android as convincingly human as Maeve, but that we could exercise control over a machine whose intelligence equals or exceeds our own.

"Current machine learning and AI research is based on the idea that we do not know how to program real intelligence," Clune says. "We create learning algorithms that allow these entities to learn on their own. But then they go off and read their own books and watch their own videos. We know how to create learning algorithms that allow AI to learn, but we don't have fine-tuned control over what it learns and how it thinks and what it pays attention to and what it doesn't."

And what about the idea that a machine could forget that it's a machine?

"Why should a machine forget it is a machine any more than a person forgets they are human?" says Atkinson. "They will not be born, grow up in a family, have grade school friends, and so on. No human experiences. They will have machine experiences. They will be very different from us that way, but I expect we will get along just fine."

So while it's increasingly probable that you will live to see the rise of autonomous intelligent robots, it's highly unlikely that you are one of them. Phew. 
