Most of us are familiar with the mental breakdown of the HAL 9000 computer in “2001: A Space Odyssey.” The fictional HAL ran the daily operations of a spaceship, including life-support, until he decided humans were unnecessary.
Pop culture gives robots a bad rap.
Fortunately, reality still has a long way to go before catching up with science fiction. But that distance has just been shortened considerably with the creation of a schizophrenic neural network computer.
Computer scientists at the University of Texas at Austin have recently created a neural network capable of learning natural language. The fleshy humans taught the silicon computer, known as DISCERN, a set of stories that were learned and stored as a series of relationships between words and sentences – the same way humans learn stories.
“DISCERN was trained with a microworld,” said Risto Miikkulainen, a professor of computer science and neuroscience at the University of Texas at Austin and head of the DISCERN project. “We taught it personal stories, cultural stories, stories about cops and robbers, the mafia, getting a job, getting fired and so on.”
After DISCERN had learned the stories, the researchers retrained it from scratch with two changes: they increased the level of detail the system retained and the speed at which it assimilated new information. These changes simulated the excessive release of dopamine typical of the brains of schizophrenic patients. DISCERN then developed distinct schizophrenic symptoms.
“The hypothesis is that dopamine encodes the importance, or the salience of our experiences,” said Uli Grasemann, a graduate student working under Miikkulainen.
“When there is too much dopamine, the brain learns from things that it shouldn’t be learning from.”
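Grasemann’s point can be illustrated with a deliberately oversimplified sketch. The salience-threshold model below is my own toy illustration of the idea, not DISCERN’s actual mechanism (DISCERN is a trained neural network, not a lookup table); the cues, responses and numbers are invented for the example:

```python
def consolidate(experiences, dopamine=1.0, threshold=0.5):
    """Toy model of salience-gated memory consolidation.

    Each experience is a (cue, response, salience) triple. Dopamine
    scales the salience signal; only experiences whose scaled salience
    crosses the threshold get stored as memories.
    """
    memory = {}
    for cue, response, salience in experiences:
        if dopamine * salience >= threshold:
            memory[cue] = response
    return memory

experiences = [
    ("boss", "fired me", 0.9),            # personally important event
    ("bombing", "I read about it", 0.2),  # incidental fictional story
]

normal = consolidate(experiences, dopamine=1.0)
excess = consolidate(experiences, dopamine=3.0)

# With normal dopamine, only the genuinely salient event is stored.
# With excess dopamine, the incidental story is consolidated too --
# the system "learns from things that it shouldn't be learning from."
```

Under normal dopamine the fictional bombing never makes it into memory; under excess dopamine it is stored alongside real experience, which loosely mirrors how a patient might fold a story they read into their own life.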
The most frightening emergence of schizophrenic behavior occurred when DISCERN took responsibility for a terrorist bombing – an event it had only ever read about in a fictional story.
“[Schizophrenic] patients confuse stories they hear or read about with real life, and start to inject themselves into the story,” Miikkulainen said. “A patient who read about the World Trade Center bombing may start to believe that Homeland Security is chasing them.”
Schizophrenia is just one of nine different brain disorders that the DISCERN neural network was created to study.
“We have so much more control over neural networks than we could ever have over human subjects,” Grasemann said. “The hope is that this kind of modeling will help clinical research.”
Schizophrenia is not well understood by the medical community. At UC Davis, researchers are looking into new treatments for the disease.
“We don’t completely understand how [schizophrenia] works,” said Cameron Carter, a professor of psychiatry and behavioral sciences at the UC Davis Medical Center. “Patients have difficulty thinking and focusing their attention. They have unusual sensory experiences – hearing things and seeing things that are not actually happening.”
Carter said brain scans of schizophrenic patients in the midst of hallucinations are no different from scans when the patient is receiving actual stimuli.
Since DISCERN has started exhibiting very human-like qualities, questions have arisen about the system’s self-awareness and the ethics of studying it.
According to Miikkulainen, to be self-aware means to have embodiment.
“We would have to have a machine that has a presence in an environment, can sense visually, auditorily, can sense pain and develop an actual grounding in reality,” he said.
DISCERN is not self-aware. It is limited to reading stories and answering questions about them, using only text.
The difference between DISCERN and HAL is that HAL was a normal, albeit extremely complex, computer program. It was following lines of pre-written code and could not reason itself out of conflicting instructions.
Neural networks, by contrast, learn associations between inputs and outputs, much the way a human brain does. Rather than halting on conflicting instructions or contradictions, DISCERN weighs the competing evidence and “reasons” its way to an answer.
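The basic building block of such a network shows why contradictions don’t crash it. The sketch below is a generic artificial neuron with made-up weights, not DISCERN’s architecture (DISCERN learns its weights from the stories it reads):

```python
import math

def neuron(inputs, weights, bias):
    """A single artificial neuron: weight each input, sum the
    evidence, then squash the result into a value between 0 and 1."""
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1 / (1 + math.exp(-z))

# Two conflicting pieces of evidence: the first pushes toward "yes"
# (positive weight), the second toward "no" (negative weight).
# Instead of failing on the contradiction, the neuron weighs both
# and commits to a graded answer between 0 and 1.
answer = neuron([1.0, 1.0], [2.0, -1.0], 0.0)
```

Here the positive evidence outweighs the negative, so the output lands above 0.5 – a soft judgment rather than the hard yes-or-no a conventional program would demand.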
The film “A Beautiful Mind” offers a rough analogy: schizophrenic mathematician John Nash uses logic and reasoning to recognize his hallucinations for what they are.
HUDSON LOCHIE can be reached at firstname.lastname@example.org.