Richards looked at the console. “How’s Martha?”
“Martha’s fine, I think. It’s just George.”
“Which George is it?”
“Saint George,” Gerhard said. “Really a bitch.”
Richards sipped his coffee and sat down at the console. “Mind if I try it?”
“Sure,” Gerhard said.
Richards began flicking buttons. He called up the program for Saint George. Then he called up the program for Martha. Then he pushed the interaction button.
Richards and Gerhard hadn’t devised these programs; they were modified from several existing computer programs developed at other universities. But the basic idea was the same—to create a program that would make the computer act emotionally, like people. It was logical to designate the programs with names like George and Martha. There was a precedent for that: Eliza in Boston, and Aldous in England.
George and Martha were essentially the same program with slight differences. The original George was programmed to be neutral in his response to stimuli. Then Martha was created. Martha was a little bitchy; Martha disliked most things. Finally, another George was formulated, a very loving George, who was referred to as Saint George.
Each program could respond with three emotional states—love, fear, and anger. Each could produce three actions—approach, withdrawal, and attack. All this was, of course, highly abstract. It was carried out in terms of numbers. For example, the original George was neutral to most numbers, but disliked the number 751. He was programmed to dislike it. And by extension he disliked similar numbers—743, 772, and so on. He much preferred numbers such as 404, 133, and 918. If you punched in one of these numbers, George responded with numbers signifying love and approach. If you punched in 707, George withdrew. If you punched in 750, George angrily attacked—according to the numbers he printed out.
The NPS staff had played with the programs in this way for a long time. Then they had incorporated program modifications to allow for “talk” from the computer. The numbers were translated into sentences. This was amusing, and revealing. The actual interaction was referred to as “the Christmas game” because most of it was conducted as giving and receiving gifts—objects that had an assigned or learned emotional value, just as the numbers did.
Normal George interacting with Martha would eventually win her over, and her bitchiness would recede into the background.
But Saint George had a much worse effect on her. His loving acceptance drove her up the wall. That is, if things were working normally. Richards watched as the print-out flashed across the screen.
HELLO I’M SAINT GEORGE
HELLO
WHAT IS YOUR NAME?
MARTHA .
YOU DON’T SAY MUCH, MARTHA .
NO
BUT I LIKE YOU .
WHY I DON’T LIKE YOU .
I UNDERSTAND THAT. HOW CAN I HELP?
“It isn’t printing periods consistently,” Richards said. “When did that start?”
“That’s nothing,” Gerhard said. “Just keep watching.”
HELP?
YES, I WOULD LIKE TO HELP YOU .
HAVE A CAT .
Richards winced. Cats were programmed to be something that both George and Martha disliked. Giving George a cat was a very hostile move.
THANK YOU
HAVE ANOTHER CAT .
THANK YOU YOU ARE VERY KIND .
HAVE ANOTHER CAT .
NO THANK YOU .
DON’T YOU LIKE CATS?
YES BUT I HAVE ENOUGH CATS .
HAVE A GORILLA .
“Really nasty,” Richards said. “What are the other interactions like?”
“Pretty much the same,” Gerhard said.
“As bad as this?”
“Pretty much.”
They were accustomed now to the idea that when the computers interacted, the results were unpredictable. The general form of interaction could be guessed, but the specific results were uncertain. It was like a computer playing checkers—it was probably going to win, but it would win a different way each time, depending on what the opponent did.
I DON’T WANT A GORILLA .
DON’T YOU LIKE GORILLAS?
NO THANK YOU .
YOU ARE TOO POLITE .
I
SO