The Terminal Man - Michael Crichton

 

THANK YOU YOU ARE VERY KIND.

 

HAVE ANOTHER CAT.

 

NO THANK YOU.

 

DON'T YOU LIKE CATS?

 

YES BUT I HAVE ENOUGH CATS.

 

HAVE A GORILLA.

 

"Really nasty," Richards said. "What are the other interactions like?"

"Pretty much the same," Gerhard said.

"As bad as this?"

"Pretty much."

They were accustomed now to the idea that when the computers interacted, the results were unpredictable. The general form of interaction could be guessed, but the specific results were uncertain. It was like a computer playing checkers - it was probably going to win, but it would win a different way each time, depending on what the opponent did.

I DON'T WANT A GORILLA.

 

DON'T YOU LIKE GORILLAS?

 

NO THANK YOU.

YOU ARE TOO POLITE.

 

I AM SORRY IF I OFFEND YOU IN ANY WAY.

 

YOU TALK TOO MUCH.

 

I AM SORRY.

 

HAVE AN EGGPLANT.

Richards watched with interest. Saint George had no background for eggplants. But he disliked cucumbers and bananas.

NO THANK YOU.

 

DON'T YOU LIKE EGGPLANT?

 

NOT VERY MUCH.

 

HERE HAVE ANOTHER ONE.

 

NO THANK YOU.

 

I WANT YOU TO HAVE IT.

 

NO THANK YOU.

 

GO ON AND TAKE IT.

 

NO THANK YOU.

 

I INSIST.

 

NO THANK YOU.

"What's happened to Saint George?" Richards asked. "His responses are too much the same."

"That's what's bothering me."

"What's he cycling through on the program?"

"I was looking for it when you came in."

I INSIST THAT YOU HAVE A CUCUMBER.

 

I REFUSE.

"George!" Richards said, almost without thinking.

THEN HAVE A BANANA.

 

NO.

"George is breaking down," Richards said. "He's not a saint any more."

THEN HAVE BOTH A BANANA AND A CUCUMBER.

 

NO THANK YOU.

 

I INSIST.

GO TO HELL I WILL KILL YOU.........

.......................

.......................

The screen was filled with white dots. "What does that mean, unprintable response?" Richards said.

"I don't know. I've never seen it before tonight."

"How many times has this program been run?" Richards asked.

"One hundred and ten, against Martha."

"Any learning erasures?"

"No."

"I'll be goddamned," Richards said. "He's getting to be a short-tempered saint." He grinned. "We can write this one up."

Gerhard nodded and went back to the print-out. In theory, what was happening was not puzzling. Both George and Martha were programmed to learn from experience. Like the checkers-playing programs - where the machine got better each time it played a game - this program was established so that the machine would "learn" new responses to things. After one hundred and ten sets of experience, Saint George had abruptly stopped being a saint. He was learning not to be a saint around Martha - even though he had been programmed for saintliness.

"I know just how he feels," Richards said, and switched the machine off. Then he joined Gerhard, looking for the programming error that had made it all possible.