Communion Of Dreams


Easy to predict.

In Communion of Dreams, I have “experts” who are A.I. assistants. Here’s how I describe them in that book, when I introduce one of them, the character ‘Seth’:

His expert was one of the best, one of only a few hundred based on the new semifluid CPU technology that surpassed the best thin-film computers made by the Israelis. But it was a quirky technology, just a few years old, subject to problems that conventional computers didn’t have, and still not entirely understood. Even less settled was whether experts based on this technology could finally be considered to be true AI. The superconducting gel that was the basis of the semifluid CPU was more alive than not, and the computer was largely self-determining once the projected energy matrix surrounding the gel was initiated by another computer. Building on the initial subsistence program, the computer would learn how to refine and control the matrix to improve its own ‘thinking’. The thin-film computers had long since passed the Turing test, and these semifluid systems seemed to be almost human. But did that constitute sentience? Jon considered it to be a moot point, of interest only to philosophers and ethicists.

In the world of 2052, when Communion is set, these “experts” are ubiquitous and extremely helpful. Seth is an “S-series”, the latest tech, and all S-series models have names which start with S. I figured that naming convention would be a nice way to track the development of such expert-systems technology, and in the course of the book you see earlier models which have appropriate names.

So when the time came to write St Cybi’s Well, I figured that I would introduce the first such model, named Andi. Here’s the first bit of dialog with Andi:

“Hi, I’m Andi, your assistant application. How can I help you?”

“Andi, check local restaurant reviews for Conwy and find the best ranked Fish & Chips place.”

“You’re not in Conwy. You’re in Holywell. Would you rather that I check restaurants where you are?”

“No, I’m not hungry yet. But I will be when I get to Conwy.”

“Very good. Shall I read off the names?”

“Not now. It can wait until I am closer.”

“Very good. Shall I track your movement and alert you?”

“No.”

“Very good. May I help you with something else?”

“Not right now.” Darnell shut off the app, then the phone, and dropped it back into his pocket. The walk back to his car was uneventful.

Now, I wrote this bit almost eight years ago, long before “Siri” or “Alexa” were announced. But it was predictable that such technology would soon be introduced, and I was amused as all get-out when Amazon decided to name their first assistant “Alexa”.

Anyway, I also figured that since the technology would be new and unsophisticated, Andi would be slightly annoying to use: it would default to repeating scripts, be easy to confuse, et cetera, much like encountering a ’bot on a phone call. And you can judge for yourself, but I think I succeeded in the book — the readers of early chapters thought so, and commented on it.
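For the technically curious, that kind of behavior is roughly what you get from a simple pattern-matching script: no model of the conversation, just canned replies keyed to keywords. Here’s a minimal sketch in Python, purely for illustration; the patterns and replies are invented, and it isn’t meant to reflect how Andi in the book, or any real assistant, is actually implemented:

```python
import re

# A toy, script-driven assistant in the "Andi" mold: it pattern-matches
# a handful of keywords and falls back to a stock phrase when confused.
# All patterns and replies here are invented, for illustration only.
SCRIPTS = [
    (re.compile(r"restaurant|fish|chips", re.I),
     "Very good. Shall I read off the names?"),
    (re.compile(r"track|alert", re.I),
     "Very good. Shall I track your movement and alert you?"),
    (re.compile(r"\bno\b|\bnot now\b", re.I),
     "Very good. May I help you with something else?"),
]
FALLBACK = "I'm sorry, I didn't understand. How can I help you?"

def respond(utterance: str) -> str:
    """Return the first matching canned reply, or the fallback."""
    for pattern, reply in SCRIPTS:
        if pattern.search(utterance):
            return reply
    return FALLBACK

print(respond("Find the best ranked fish and chips place in Conwy."))
print(respond("Not now. It can wait."))
print(respond("What's the weather like?"))  # off-script: falls back
```

Anything off-script drops straight to the same stock fallback, and every recognized request gets the same canned acknowledgement, which is precisely the mild annoyance I was going for.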

So this article in the morning Washington Post made me chuckle:

Alexa, just shut up: We’ve been isolated for months, and now we hate our home assistants

“I’m not a bad person,” Angela Hatem said. “I’m so nice to people.” But Alexa, the voice of Amazon’s home devices, isn’t a person, despite how hard “she” tries to emulate one. And coronavirus/self-quarantine/2020 has Hatem feeling a bit stressed out.

“I say things to Alexa that I wouldn’t say to my worst enemy, if I had one. And I don’t know why. She makes me crazy. … I curse at her. I call her names. I’m very, very mean to her,” said Hatem, who lives in Indianapolis with her 1-year-old son. “There’s really few things I can vent at or vent to, and I’m making Alexa my virtual punching bag.”


Heh. Nailed another prediction.

* * *

It’s the first of the month. That means that both novels and our care-giving memoir are available for free download, as they are on the first of each month. If you haven’t already, please help yourself and tell your friends.

Jim Downey




Daisy, Daisy …

One of the things I’ve been a little surprised by is just how many people have volunteered to me (or in reviews) how much they like the ‘Experts’ in Communion of Dreams, and in particular what a favorite character Seth becomes for them in the course of the novel.

I don’t mean I’m surprised by how much people like the Experts, and particularly Seth. Hell, I intended the Experts to be likeable. I mean that this is something which people find remarkable enough to, well, remark on.

That’s because humans tend to anthropomorphize just about everything. Our pets. Our cars. Our tools. Even nature. It’s one of the basic ways that we make sense of the world, as can be seen in religious and spiritual beliefs. Long before Siri there was HAL, and inasmuch as Communion of Dreams is an homage to 2001: A Space Odyssey, I knew that Seth would resonate as a ‘real person’.*

So this morning I was amused to hear a story on NPR about how giving computers/robots more human characteristics tends to cause humans to develop greater empathy toward them and to engage with them socially. Amused, but not surprised. From the article:

Many people have studied machine-human relations, and at this point it’s clear that without realizing it, we often treat the machines around us like social beings.

Consider the work of Stanford professor Clifford Nass. In 1996, he arranged a series of experiments testing whether people observe the rule of reciprocity with machines.

* * *

What the study demonstrated was that people do in fact obey the rule of reciprocity when it comes to computers. When the first computer was helpful to people, they helped it way more on the boring task than the other computer in the room. They reciprocated.

* * *

“The relationship is profoundly social,” he says. “The human brain is built so that when given the slightest hint that something is even vaguely social, or vaguely human — in this case, it was just answering questions; it didn’t have a face on the screen, it didn’t have a voice — but given the slightest hint of humanness, people will respond with an enormous array of social responses including, in this case, reciprocating and retaliating.”


On the NPR website version of the story there’s also this delightful video showing what happens when a robot with cat/human characteristics begs a research subject to not switch it off:


Interesting. But again, unsurprising. Consider the whole sequence in 2001: A Space Odyssey when HAL is shut down — a powerful and poignant part of the movie. And referenced at the end of the video above.

Lastly, I laughed out loud once the story was over on NPR, and the transitional bit of music started up. Why? Because it was an instrumental work by the artist Vangelis, composed as the Love Theme from the movie Blade Runner.

Hilarious.


Jim Downey

*And for those who have read the book, consider what the role of Chu Ling’s devas is relative to Seth … 😉 We’ll see more of this reference in St. Cybi’s Well.