Communion Of Dreams


This ‘n that.

Several things of interest, some personal, some news, some related to the book . . .

* * *

I am struck with how powerful just random chance is in determining the course of events. Whether you agree with the Administration’s handling of it or not, just consider how the BP oil leak in the Gulf has come to dominate the attention and course of politics. Who could have predicted that of all the things happening in the world, this would happen? It’s like getting in a car crash – it sort of shuts out every other factor in your life.

* * *

A couple of people have sent me a link to the NYT item “Merely Human? That’s so yesterday.” It’s a long piece, and worth reading.

I’ve written about Ray Kurzweil and the Singularity previously. Simply put, I find the idea interesting but unconvincing. Kurzweil and the others involved in this ‘Singularity University’ are smart people, and I like that they are pushing for research and the development of technology which will benefit all, but it strikes me mostly as the technological equivalent of the ‘afterlife’ of most religions – more hope than reality. This quote from the article sums up my thoughts pretty well:

William S. Bainbridge, who has spent the last two decades evaluating grant proposals for the National Science Foundation, also sides with the skeptics.

“We are not seeing exponential results from the exponential gains in computing power,” he says. “I think we are at a time where progress will be increasingly difficult in many fields.

“We should not base ideas of the world on simplistic extrapolations of what has happened in the past,” he adds.

It’s called the Law of Diminishing Returns.

* * *

Which isn’t to say that there cannot be revolutionary breakthroughs which could radically change our lives. I’ve also written about how hydrogen sulfide (H2S) seems to be connected to hibernation, and now comes a fairly breathtaking bit of news that is related:

Mystery Explained: How Frozen Humans Are Brought Back

Yeast and worms can survive hypothermia if they are first subjected to extreme oxygen deprivation, a new study finds.

The results could explain a long-held mystery as to how humans can be brought back to life after “freezing to death,” the scientists say.

The study uncovered a previously unknown ability of organisms to survive lethal cold by temporarily slowing the biological processes that maintain life.

But the really interesting bit was this:

Documented cases of humans successfully revived after spending hours or days without a pulse in extremely cold conditions first inspired Roth to study the relationship between human hypothermia and his own research in forced hibernation.

In the winter of 2001, the body temperature of Canadian toddler Erica Norby plunged to 61 degrees Fahrenheit (16 degrees Celsius) as she lay for hours in below-freezing weather after wandering outside wearing only a diaper. Apparently dead, she recovered completely after being re-warmed and resuscitated.

The same curious fate befell Japanese mountain climber Mitsutaka Uchikoshi in 2006, who was discovered with a core body temperature of 71 degrees F (22 degrees C) 23 days after falling asleep on a snowy mountain.

23 DAYS? Holy shit, I hadn’t been aware of that.

* * *

And lastly, you probably heard about this:

KABUL, Afghanistan – U.S. geologists have discovered vast mineral wealth in Afghanistan, possibly amounting to $1 trillion, President Hamid Karzai’s spokesman said Monday.

Waheed Omar told reporters the findings were made by the U.S. Geological Survey under contract to the Afghan government.

* * *

Americans discovered nearly $1 trillion in untapped mineral deposits in Afghanistan, including iron, copper, cobalt, gold and critical industrial metals like lithium, according to the report. The Times quoted a Pentagon memo as saying Afghanistan could become the “Saudi Arabia of lithium,” a key raw material in the manufacture of batteries for laptops and cell phones.

Sounds like a brilliant bit of good news? Think about it again. As someone on MetaFilter commented:

Oh man, I wish I could feel optimistic about this… but a homeless guy with no hope and no prospects, who finds a gold watch, still has no hope and no prospects, but now he’s in for a beating too.

Did you ever read The Prize? Same thing. The ore sources for some of these minerals are very rare, they are critical for many high-tech products, and there is going to be a scramble over who winds up in control of them.

* * *

Random chance rules our lives.

Jim Downey



The memory remains.

Just now, my good lady wife came through to tell me that she’s off to take a bit of a nap. Both of us are getting over a touch of something (which I had mentioned last weekend), and on a deeper level still recovering from the profound exhaustion of having been care-givers for her mom.

Anyway, as she was preparing to head off, one of our cats insisted on going through the door which leads from my office into my bindery. This is where the cat food is.

“She wants through.”

“She wants owwwwt.”

“Any door leads out, as far as a cat is concerned.”

“Well, that door did once actually lead out, decades ago.”

“She remembers.”

“She can’t remember.”

“Nonetheless, the memory lingers.”

* * * * * * *

Via TDG, a fascinating interview with Douglas Richard Hofstadter last year, now translated into English. I’d read his GEB some 25 years ago, and have more or less kept tabs on his work since. The interview was about his most recent book, and touched on a number of subjects of interest to me, including the nature of consciousness, writing, Artificial Intelligence, and the Singularity. It’s long, but well worth the effort.

In discussing consciousness (which Hofstadter calls ‘the soul’ for reasons he explains), and the survival of shards of a given ‘soul’, the topic of writing and music comes up. Discussing how Chopin’s music has enabled shards of the composer’s soul to persist, Hofstadter makes this comment about his own desire to write:

I am not shooting at immortality through my books, no. Nor do I think Chopin was shooting at immortality through his music. That strikes me as a very selfish goal, and I don’t think Chopin was particularly selfish. I would also say that I think that music comes much closer to capturing the essence of a composer’s soul than do a writer’s ideas capture the writer’s soul. Perhaps some very emotional ideas that I express in my books can get across a bit of the essence of my soul to some readers, but I think that Chopin’s music probably does a lot better job (and the same holds, of course, for many composers).

I personally don’t have any thoughts about “shooting for immortality” when I write. I try to write simply in order to get ideas out there that I believe in and find fascinating, because I’d like to let other people be able to share those ideas. But intellectual ideas alone, no matter how fascinating they are, are not enough to transmit a soul across brains. Perhaps, as I say, my autobiographical passages — at least some of them — get tiny shards of my soul across to some people.

Exactly.

* * * * * * *

In April, I wrote this:

I’ve written only briefly about my thoughts on the so-called Singularity – that moment when our technological abilities converge to create a new transcendent artificial intelligence which encompasses humanity in a collective awareness. As envisioned by the Singularity Institute and a number of Science Fiction authors, I think that it is too simple – too utopian. Life is more complex than that. Society develops and copes with change in odd and unpredictable ways, with good and bad and a whole lot in the middle.

Here’s Hofstadter’s take from the interview, in responding to a question about Ray Kurzweil’s notion of achieving effective immortality by ‘uploading’ a personality into machine hardware:

Well, the problem is that a soul by itself would go crazy; it has to live in a vastly complex world, and it has to cohabit that world with many other souls, commingling with them just as we do here on earth. To be sure, Kurzweil sees those things as no problem, either — we’ll have virtual worlds galore, “up there” in Cyberheaven, and of course there will be souls by the barrelful all running on the same hardware. And Kurzweil sees the new software souls as intermingling in all sorts of unanticipated and unimaginable ways.

Well, to me, this “glorious” new world would be the end of humanity as we know it. If such a vision comes to pass, it certainly would spell the end of human life. Once again, I don’t want to be there if such a vision should ever come to pass. But I doubt that it will come to pass for a very long time. How long? I just don’t know. Centuries, at least. But I don’t know. I’m not a futurologist in the least. But Kurzweil is far more “optimistic” (i.e., depressingly pessimistic, from my perspective) about the pace at which all these world-shaking changes will take place.

Interesting.

* * * * * * *

Lastly, the interview is about the central theme of I am a Strange Loop: that consciousness is an emergent phenomenon which stems from vast and subtle physical mechanisms in the brain. This is also the core ‘meaning’ of GEB, though that was often missed by readers and reviewers who got hung up on the ostensible themes, topics, and playfulness of that book. Hofstadter calls this emergent consciousness a self-referential hallucination, and it reflects much of his interest in cognitive science over the years.

[Mild spoilers ahead.]

In Communion of Dreams I played with this idea and a number of related ones, particularly pertaining to the character of Seth. It is also why I decided that I needed to introduce a whole new technology – based on the superfluid tholin-gel found on Titan – as the basis for the AI systems at the heart of the story. Because the gel is not human-manufactured, but rather something a bit mysterious. Likewise, the use of this material requires another sophisticated computer to ‘boot it up’, and then it itself is responsible for sustaining the energy matrix necessary for continued operation. At the culmination of the story, this ‘self-referential hallucination’ frees itself from its initial containment.

Why did I do this?

Partly in homage to Hofstadter (though you will find no mention of him in the book, as far as I recall). Partly because it plays with other ideas I have about the nature of reality. If we (conscious beings) are an emergent phenomenon, arising from physical activity, then it seems to me that physical things can be impressed with our consciousness. This is why I find his comments about shards of a soul existing beyond the life of the body of the person to be so intriguing.

So I spent some 130,000 words exploring that idea in Communion. Not overtly – not often anyway – but that is part of the subtext of what is going on in that book.

* * * * * * *

“Any door leads out, as far as a cat is concerned.”

“Well, that door did once actually lead out, decades ago.”

“She remembers.”

“She can’t remember.”

“Nonetheless, the memory lingers,” I said, “impressed on the door itself. Maybe the cat understands that at a level we don’t.”

Jim Downey

(Related post at UTI.)



Convergence.

When I went away to college in 1976, I took with me the small black & white television I had received for my eighth birthday. Mostly my roommates and I would watch The Muppet Show before going off to dinner. Otherwise, I really didn’t have the time for television – there was studying to do, drugs and alcohol to abuse, sex to have.

Post college I had a massive old console color TV I had inherited. But given that I lived in Montezuma, Iowa, reception was dismal. I found other things to do with my time, mostly SCA-related activities and gaming. I took that console set with me to graduate school in Iowa City, but it never really worked right, and besides I was still busy with SCA stuff and again with schoolwork.

For most of the ’90s I did watch some TV as it was being broadcast, but even then my wife and I preferred to time-shift using a VCR, skipping commercials and seeing the things we were interested in at times when it was convenient for us.

This century, living here and caring for someone with Alzheimer’s, we had to be somewhat more careful about selecting shows that wouldn’t contribute to Martha Sr’s confusion and agitation. Meaning mostly stuff we rented or movies/series we liked well enough to buy on DVD. I would now and then flip on the cable and skip around a bit after we got Martha Sr. to bed, see if there was anything interesting, but for the most part I relied on friends recommending stuff. And besides, I was busy working on Communion of Dreams, or blogging here or there, or writing a newspaper column or whatever.

Nowadays we don’t even have cable. There’s just no reason to pay for it. I’d much rather get my news and information online. So, basically, I have missed almost every television show and special event in the last thirty years. There are vast swaths of cultural reference I only know by inference, television shows that “define” American values I’ve never seen. I don’t miss it.

And you know what? You are becoming like me, more and more all the time.

* * * * * * *

Via Cory Doctorow at BoingBoing, this very interesting piece by Clay Shirky:

Gin, Television, and Social Surplus

* * *

If I had to pick the critical technology for the 20th century, the bit of social lubricant without which the wheels would’ve come off the whole enterprise, I’d say it was the sitcom. Starting with the Second World War a whole series of things happened–rising GDP per capita, rising educational attainment, rising life expectancy and, critically, a rising number of people who were working five-day work weeks. For the first time, society forced onto an enormous number of its citizens the requirement to manage something they had never had to manage before–free time.

And what did we do with that free time? Well, mostly we spent it watching TV.

We did that for decades. We watched I Love Lucy. We watched Gilligan’s Island. We watch Malcolm in the Middle. We watch Desperate Housewives. Desperate Housewives essentially functioned as a kind of cognitive heat sink, dissipating thinking that might otherwise have built up and caused society to overheat.

And it’s only now, as we’re waking up from that collective bender, that we’re starting to see the cognitive surplus as an asset rather than as a crisis. We’re seeing things being designed to take advantage of that surplus, to deploy it in ways more engaging than just having a TV in everybody’s basement.

OK, I try and be very careful about “fair use” of other people’s work, limiting myself to just a couple of paragraphs from a given article or blog post in order to make a point. But while I say that you should go read his whole post, I’m going to use another passage from Shirky here:

Did you ever see that episode of Gilligan’s Island where they almost get off the island and then Gilligan messes up and then they don’t? I saw that one. I saw that one a lot when I was growing up. And every half-hour that I watched that was a half an hour I wasn’t posting at my blog or editing Wikipedia or contributing to a mailing list. Now I had an ironclad excuse for not doing those things, which is none of those things existed then. I was forced into the channel of media the way it was because it was the only option. Now it’s not, and that’s the big surprise. However lousy it is to sit in your basement and pretend to be an elf, I can tell you from personal experience it’s worse to sit in your basement and try to figure if Ginger or Mary Ann is cuter.

And I’m willing to raise that to a general principle. It’s better to do something than to do nothing. Even lolcats, even cute pictures of kittens made even cuter with the addition of cute captions, hold out an invitation to participation. When you see a lolcat, one of the things it says to the viewer is, “If you have some fancy sans-serif fonts on your computer, you can play this game, too.” And that message–I can do that, too–is a big change.

It is a huge change. It is the difference between passively standing/sitting by and watching, and doing the same thing yourself. Whether it is sports, or sex, or politics, or art – doing it yourself means making better use of the limited time you have in this life.

* * * * * * *

And now, the next component of my little puzzle this morning.

Via MeFi, this NYT essay about the explosion of authorship:

You’re an Author? Me Too!

It’s well established that Americans are reading fewer books than they used to. A recent report by the National Endowment for the Arts found that 53 percent of Americans surveyed hadn’t read a book in the previous year — a state of affairs that has prompted much soul-searching by anyone with an affection for (or business interest in) turning pages. But even as more people choose the phantasmagoria of the screen over the contemplative pleasures of the page, there’s a parallel phenomenon sweeping the country: collective graphomania.

In 2007, a whopping 400,000 books were published or distributed in the United States, up from 300,000 in 2006, according to the industry tracker Bowker, which attributed the sharp rise to the number of print-on-demand books and reprints of out-of-print titles. University writing programs are thriving, while writers’ conferences abound, offering aspiring authors a chance to network and “workshop” their work. The blog tracker Technorati estimates that 175,000 new blogs are created worldwide each day (with a lucky few bloggers getting book deals). And the same N.E.A. study found that 7 percent of adults polled, or 15 million people, did creative writing, mostly “for personal fulfillment.”

* * *

Mark McGurl, an associate professor of English at the University of California, Los Angeles, and the author of a forthcoming book on the impact of creative writing programs on postwar American literature, agrees that writing programs have helped expand the literary universe. “American literature has never been deeper and stronger and more various than it is now,” McGurl said in an e-mail message. Still, he added, “one could put that more pessimistically: given the manifold distractions of modern life, we now have more great writers working in the United States than anyone has the time or inclination to read.”

An interesting discussion about this happens in that thread at MetaFilter. John Scalzi, no stranger at all to the world of blogging and online publishing, says this there:

I see nothing but upside in people writing and self-publishing, especially now that companies like Lulu make it easy for them to do so without falling prey to avaricious vanity presses. People who self-publish are in love with the idea of writing, and in love with the idea of books. Both are good for me personally, and good for the idea of a literate society moving forward.

Indeed. And it is pretty clearly a manifestation of what Shirky is talking about above.

I’ve written only briefly about my thoughts on the so-called Singularity – that moment when our technological abilities converge to create a new transcendent artificial intelligence which encompasses humanity in a collective awareness. As envisioned by the Singularity Institute and a number of Science Fiction authors, I think that it is too simple – too utopian. Life is more complex than that. Society develops and copes with change in odd and unpredictable ways, with good and bad and a whole lot in the middle.

For years, people have bemoaned how the developing culture of the internet is changing aspects of life for the worse. Newspapers are struggling. There’s the whole “Cult of the Amateur” nonsense. Just this morning on NPR there was a comment from a listener about how “blogs are just gossip”, in reaction to the new Sunday Soapbox political blog WESun has launched. And there is a certain truth to the complaints and hand-wringing. Maybe we just need to see this in context, though – that the internet is just one aspect of our changing culture, something which is shifting us away from being purely observers of the complex and confusing world around us, to being participants to a greater degree.

Sure, a lot of what passes for participation is fairly pointless, time-consuming crap in its own right. I am reminded of this brilliant xkcd strip. The activity itself is little better than just watching reruns of Gilligan’s Island or Seinfeld or whatever. But the *act* of participating is empowering, and instructive, and just plain good exercise – preparing the participant for being more involved, more in control of their own life and world.

We learn by doing. And if, by doing, we escape the numbing effects of being force-fed pablum from the television set for even a little while, that’s good. What if our Singularity is not a technological one, but a social one? What if, as people become more active, less passive, we actually learn to tap into the collective intelligence of humankind – not as a hive mind, but as something akin to an ideal Jeffersonian Democracy, updated to reflect the reality of modern culture?

I think we could do worse.

Jim Downey



Learning curve.

As I’ve said before, I’m a late-adopter of tech. I’m probably the last person in the US under the age of fifty and with an IQ above room temp who has made the transition over to Firefox.

Oh, it’s not as bad as it sounds – I’ve been running Mozilla for several years, and Netscape in one variety or another before that, all the way back to when I first got online in about ’93. But with the additional options available in Firefox 2, it made sense to make the jump. So, with my good lady wife’s help (she’s the resident geek, not me) I switched yesterday, and then spent much of the rest of the day enjoying the much improved surfing experience, tweaking the set-up, learning the little quirks of the new software.

And also teaching it my own preferences and habits. This was the bit that I found amusing – that in one sense, I’m teaching Seth’s great-whatever-grandpappy his ABCs. Oh, we’re about 30 iterations of Moore’s Law away from the S-Series A.I. I have in Communion of Dreams, and a couple of computer ‘generations’ (if you consider that we’re currently in the fourth generation, quantum computing would be the fifth, with my tholin-gel tech following that). But it really does feel like something akin to a baby expert system I’m dealing with here, as we learn from one another.
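For scale, that “30 iterations of Moore’s Law” remark works out to a staggering multiplier. A back-of-the-envelope sketch (assuming the classic 18-month doubling period – my simplification, not a hard prediction about hardware roadmaps):

```python
# Back-of-the-envelope arithmetic for "30 iterations of Moore's Law".
# The 18-month doubling period is the traditional rule of thumb,
# not a claim about actual chip-industry schedules.
DOUBLING_MONTHS = 18
ITERATIONS = 30

compute_multiplier = 2 ** ITERATIONS          # each iteration doubles capacity
years = ITERATIONS * DOUBLING_MONTHS / 12     # naive calendar estimate

print(f"multiplier: {compute_multiplier:,}x")   # multiplier: 1,073,741,824x
print(f"timeline: about {years:.0f} years")     # timeline: about 45 years
```

In other words, roughly a billion-fold increase in raw capacity over about four and a half decades – if the rule of thumb were to hold that long, which is a big “if.”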

I still don’t expect that we’ll experience a true Singularity such as Kurzweil and others have predicted, and the novel is in large part an exploration of why that is. But it is certainly the case that we’re moving towards a major threshold of technological change at an ever-increasing rate. Even late-adopters like me.

Jim Downey



“It might be life, Jim…”

“Grrrr.”

“Easy, Alwyn.”

“Grrrrr! GRR!” His growls grew from a distant throaty rumble into a near bark, as we came around the corner across from the lawn with the sprinkler. Yeah, my dog was growling at a lawn sprinkler. This is not normal behaviour for him.

But in fairness, it was an odd lawn sprinkler. A big plastic dog lawn sprinkler. White, with black spots. Looked vaguely like a St. Bernard in size and shape, but a Dalmatian in coloration. The hose attached to the tail, which fanned water all over while doing this odd jitterbug wag. Looked like some overgrown kid’s toy. Which it might well be. Since I don’t have kids, I don’t keep track of these things.

Anyway, it was clear that my dog thought that it was some kind of bizarro-dog with a serious bladder problem. Perhaps an Alien Zombie Dog or something. So, he did the natural thing: he growled.

* * * * * * * * * * * * * * * * * * *

As I’ve noted before, I’m a big fan of the original series Star Trek and of Gene Roddenberry. But one of the things which has always bothered me about that series and most other SF television or movies is the fact that so often the Aliens are depicted as some variation of humanoid, albeit with a little makeup and prosthetics as the budget would allow. Though, in fairness to Roddenberry (and others in different series now and then), sometimes there was an attempt made to depict alien life as being just completely odd, unlike anything we’ve known or seen. This notion that extraterrestrial life might be difficult to even identify is a staple of good Science Fiction, of course, and one of the topics which I explore at some length in Communion of Dreams (and part of the reason why we never meet the aliens responsible for the creation of the artifact). It gets back to “Haldane’s Law”:

Now my own suspicion is that the Universe is not only queerer than we suppose, but queerer than we CAN suppose.

(Which is decidedly similar to Sir Arthur Eddington’s attributed comment: “Not only is the universe stranger than we imagine, it is stranger than we can imagine.” But since I am talking more about life here than astrophysics, I thought I’d go with the evolutionary biologist…)

But now actual science has perhaps caught up with Science Fiction. From the New Journal of Physics comes a paper discussing what seems to be the discovery of inorganic life. The abstract:

Abstract. Complex plasmas may naturally self-organize themselves into stable interacting helical structures that exhibit features normally attributed to organic living matter. The self-organization is based on non-trivial physical mechanisms of plasma interactions involving over-screening of plasma polarization. As a result, each helical string composed of solid microparticles is topologically and dynamically controlled by plasma fluxes leading to particle charging and over-screening, the latter providing attraction even among helical strings of the same charge sign. These interacting complex structures exhibit thermodynamic and evolutionary features thought to be peculiar only to living matter such as bifurcations that serve as `memory marks’, self-duplication, metabolic rates in a thermodynamically open system, and non-Hamiltonian dynamics. We examine the salient features of this new complex `state of soft matter’ in light of the autonomy, evolution, progenity and autopoiesis principles used to define life. It is concluded that complex self-organized plasma structures exhibit all the necessary properties to qualify them as candidates for inorganic living matter that may exist in space provided certain conditions allow them to evolve naturally.

That’s a bit dense, so let’s go to the critical bit from the Press Release:

‘It might be life, Jim…’, physicists discover inorganic dust with lifelike qualities.

Until now, physicists assumed that there could be little organisation in such a cloud of particles. However, Tsytovich and his colleagues demonstrated, using a computer model of molecular dynamics, that particles in a plasma can undergo self-organization as electronic charges become separated and the plasma becomes polarized. This effect results in microscopic strands of solid particles that twist into corkscrew shapes, or helical structures. These helical strands are themselves electronically charged and are attracted to each other.

Quite bizarrely, not only do these helical strands interact in a counterintuitive way in which like can attract like, but they also undergo changes that are normally associated with biological molecules, such as DNA and proteins, say the researchers. They can, for instance, divide, or bifurcate, to form two copies of the original structure. These new structures can also interact to induce changes in their neighbours and they can even evolve into yet more structures as less stable ones break down, leaving behind only the fittest structures in the plasma.

So, could helical clusters formed from interstellar dust be somehow alive? “These complex, self-organized plasma structures exhibit all the necessary properties to qualify them as candidates for inorganic living matter,” says Tsytovich, “they are autonomous, they reproduce and they evolve”.

Obviously, there’s more to it, and it is worth reading at least the entire press release, or the full paper if you have a chance.

* * * * * * * * * * * * * * * * * * *

There’s another possibility, of course. This one can best be summed up as being that life is “a dream within a dream”. The latest popular version of this is “The Matrix”, wherein life is an artificial reality construct, designed to keep the human ‘power cells’ docile. But this too is an idea extensively exploited in Science Fiction, with many different variations on the theme. Of late, this idea has been more and more tied to the concept of a ‘Singularity’, with speculation being that we are just some version of post-human research/recreation as a computer construct. And in a piece published yesterday in the NYT titled “Our Lives, Controlled From Some Guy’s Couch” this gets the mainstream religion treatment:

Until I talked to Nick Bostrom, a philosopher at Oxford University, it never occurred to me that our universe might be somebody else’s hobby. I hadn’t imagined that the omniscient, omnipotent creator of the heavens and earth could be an advanced version of a guy who spends his weekends building model railroads or overseeing video-game worlds like the Sims.

But now it seems quite possible. In fact, if you accept a pretty reasonable assumption of Dr. Bostrom’s, it is almost a mathematical certainty that we are living in someone else’s computer simulation.

. . .

David J. Chalmers, a philosopher at the Australian National University, says Dr. Bostrom’s simulation hypothesis isn’t a cause for skepticism, but simply a different metaphysical explanation of our world. Whatever you’re touching now — a sheet of paper, a keyboard, a coffee mug — is real to you even if it’s created on a computer circuit rather than fashioned out of wood, plastic or clay.

You still have the desire to live as long as you can in this virtual world — and in any simulated afterlife that the designer of this world might bestow on you. Maybe that means following traditional moral principles, if you think the posthuman designer shares those morals and would reward you for being a good person.

* * * * * * * * * * * * * * * * * * *

My own prediction is that unless we are extremely fortunate, and extremely open-minded, we’ll stumble badly in our first encounter with any real extra-terrestrial intelligence. Chances are, we’ll completely mistake it for something else, or try to see it through our limited perspective, not unlike how my dog mistook a lawn sprinkler for a weirdly-colored St. Bernard. If we’re lucky, we’ll survive that first contact, and then go on to see the universe with less prejudiced eyes.

If we’re *very* lucky.

Jim Downey

(Some material via BoingBoing.)



What happens after?

A good friend of mine, who is a big science fiction fan, read an early version of Communion of Dreams and loved it, providing me some valuable feedback and support.  And he was *really* excited when he heard that I was going to write more in the same ‘universe’ as the book, wanting to know what happens after the events portrayed in Communion.  When I told him that I would be working on a prequel to the book rather than a sequel, he was disappointed.  “But I wanted to know what happens after the Singularity!” he protested.

[Mild Spoiler Alert]

As you are probably aware, the notion of a technological Singularity occurring, when we create the first true artificial intelligence which is superior to human intelligence, has been a popular one in SF for some time, and actually took on the term ‘Singularity’ following its coinage (I think) by Vernor Vinge.  In many ways, Communion of Dreams is my take on that moment when humankind crosses this threshold, embodied in the character of Seth, the expert system who makes this transition.

The folks over at the Singularity Institute are working towards this goal, and wanting to help us prepare for it.  Cory Doctorow has a brief blog entry up at BoingBoing this morning about his experience speaking at the Singularity Summit hosted by Ray Kurzweil at Stanford last year, along with links to some vids of that event now hosted at the Institute.  It is worth a look.

I am intrigued by the notion of a technological Singularity, but think that it is fundamentally impossible for us to know what happens after such an event has matured.   Oh, sure, there’s good reason to speculate, and it is rich and fertile ground for planting ideas as an author, but…

…but I think that in many ways, leaving Communion as the end-point perhaps makes the most sense.  It is analogous to ending a book with the death of the character from whom everything is presented as a first-person account.  Because just as we do not know what happens after death, we do not know what happens after an event such as a technological Singularity.  For, in some very real ways, the same kind of transcendence will take place.

Jim Downey



