Communion Of Dreams

This ‘n that.

Several things of interest, some personal, some news, some related to the book . . .

* * *

I am struck by how powerful random chance is in determining the course of events. Whether you agree with the Administration’s handling of it or not, just consider how the BP oil leak in the Gulf has come to dominate the attention and course of politics. Who could have predicted that of all the things happening in the world, this would happen? It’s like getting in a car crash – it sort of shuts out every other factor in your life.

* * *

A couple of people have sent me a link to the NYT item “Merely Human? That’s so yesterday.” It’s a long piece, and worth reading.

I’ve written about Ray Kurzweil and the Singularity previously. Simply put, I find the idea interesting but unconvincing. Kurzweil and the others involved in this ‘Singularity University’ are smart people, and I like that they are pushing for research and the development of technology which will benefit all, but it strikes me mostly as the technological equivalent of the ‘afterlife’ of most religions – more hope than reality. This quote from the article sums up my thoughts pretty well:

William S. Bainbridge, who has spent the last two decades evaluating grant proposals for the National Science Foundation, also sides with the skeptics.

“We are not seeing exponential results from the exponential gains in computing power,” he says. “I think we are at a time where progress will be increasingly difficult in many fields.

“We should not base ideas of the world on simplistic extrapolations of what has happened in the past,” he adds.

It’s called the Law of Diminishing Returns.

* * *

Which isn’t to say that there cannot be revolutionary breakthroughs which could radically change our lives. I’ve also written about how hydrogen sulfide (H2S) seems to be connected to hibernation, and now comes a fairly breathtaking bit of news that is related:

Mystery Explained: How Frozen Humans Are Brought Back

Yeast and worms can survive hypothermia if they are first subjected to extreme oxygen deprivation, a new study finds.

The results could explain a long-held mystery as to how humans can be brought back to life after “freezing to death,” the scientists say.

The study uncovered a previously unknown ability of organisms to survive lethal cold by temporarily slowing the biological processes that maintain life.

But the really interesting bit was this:

Documented cases of humans successfully revived after spending hours or days without a pulse in extremely cold conditions first inspired Roth to study the relationship between human hypothermia and his own research in forced hibernation.

In the winter of 2001, the body temperature of Canadian toddler Erica Norby plunged to 61 degrees Fahrenheit (16 degrees Celsius) as she lay for hours in below-freezing weather after wandering outside wearing only a diaper. Apparently dead, she recovered completely after being re-warmed and resuscitated.

The same curious fate befell Japanese mountain climber Mitsutaka Uchikoshi in 2006, who was discovered with a core body temperature of 71 degrees F (22 degrees C) after 23 days after falling asleep on a snowy mountain.

23 DAYS? Holy shit, I hadn’t been aware of that.

* * *

And lastly, you probably heard about this:

KABUL, Afghanistan – U.S. geologists have discovered vast mineral wealth in Afghanistan, possibly amounting to $1 trillion, President Hamid Karzai’s spokesman said Monday.

Waheed Omar told reporters the findings were made by the U.S. Geological Survey under contract to the Afghan government.

* * *

Americans discovered nearly $1 trillion in untapped mineral deposits in Afghanistan, including iron, copper, cobalt, gold and critical industrial metals like lithium, according to the report. The Times quoted a Pentagon memo as saying Afghanistan could become the “Saudi Arabia of lithium,” a key raw material in the manufacture of batteries for laptops and cell phones.

Sounds like a brilliant bit of good news, right? Think again. As someone on MetaFilter commented:

Oh man, I wish I could feel optimistic about this… but a homeless guy with no hope and no prospects, who finds a gold watch, still has no hope and no prospects, but now he’s in for a beating too.

Did you ever read The Prize? Same thing. The ore sources for some of these minerals are very rare, they are critical for many high-tech products, and there is going to be a scramble over who winds up in control of them.

* * *

Random chance rules our lives.

Jim Downey


When I went away to college in 1976, I took with me the small black & white television I had received for my eighth birthday. Mostly my roommates and I would watch The Muppet Show before going off to dinner. Otherwise, I really didn’t have the time for television – there was studying to do, drugs and alcohol to abuse, sex to have.

Post college I had a massive old console color TV I had inherited. But given that I lived in Montezuma, Iowa, reception was dismal. I found other things to do with my time, mostly SCA-related activities and gaming. I took that console set with me to graduate school in Iowa City, but it never really worked right, and besides, I was still busy with SCA stuff and again with schoolwork.

For most of the ’90s I did watch some TV as it was being broadcast, but even then my wife and I preferred to time-shift using a VCR, skipping commercials and seeing the things we were interested in at times when it was convenient for us.

This century, living here and caring for someone with Alzheimer’s, we had to be somewhat more careful about selecting shows that wouldn’t contribute to Martha Sr’s confusion and agitation. Meaning mostly stuff we rented, or movies/series we liked well enough to buy on DVD. I would now and then flip on the cable and skip around a bit after we got Martha Sr. to bed, to see if there was anything interesting, but for the most part I relied on friends recommending stuff. And besides, I was busy working on Communion of Dreams, or blogging here or there, or writing a newspaper column or whatever.

Nowadays we don’t even have cable. There’s just no reason to pay for it. I’d much rather get my news and information online. So, basically, I have missed most every television show and special event in the last thirty years. There are vast swaths of cultural reference I only know by inference, television shows that “define” American values I’ve never seen. I don’t miss it.

And you know what? You are becoming like me, more and more all the time.

* * * * * * *

Via Cory Doctorow at BoingBoing, this very interesting piece by Clay Shirky:

Gin, Television, and Social Surplus

* * *

If I had to pick the critical technology for the 20th century, the bit of social lubricant without which the wheels would’ve come off the whole enterprise, I’d say it was the sitcom. Starting with the Second World War a whole series of things happened–rising GDP per capita, rising educational attainment, rising life expectancy and, critically, a rising number of people who were working five-day work weeks. For the first time, society forced onto an enormous number of its citizens the requirement to manage something they had never had to manage before–free time.

And what did we do with that free time? Well, mostly we spent it watching TV.

We did that for decades. We watched I Love Lucy. We watched Gilligan’s Island. We watch Malcolm in the Middle. We watch Desperate Housewives. Desperate Housewives essentially functioned as a kind of cognitive heat sink, dissipating thinking that might otherwise have built up and caused society to overheat.

And it’s only now, as we’re waking up from that collective bender, that we’re starting to see the cognitive surplus as an asset rather than as a crisis. We’re seeing things being designed to take advantage of that surplus, to deploy it in ways more engaging than just having a TV in everybody’s basement.

OK, I try to be very careful about “fair use” of other people’s work, limiting myself to just a couple of paragraphs from a given article or blog post in order to make a point. But while I say that you should go read his whole post, I’m going to use another passage from Shirky here:

Did you ever see that episode of Gilligan’s Island where they almost get off the island and then Gilligan messes up and then they don’t? I saw that one. I saw that one a lot when I was growing up. And every half-hour that I watched that was a half an hour I wasn’t posting at my blog or editing Wikipedia or contributing to a mailing list. Now I had an ironclad excuse for not doing those things, which is none of those things existed then. I was forced into the channel of media the way it was because it was the only option. Now it’s not, and that’s the big surprise. However lousy it is to sit in your basement and pretend to be an elf, I can tell you from personal experience it’s worse to sit in your basement and try to figure if Ginger or Mary Ann is cuter.

And I’m willing to raise that to a general principle. It’s better to do something than to do nothing. Even lolcats, even cute pictures of kittens made even cuter with the addition of cute captions, hold out an invitation to participation. When you see a lolcat, one of the things it says to the viewer is, “If you have some fancy sans-serif fonts on your computer, you can play this game, too.” And that message–I can do that, too–is a big change.

It is a huge change. It is the difference between passively standing/sitting by and watching, and doing the same thing yourself. Whether it is sports, or sex, or politics, or art – doing it yourself means making better use of the limited time you have in this life.

* * * * * * *

And now, the next component of my little puzzle this morning.

Via MeFi, this NYT essay about the explosion of authorship:

You’re an Author? Me Too!

It’s well established that Americans are reading fewer books than they used to. A recent report by the National Endowment for the Arts found that 53 percent of Americans surveyed hadn’t read a book in the previous year — a state of affairs that has prompted much soul-searching by anyone with an affection for (or business interest in) turning pages. But even as more people choose the phantasmagoria of the screen over the contemplative pleasures of the page, there’s a parallel phenomenon sweeping the country: collective graphomania.

In 2007, a whopping 400,000 books were published or distributed in the United States, up from 300,000 in 2006, according to the industry tracker Bowker, which attributed the sharp rise to the number of print-on-demand books and reprints of out-of-print titles. University writing programs are thriving, while writers’ conferences abound, offering aspiring authors a chance to network and “workshop” their work. The blog tracker Technorati estimates that 175,000 new blogs are created worldwide each day (with a lucky few bloggers getting book deals). And the same N.E.A. study found that 7 percent of adults polled, or 15 million people, did creative writing, mostly “for personal fulfillment.”

* * *

Mark McGurl, an associate professor of English at the University of California, Los Angeles, and the author of a forthcoming book on the impact of creative writing programs on postwar American literature, agrees that writing programs have helped expand the literary universe. “American literature has never been deeper and stronger and more various than it is now,” McGurl said in an e-mail message. Still, he added, “one could put that more pessimistically: given the manifold distractions of modern life, we now have more great writers working in the United States than anyone has the time or inclination to read.”

An interesting discussion about this happens in that thread at MetaFilter. John Scalzi, no stranger at all to the world of blogging and online publishing, says this there:

I see nothing but upside in people writing and self-publishing, especially now that companies like Lulu make it easy for them to do so without falling prey to avaricious vanity presses. People who self-publish are in love with the idea of writing, and in love with the idea of books. Both are good for me personally, and good for the idea of a literate society moving forward.

Indeed. And it is pretty clearly a manifestation of what Shirky is talking about above.

I’ve written only briefly about my thoughts on the so-called Singularity – that moment when our technological abilities converge to create a new transcendent artificial intelligence which encompasses humanity in a collective awareness. As envisioned by the Singularity Institute and a number of Science Fiction authors, I think that it is too simple – too utopian. Life is more complex than that. Society develops and copes with change in odd and unpredictable ways, with good and bad and a whole lot in the middle.

For years, people have bemoaned how the developing culture of the internet is changing aspects of life for the worse. Newspapers are struggling. There’s the whole “Cult of the Amateur” nonsense. Just this morning on NPR there was a comment from a listener about how “blogs are just gossip”, in reaction to the new Sunday Soapbox political blog WESun has launched. And there is a certain truth to the complaints and hand-wringing. Maybe we just need to see this in context, though – that the internet is just one aspect of our changing culture, something which is shifting us away from being purely observers of the complex and confusing world around us, to being participants to a greater degree.

Sure, a lot of what passes for participation is fairly pointless, time-consuming crap in its own right. I am reminded of this brilliant xkcd strip. The activity itself is little better than just watching reruns of Gilligan’s Island or Seinfeld or whatever. But the *act* of participating is empowering, and instructive, and just plain good exercise – preparing the participant for being more involved, more in control of their own life and world.

We learn by doing. And if, by doing, we escape the numbing effects of being force-fed pablum from the television set for even a little while, that’s good. What if our Singularity is not a technological one, but a social one? What if, as people become more active, less passive, we actually learn to tap into the collective intelligence of humankind – not as a hive mind, but as something akin to an ideal Jeffersonian Democracy, updated to reflect the reality of modern culture?

I think we could do worse.

Jim Downey

Learning curve.

As I’ve said before, I’m a late-adopter of tech. I’m probably the last person in the US under the age of fifty and with an IQ above room temp who has made the transition over to Firefox.

Oh, it’s not as bad as it sounds – I’ve been running Mozilla for several years, and Netscape in one variety or another before that, all the way back to when I first got online in about ’93. But with the additional options available in Firefox 2, it made sense to make the jump. So, with my good lady wife’s help (she’s the resident geek, not me) I switched yesterday, and then spent much of the rest of the day enjoying the much improved surfing experience, tweaking the set-up, learning the little quirks of the new software.

And also teaching it my own preferences and habits. This was the bit that I found amusing – that in one sense, I’m teaching Seth’s great-whatever-grandpappy his ABCs. Oh, we’re about 30 iterations of Moore’s Law away from the S-Series A.I. I have in Communion of Dreams, and a couple of computer ‘generations’ away as well (if you consider that we’re currently in the fourth generation, quantum computing will be the fifth, with my Tholin gel tech following that). But it really does feel like something akin to a baby expert system I’m dealing with here, as we learn from one another.

I still don’t expect that we’ll experience a true Singularity such as Kurzweil and others have predicted, and the novel is in large part an exploration of why that is. But it is certainly the case that we’re moving towards a major threshold of technological change at an ever-increasing rate. Even late-adopters like me.

Jim Downey

What happens after?

A good friend of mine, who is a big science fiction fan, read an early version of Communion of Dreams and loved it, providing me some valuable feedback and support.  And he was *really* excited when he heard that I was going to write more in the same ‘universe’ as the book, wanting to know what happens after the events portrayed in Communion.  When I told him that I would be working on a prequel to the book rather than a sequel, he was disappointed.  “But I wanted to know what happens after the Singularity!” he protested.

[Mild Spoiler Alert]

As you are probably aware, the notion of a technological Singularity occurring, when we create the first true artificial intelligence which is superior to human intelligence, has been a popular one in SF for some time, and actually took on the name ‘Singularity’ following its coinage (I think) by Vernor Vinge.  In many ways, Communion of Dreams is my take on that moment when humankind crosses this threshold, embodied in the character of Seth, the expert system who makes this transition.

The folks over at the Singularity Institute are working towards this goal, and wanting to help us prepare for it.  Cory Doctorow has a brief blog entry up at BoingBoing this morning about his experience speaking at the Singularity Summit hosted by Ray Kurzweil at Stanford last year, along with links to some vids of that event now hosted at the Institute.  It is worth a look.

I am intrigued by the notion of a technological Singularity, but think that it is fundamentally impossible for us to know what happens after such an event has matured.   Oh, sure, there’s good reason to speculate, and it is rich and fertile ground for planting ideas as an author, but…

…but I think that in many ways, leaving Communion as the end-point perhaps makes the most sense.  It is analogous to ending a book with the death of the character from whom everything is presented as a first-person account.  Because just as we do not know what happens after death, we do not know what happens after an event such as a technological Singularity.  For, in some very real ways, the same kind of transcendence will take place.

Jim Downey