Communion Of Dreams


Sequel, sequel, who has a sequel?

I can’t say that I’ve gotten terribly excited about io9, the relatively new site that describes itself as “Strung Out on Science Fiction”.  Simply put, so much of the content there is directed at current TV shows I’m not watching that it doesn’t seem worth plowing through it all.

But every once in a while I’ll come across something posted elsewhere that links to io9, and will go take a look.  Like this piece, via MeFi:

7 Reasons Why Scifi Book Series Outstay Their Welcomes

Why do so many amazing novels sprawl into so-so trilogies? Let alone blah tetralogies, or dull ten-book series? Blame “Herbert’s Syndrome,” in which a great writer gets tempted to keep writing about a popular universe, like Frank Herbert’s Dune, long after its expiration date. (The Fantasy Review coined the term “Herbert’s Syndrome” back in 1984, so Brian Herbert didn’t enter into it.) Here’s a handy guide to the symptoms and causes of Herbert’s unfortunate ailment.

It’s interesting to see what the author has to say on the subject.  But honestly, the discussion in the MeFi thread is more complete and insightful (which isn’t too surprising – a quick blog post is meant to provoke thought, not complete it).

I mention it because I often have people ask me whether I will be writing a ‘sequel’ to Communion of Dreams.  I think people naturally want to know ‘what happens next?’  But I like leaving the ambiguity where it is, to make people wonder.

Which isn’t to say that I don’t plan on writing other books in the same ‘universe’ as Communion of Dreams.  I have mentioned previously that I have started St. Cybi’s Well, which is set at the time of the first outbreak of the Fireflu (about 2012 in that alternate timeline).  As I recover from the last couple of years of being a caregiver, I will once again be returning to writing that book.  I also have an idea for a book set in the 2030s, in the Israeli colony on the Moon, which would feature an artist as the main character, but that is not very well developed yet.  It is possible that I could come up with other books which would fit within my alternate timeline, but I have no plans to just crank out a dozen books in such a series.  I respect those authors who have a single vision, a single story, which naturally plays out over the course of multiple books – but I have little respect or interest in those who just wish to cash in on a popular work.

Anyway, thought you might enjoy that discussion.

Jim Downey



“No, I’m from Iowa. I only work in outer space.”
May 12, 2008, 12:27 pm
Filed under: Gene Roddenberry, MetaFilter, SCA, Science Fiction, Society, Space, Star Trek

No, I haven’t been sleeping the last four days.

The weekend was entirely occupied in playing painter in my wife’s new office suite. When I wasn’t painting, I was either too tired or too sore to do much of anything else. I’m not used to that much physical activity. Pathetic.

And I can’t stop to play much here now, either – have conservation work that needs tending, having spent this morning cleaning up after a sick dog. Too much fun.

But had to share this item from the Daily Mail, via MeFi – excerpts from William Shatner’s latest memoir. Here’s one passage:

One morning, shooting a Star Trek movie in the desert, I had a very early call.

So I told the wardrobe girl: “Give me my uniform and I’ll put it on at the house so I don’t have to come in any earlier for wardrobe. I’ll just wear it to the set.”

So at 4am I was racing across the desert to our location. I was way over the speed limit, figuring there wasn’t another car on the road in the entire state.

It turned out there was one other car – and it had lights and a siren.

I got out of my car, dressed in my uniform. The police officer looked me up and down, frowned and asked: “So where are you going so fast at this time in the morning?”

I told him the truth: “To my spaceship.”

He sighed. “OK, go ahead,” he said, before adding the Vulcan blessing: “Live long and prosper.”

I haven’t read any of his other books (which surprises some friends, given how fond I am of the original series) and probably won’t read this one. But I thought it was a good anecdote.

And sometime when I have a little more time and remember to do so, I’ll have to write up the surreal story of attending the first “Trek Fest” in Riverside, Iowa over 20 years ago, celebrating the “future birthplace” of James T. Kirk. I did so as part of an SCA Demo (huh? What’s the connection to Star Trek?). Held in the back of a bar . . .

Jim Downey

(Extra points if you can name the source of the quote in the title. Not who – that’s obvious – but from where?)



Bit of a rough night.
May 6, 2008, 11:22 am
Filed under: Book Conservation, MetaFilter, Politics, Sleep, Society, Violence

See this post at UTI for details.  As a consequence, I didn’t sleep a whole lot.  But the most annoying part is past, I think, and I may nap this afternoon.

Anyway, via MeFi, here is an amazing site about the restoration of three ceramic vases destroyed in a museum accident.  It is a bit surprising just how many of the techniques used are analogous to what I use in book restoration (though usually I am not doing that level of work for my clients).  Be sure to click the “interactive” selection.

Jim Downey



The Rule of Death
May 4, 2008, 8:51 am
Filed under: Art, Comics, Humor, MetaFilter

I’m not quite sure what to make of this new comic I stumbled upon via MeFi, but at least the start of the thing is rather interesting. Kind of a Zombie Western thing going, about a fellow who decides that he doesn’t really want to be dead. That this decision comes following his funeral is a bit of a problem for the local townsfolk . . .

Anyway, start with The Decision, and go from there. The Table of Contents lists six episodes so far.

Enjoy. Or not. Your decision.

Jim Downey

(Cross posted to UTI.)



Convergence.

When I went away to college in 1976, I took with me the small black & white television I had received for my eighth birthday. Mostly my roommates and I would watch The Muppet Show before going off to dinner. Otherwise, I really didn’t have the time for television – there was studying to do, drugs and alcohol to abuse, sex to have.

Post-college I had a massive old console color TV I had inherited. But given that I lived in Montezuma, Iowa, reception was dismal. I found other things to do with my time, mostly SCA-related activities and gaming. I took that console set with me to graduate school in Iowa City, but it never really worked right, and besides I was still busy with SCA stuff and again with schoolwork.

For most of the ’90s I did watch some TV as it was being broadcast, but even then my wife and I preferred to time-shift using a VCR, skipping commercials and seeing the things we were interested in at times when it was convenient for us.

This century, living here and caring for someone with Alzheimer’s, we had to be somewhat more careful about selecting shows that wouldn’t contribute to Martha Sr.’s confusion and agitation. Meaning mostly stuff we rented, or movies/series we liked well enough to buy on DVD. I would now and then flip on the cable and skip around a bit after we got Martha Sr. to bed, to see if there was anything interesting, but for the most part I relied on friends recommending stuff. And besides, I was busy working on Communion of Dreams, or blogging here or there, or writing a newspaper column or whatever.

Nowadays we don’t even have cable. There’s just no reason to pay for it. I’d much rather get my news and information online. So, basically, I have missed almost every television show and special event of the last thirty years. There are vast swaths of cultural reference I only know by inference, television shows that “define” American values that I’ve never seen. I don’t miss it.

And you know what? You are becoming like me, more and more all the time.

* * * * * * *

Via Cory Doctorow at BoingBoing, this very interesting piece by Clay Shirky:

Gin, Television, and Social Surplus

* * *

If I had to pick the critical technology for the 20th century, the bit of social lubricant without which the wheels would’ve come off the whole enterprise, I’d say it was the sitcom. Starting with the Second World War a whole series of things happened–rising GDP per capita, rising educational attainment, rising life expectancy and, critically, a rising number of people who were working five-day work weeks. For the first time, society forced onto an enormous number of its citizens the requirement to manage something they had never had to manage before–free time.

And what did we do with that free time? Well, mostly we spent it watching TV.

We did that for decades. We watched I Love Lucy. We watched Gilligan’s Island. We watch Malcolm in the Middle. We watch Desperate Housewives. Desperate Housewives essentially functioned as a kind of cognitive heat sink, dissipating thinking that might otherwise have built up and caused society to overheat.

And it’s only now, as we’re waking up from that collective bender, that we’re starting to see the cognitive surplus as an asset rather than as a crisis. We’re seeing things being designed to take advantage of that surplus, to deploy it in ways more engaging than just having a TV in everybody’s basement.

OK, I try to be very careful about “fair use” of other people’s work, limiting myself to just a couple of paragraphs from a given article or blog post in order to make a point. But even as I tell you to go read his whole post, I’m going to use another passage from Shirky here:

Did you ever see that episode of Gilligan’s Island where they almost get off the island and then Gilligan messes up and then they don’t? I saw that one. I saw that one a lot when I was growing up. And every half-hour that I watched that was a half an hour I wasn’t posting at my blog or editing Wikipedia or contributing to a mailing list. Now I had an ironclad excuse for not doing those things, which is none of those things existed then. I was forced into the channel of media the way it was because it was the only option. Now it’s not, and that’s the big surprise. However lousy it is to sit in your basement and pretend to be an elf, I can tell you from personal experience it’s worse to sit in your basement and try to figure if Ginger or Mary Ann is cuter.

And I’m willing to raise that to a general principle. It’s better to do something than to do nothing. Even lolcats, even cute pictures of kittens made even cuter with the addition of cute captions, hold out an invitation to participation. When you see a lolcat, one of the things it says to the viewer is, “If you have some fancy sans-serif fonts on your computer, you can play this game, too.” And that message–I can do that, too–is a big change.

It is a huge change. It is the difference between passively standing/sitting by and watching, and doing the same thing yourself. Whether it is sports, or sex, or politics, or art – doing it yourself means making better use of the limited time you have in this life.

* * * * * * *

And now, the next component of my little puzzle this morning.

Via MeFi, this NYT essay about the explosion of authorship:

You’re an Author? Me Too!

It’s well established that Americans are reading fewer books than they used to. A recent report by the National Endowment for the Arts found that 53 percent of Americans surveyed hadn’t read a book in the previous year — a state of affairs that has prompted much soul-searching by anyone with an affection for (or business interest in) turning pages. But even as more people choose the phantasmagoria of the screen over the contemplative pleasures of the page, there’s a parallel phenomenon sweeping the country: collective graphomania.

In 2007, a whopping 400,000 books were published or distributed in the United States, up from 300,000 in 2006, according to the industry tracker Bowker, which attributed the sharp rise to the number of print-on-demand books and reprints of out-of-print titles. University writing programs are thriving, while writers’ conferences abound, offering aspiring authors a chance to network and “workshop” their work. The blog tracker Technorati estimates that 175,000 new blogs are created worldwide each day (with a lucky few bloggers getting book deals). And the same N.E.A. study found that 7 percent of adults polled, or 15 million people, did creative writing, mostly “for personal fulfillment.”

* * *

Mark McGurl, an associate professor of English at the University of California, Los Angeles, and the author of a forthcoming book on the impact of creative writing programs on postwar American literature, agrees that writing programs have helped expand the literary universe. “American literature has never been deeper and stronger and more various than it is now,” McGurl said in an e-mail message. Still, he added, “one could put that more pessimistically: given the manifold distractions of modern life, we now have more great writers working in the United States than anyone has the time or inclination to read.”

There’s an interesting discussion about this in that thread at MetaFilter. John Scalzi, no stranger at all to the world of blogging and online publishing, says this there:

I see nothing but upside in people writing and self-publishing, especially now that companies like Lulu make it easy for them to do so without falling prey to avaricious vanity presses. People who self-publish are in love with the idea of writing, and in love with the idea of books. Both are good for me personally, and good for the idea of a literate society moving forward.

Indeed. And it is pretty clearly a manifestation of what Shirky is talking about above.

I’ve written only briefly about my thoughts on the so-called Singularity – that moment when our technological abilities converge to create a new transcendent artificial intelligence which encompasses humanity in a collective awareness. As envisioned by the Singularity Institute and a number of Science Fiction authors, it strikes me as too simple – too utopian. Life is more complex than that. Society develops and copes with change in odd and unpredictable ways, with good and bad and a whole lot in the middle.

For years, people have bemoaned how the developing culture of the internet is changing aspects of life for the worse. Newspapers are struggling. There’s the whole “Cult of the Amateur” nonsense. Just this morning on NPR there was a comment from a listener about how “blogs are just gossip”, in reaction to the new Sunday Soapbox political blog WESun has launched. And there is a certain truth to the complaints and hand-wringing. Maybe we just need to see this in context, though – that the internet is just one aspect of our changing culture, something which is shifting us away from being purely observers of the complex and confusing world around us, and toward being participants to a greater degree.

Sure, a lot of what passes for participation is fairly pointless, time-consuming crap in its own right. I am reminded of this brilliant xkcd strip. The activity itself is little better than just watching reruns of Gilligan’s Island or Seinfeld or whatever. But the *act* of participating is empowering, and instructive, and just plain good exercise – preparing the participant for being more involved, more in control of their own life and world.

We learn by doing. And if, by doing, we escape the numbing effects of being force-fed pablum from the television set for even a little while, that’s good. What if our Singularity is not a technological one, but a social one? What if, as people become more active, less passive, we actually learn to tap into the collective intelligence of humankind – not as a hive mind, but as something akin to an ideal Jeffersonian Democracy, updated to reflect the reality of modern culture?

I think we could do worse.

Jim Downey



Eye, Robot.

I like bad science fiction movies. Cheesy special effects, bad dialog, worse acting – it doesn’t matter. Just so long as there is the nub of a decent idea in there somewhere, trying to get out.

And in that spirit, I added I, Robot to my NetFlix queue some time back, knowing full well that it had almost nothing to do with Isaac Asimov’s brilliant stories. I knew it was set in the near-term future, and that it had been a success at the box office, but that was about it. This past weekend, it arrived. I watched it last night.

I think Asimov himself predicted just what would be wrong with this movie:

In the essay “The Boom in Science Fiction” (Isaac Asimov on Science Fiction, pp. 125–128), Asimov himself explained the reason for Hollywood’s overriding need for violence:

[…] Eye-sci-fi has an audience that is fundamentally different from that of science fiction. In order for eye-sci-fi to be profitable it must be seen by tens of millions of people; in order for science fiction to be profitable it need be read by only tens of thousands of people. This means that some ninety percent (perhaps as much as ninety-nine percent) of the people who go to see eye-sci-fi are likely never to have read science fiction. The purveyors of eye-sci-fi cannot assume that their audience knows anything about science, has any experience with the scientific imagination, or even has any interest in science fiction.

But, in that case, why should the purveyors of eye-sci-fi expect anyone to see the pictures? Because they intend to supply something that has no essential connection with science fiction, but that tens of millions of people are willing to pay money to see. What is that? Why, scenes of destruction.

Yup. And that is just about all that the movie I, Robot is – destruction and special effects. Shame, really, since I have enjoyed Will Smith in other bad SF (Independence Day, anyone?), and just love Alan Tudyk from Firefly/Serenity. Even what had to be intentional references to such excellent movies as Blade Runner or The Matrix fell completely flat. It was, in a word, dreadful.

Ah, well. Via MeFi, here’s a little gem to wash the bad taste out of your mouth:

Gene Roddenberry would be proud.

Jim Downey



Remember “Earthrise”?

That was the iconic photo taken during the Apollo 8 mission, widely considered to be one of the most beautiful, and touching, images ever. This video, titled “Cities at Night”, has something of that quality:

It is a series of images taken from the ISS, using an improvised barn-door tracking system to stabilize the astronauts’ digital cameras relative to the speed of the station, allowing for images good to a resolution of about 60 meters. And watching it had an effect on me similar to what seeing “Earthrise” for the first time did (I remember that, back in 1968), even on my poor monitor and via YouTube.
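
To get a feel for why the astronauts needed a tracking rig at all, here’s a rough back-of-the-envelope sketch in Python. The round numbers for ISS altitude and ground speed are my own assumptions, not anything from the video:

```python
# Rough figures, assumed for illustration: ISS altitude ~400 km,
# ground-track speed ~7.3 km/s (orbital speed, less a bit for Earth's rotation).

ALTITUDE_M = 400_000       # ISS altitude above the ground, in metres (assumed)
GROUND_SPEED_M_S = 7_300   # speed of the ground track, in m/s (assumed)

# Angular rate the camera mount must compensate for, looking straight down:
angular_rate_rad_s = GROUND_SPEED_M_S / ALTITUDE_M   # ~0.018 rad/s, about 1 degree/s

# Longest exposure before motion blur alone exceeds ~60 m on the ground:
max_untracked_exposure_s = 60 / GROUND_SPEED_M_S     # ~0.008 s

print(f"tracking rate needed: {angular_rate_rad_s:.4f} rad/s")
print(f"max untracked exposure for 60 m detail: {max_untracked_exposure_s * 1000:.1f} ms")
```

In other words, without tracking you get roughly eight milliseconds of exposure before the blur swamps that 60-meter figure, hopeless for night photography, while even a modest tenth-of-a-second exposure would smear across some 700 meters of ground. Hence the barn-door rig.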

Light pollution is a problem, as I have mentioned previously. But it is hard to look at these images and not be struck by just how beautiful even the evidence of our sprawl and overpopulation can be. And seeing our city lights from 200 miles up is inspirational, a glimpse of how we can indeed someday transcend our problems and limitations. We need not be Earthbound, not now, not for the future.

Jim Downey

(Via MeFi.)



But think of the convenience!

One of the basic premises of Communion of Dreams is that over time we will introduce personal ‘experts’ – advanced Expert Systems or Artificial Intelligences – which will act as a buffer between the individual and a technological world. We will enter into a trade-off: allow our ‘expert’ to function as an old-fashioned butler, knowing all of our secrets but guarding them closely, in order to then interact with the rest of the world. So, your expert would know your preferences in entertainment and books, handle your communications and banking, and maintain some minimal privacy for you by being a “black box” which negotiates with other people and machines on your behalf.

Why do I think that this will happen? Why will it be necessary?

Because increasingly, in the name of ‘convenience’, both government and industry are seeking to become more intrusive in our lives, all the way down to the level of what happens inside our homes. People want the convenience, but are becoming increasingly aware of what the price of the trade-off will be. The latest example:

Comcast Cameras to Start Watching You?

If you have some tinfoil handy, now might be a good time to fashion a hat. At the Digital Living Room conference today, Gerard Kunkel, Comcast’s senior VP of user experience, told me the cable company is experimenting with different camera technologies built into devices so it can know who’s in your living room.

The idea being that if you turn on your cable box, it recognizes you and pulls up shows already in your profile or makes recommendations. If parents are watching TV with their children, for example, parental controls could appear to block certain content from appearing on the screen. Kunkel also said this type of monitoring is the “holy grail” because it could help serve up specifically tailored ads. Yikes.

Here’s another source:

Comcast’s Creepy Experiment Puts Cams Inside DVRs to Watch You

In a scene straight out of 1984, Comcast said it will begin placing actual cameras in DVR units to track data for who is watching the digital television. This statement is so farfetched I almost don’t believe it, but it came out of the mouth of Gerard Kunkel, the senior vice president of user experience for Comcast. At the Digital Living Room conference he said that Comcast is already experimenting embedding cameras into DVR boxes that actually watch the television watchers. Big Brother, anyone?

Comcast is shilling this as a type of customization features. The camera would be capable of recognizing specific individuals and therefore loading a user’s favorite channels and on the other hand block certain content as well. Stop the schtick, Comcast. Nobody, and I mean nobody would ever voluntarily allow you to place a camera in a household, for any purpose. It’s a shame that I can already imagine the headlines when Comcast does this involuntarily.

Now, in the comments at both sites, there is disavowal by Comcast executives that the company is actually going to do this – they’re just “looking into it.” Sure.

More importantly, there are a lot of comments about how this is just yet another step into the world of total surveillance, another incremental loss of privacy. Sure, these comments come from tech-savvy people, who are well aware of how the technology may work – more so than most people. And they are also aware that for many folks, this will be seen as ‘no big deal’, and a welcome convenience.

But the tech-savvy are the ones who will be developing the tools to counter this kind of intrusion. Sooner or later someone will figure out that there is a need to be met, and will create a buffer of privacy between the individual and the corporate-government union. It may not be a huge market to begin with, but it will be the start of the kind of expert systems I predict.
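
For what it’s worth, the shape of that buffer is easy to sketch. Here’s a minimal toy version in Python; every name in it is hypothetical, invented just for this illustration, and a real system would of course need cryptography, authentication, and much more. The point is only the information flow: outsiders get narrow answers, never the profile itself.

```python
# A toy sketch of the "black box" personal expert: it holds your profile
# privately and answers narrow queries, without ever handing over the raw data.
# All names and structure here are invented for illustration.

class PersonalExpert:
    def __init__(self, preferences):
        self._preferences = preferences   # held privately, never returned directly

    def would_accept(self, category, item):
        """Give outsiders a yes/no answer, not the preference list itself."""
        return item in self._preferences.get(category, set())

    def negotiate(self, offers):
        """Accept at most one offer; reveal nothing about why the others failed."""
        for category, item in offers:
            if self.would_accept(category, item):
                return item
        return None

expert = PersonalExpert({"books": {"science fiction"}, "tv": set()})
print(expert.would_accept("books", "science fiction"))            # True
print(expert.negotiate([("tv", "sitcom"), ("books", "poetry")]))  # None
```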

Jim Downey

(Via MeFi. A slightly shortened version of this has been cross-posted to UTI.)



The one thing you know.

There is one thing, absolutely, that you know – but most people don’t really believe it. That you are alive, and that you are going to die.

“Wait!” you say, “That’s two things!”

No, it’s not. Life and death are two aspects of the same thing. It is the fundamental duality of our nature. Now, the first part of that equation is generally accepted, but the second part is widely denied – hence the desire to split it into two separate items.

But it hasn’t always been like this. For most of human history, people have understood the connection – they were familiar and comfortable with death (even if it wasn’t to be desired). I’d even go so far as to say that much of the world today is still this way. It is really only in the last couple-three generations that those in the richer countries have lost a day-to-day connection with death.

Now, I lost my parents in my early adolescence, one to violence and the other to accident. I came to understand death, and mortality, just at the time when my world view was being shaped, just as I was developing the ability to understand the world in abstract terms. This made me different from most of my contemporaries, though more like how most humans have lived through history. Even through my crazy teen years I never once thought that I would live forever – I had no illusions; death could come suddenly and unexpectedly, and it would eventually come no matter what I might try to do to postpone it. And while most people eventually come to accept death intellectually, I think that without experiencing it as part of your understanding of the world, you tend to never really internalize it. The more people live with death – whether because of growing up with it, or being immersed in it due to war or disaster – the more they tend to understand and accept it. In insulating ourselves, and our children, from the experience of real death, I think we have cheated ourselves of an understanding of it.

And those things we do not understand – in our gut – we fear. And too often, those things we fear, we deny.

OK, so what am I going on about, talking about death here on this nice, bright, pleasant (but a bit cold) Saturday morning?

This: Universe Today ran a piece a couple of days ago about a proposal by Jim McLane, a NASA engineer for over 20 years who now works for a private engineering firm, to do a one-person, one-way trip to Mars. From the article:

A return to the “get it done” attitude of the 1960’s and a goal of a manned landing within a short time frame, like Apollo, is the only way we’ll get to Mars, McLane believes. Additionally, a no-return, solo mission solves many of the problems currently facing a round-trip, multiple person crew.

“When we eliminate the need to launch off Mars, we remove the mission’s most daunting obstacle,” said McLane. And because of a small crew size, the spacecraft could be smaller and the need for consumables and supplies would be decreased, making the mission cheaper and less complicated.

While some might classify this as a suicide mission, McLane feels the concept is completely logical.

“There would be tremendous risk, yes,” said McLane, “but I don’t think that’s guaranteed any more than you would say climbing a mountain alone is a suicide mission. People do dangerous things all the time, and this would be something really unique, to go to Mars. I don’t think there would be any shortage of people willing to volunteer for the mission. Lindbergh was someone who was willing to risk everything because it was worth it. I don’t think it will be hard to find another Lindbergh to go to Mars. That will be the easiest part of this whole program.”

Now, some variation of this idea has been kicked around previously, even going back to the early days of thinking about getting someone to the Moon. McLane is to be credited with pushing the idea, but it isn’t really original. I’ve seen variations of the idea in SF as well.

Read the column. There is some fudging about whether or not this is really a suicide trip, whether future tech would allow for the eventual return of the participant, or whether this first person would be the initial colonist for an outpost.

But what I found particularly interesting – and insightful – were the attitudes displayed in the extensive comments (almost 200 at the time I am writing this). You only need to sample these to find out that a lot of people are saying that it would be just horrid to “condemn someone to die” for a pointless trip to Mars.

Folks, here’s a reminder: we’re all already condemned to die. Only the timing and manner of our death is unknown.

Plenty of people do things that they know will carry a high risk of death. Some do it for a thrill – there is a decided adrenalin rush in thinking you are going to die (and I think that this explains the popularity of both horror flicks and various games where ‘death’ is a possibility). But for those who understand death, they engage in these risks with an acceptance that while death may come to them, the goal is still worthy. They might be misguided, or misinformed, miscalculating either the amount of risk or the worthiness of the goal. But they are nonetheless making a choice that does not come out of fear or ignorance of death – rather, it says that they think the possible timing and manner of their death is worth changing for the goal.

Because that is all you are actually doing when you take any kind of additional risk: saying, effectively, that you are willing to sacrifice some additional time living. You are *not* saying that you are willing to accept non-existence versus existence.  We are not “immortal unless killed” – we are going to die, sooner or later, in the fullness of time.  Get that in your head, and then deciding to do something like take a one-way ticket to Mars doesn’t seem so daunting.

Jim Downey

(Via MeFi.  Cross-posted to UTI.)



Ecclesiastes VIII 15

A good friend and I have a running joke about getting our six chickens and a goat, and retiring from the world to farm while things fall slowly into ruin.

But the thing is, it’s not a joke. Not really.

I’m not saying that everyone should fall into a paranoid spiral and become some kind of survivalist nut. I’m not ready to do that. But when you read something like this, it does make you wonder. An excerpt (please note, I added the embedded links in the following):

For decades, his [James Lovelock’s] advocacy of nuclear power appalled fellow environmentalists – but recently increasing numbers of them have come around to his way of thinking. His latest book, The Revenge of Gaia, predicts that by 2020 extreme weather will be the norm, causing global devastation; that by 2040 much of Europe will be Saharan; and parts of London will be underwater. The most recent Intergovernmental Panel on Climate Change (IPCC) report deploys less dramatic language – but its calculations aren’t a million miles away from his.

* * *

On the day we meet, the Daily Mail has launched a campaign to rid Britain of plastic shopping bags. The initiative sits comfortably within the current canon of eco ideas, next to ethical consumption, carbon offsetting, recycling and so on – all of which are premised on the calculation that individual lifestyle adjustments can still save the planet. This is, Lovelock says, a deluded fantasy. Most of the things we have been told to do might make us feel better, but they won’t make any difference. Global warming has passed the tipping point, and catastrophe is unstoppable.

“It’s just too late for it,” he says. “Perhaps if we’d gone along routes like that in 1967, it might have helped. But we don’t have time. All these standard green things, like sustainable development, I think these are just words that mean nothing. I get an awful lot of people coming to me saying you can’t say that, because it gives us nothing to do. I say on the contrary, it gives us an immense amount to do. Just not the kinds of things you want to do.”

Too late? Yeah, maybe so:

I opened the email to find an article about the most recent “comments and projections” by James Hansen. Hansen, you may know, is perhaps the most famous NASA climate change scientist. He’s the man who testified before Congress twenty years ago that the planet was warming and that people were the source of that warming. He’s the man who was pressured by senior officials at NASA, at the behest of the current administration, to tone down his reports about the impacts of climate change. Thankfully he seems to have resisted that pressure.

I read the article and then I read a related article by Bill McKibben. Hansen says, and McKibben underscores, that there is a critical maximum number of parts per million of carbon dioxide in the atmosphere to heed to prevent climatic catastrophe. That number, he says, is between 300 and 350.

* * *

Can you guess how many ppm of CO2 are in the atmosphere now? Slightly below 350? Slightly above?

We’re at 383 parts per million and counting, well past the number Hansen suggests is critical. We are past it by a lot. We were at 325 parts per million in 1970! Um, I don’t think we can just suck all that carbon back out, ask billions of people not to have been born, tear down all of those new suburban developments, return to non-fossil-based agriculture, and innocently pretend it’s thirty years ago.

So, what to do?

Well, that’s the problem. Lovelock says that you might as well enjoy life while you can, as much as you can, before the shit hits the fan. The second passage, from a very long blog entry evidently by Sally Erickson, explores some options but focuses on the need to convince people that the shit has essentially already hit the fan, in order to change behavior radically enough to have a hope of saving the world.

I am not sanguine about the prospects of making radical change, nor about what such change would really mean for our civil liberties. I think, unfortunately, that the mass of humanity just cannot deal with a problem until it becomes an actual, in-your-face emergency, but that once in it, we usually do a fairly decent job of slogging our way out.

This is one of the reasons that I decided to choose a pandemic flu as the cataclysm behind the ‘history’ of Communion of Dreams. As I have discussed previously, I made that decision for reasons of plotting, but also because I actually believe that we’ll likely experience some kind of mass die-off of humanity sometime in the next century, whether due to war, asteroid impact, plague, global warming or some other disaster. We’ve just been too lucky, too long.

But in a way, it is an odd sort of optimism, as reflected in the book, and as shared by James Lovelock (from the same Guardian article):

“There have been seven disasters since humans came on the earth, very similar to the one that’s just about to happen. I think these events keep separating the wheat from the chaff. And eventually we’ll have a human on the planet that really does understand it and can live with it properly. That’s the source of my optimism.”

And not to end it there, here’s a little something for counterpoint, I suppose:

Jim Downey

(Via MeFi here and here.)



