Communion Of Dreams


The shape of things to come.

From Chapter 3 of Communion of Dreams:

Apparent Gravity was the third major application of the theories set forth in Hawking’s Conundrum, the great opus of Stephen Hawking which was not published until after his death in the earlier part of the century. He hadn’t released the work because evidently even he couldn’t really believe that it made any sense. It was, essentially, both too simple and too complex. And since he had died just shortly before the Fire-flu, with the chaos that brought, there had been a lag in his theory being fully understood and starting to be applied.

But it did account for all the established data, including much of the stuff that seemed valid but didn’t fit inside the previous paradigms. Using his theories, scientists and engineers learned that the structure of space itself could be manipulated. The first major application led to practical, safe, and efficient fusion power. Rather than forcing high-energy particles together, the forces keeping them apart were just removed. Or, more accurately, the manifestation of space between them was inverted. It took very little energy, was easy to control, but only worked in a very localized fashion.

Then there’s this excellent non-technical explanation of a new theory of Shape Dynamics. An excerpt or two:

Their latest offering is something called “shape dynamics.” (If you’ve never heard of shape dynamics, that’s OK—neither have most physicists.) It could, of course, be a dead end, as most bold new ideas in physics are. Or it could be the next great revolution in our conception of the cosmos. Its supporters describe it as a new way of looking at gravity, although it could end up being quite a bit more than that. It appears to give a radical new picture of space and time—and of black holes in particular. It could even alter our view of what’s “real” in the universe.

* * *

In most situations, shape dynamics predicts what Einstein’s theory predicts. “For the vast majority of physical situations, the theories are equivalent,” Gryb says. In other words, the two frameworks are almost identical—but not quite.

Imagine dividing space-time up into billions upon billions of little patches. Within each patch, shape dynamics and general relativity tell the same story, Gryb says. But glue them all together, and a new kind of structure can emerge. For a concrete example of how this can happen, think of pulling together the two ends of a long, narrow strip of paper: Do it the usual way, and you get a loop; do it with a twist and you get a Möbius strip. “If you glue all the regions together to form a kind of global picture of space and time, then that global picture might actually be different.” So while shape dynamics may recreate Einstein’s theory on a small scale, the big-picture view of space and time may be novel.

Another prediction come true?

Not really — there were intimations of this theory when I was working on CoD, and it is a *very* long way from being accepted as valid, as the Nova article discusses.

But still …

 

Jim Downey

PS: a new review of Communion of Dreams was put up yesterday. Check it out.

 



Spinning wheels, got to go ’round.*

The reviews have been mixed, but one aspect of the new movie Interstellar is pretty cool: the rendering of the black hole it depicts. Even more so since it is as scientifically accurate as possible, based on close collaboration with noted astrophysicist Kip Thorne:

Still, no one knew exactly what a black hole would look like until they actually built one. Light, temporarily trapped around the black hole, produced an unexpectedly complex fingerprint pattern near the black hole’s shadow. And the glowing accretion disk appeared above the black hole, below the black hole, and in front of it. “I never expected that,” Thorne says. “Eugénie just did the simulations and said, ‘Hey, this is what I got.’ It was just amazing.”

In the end, Nolan got elegant images that advance the story. Thorne got a movie that teaches a mass audience some real, accurate science. But he also got something he didn’t expect: a scientific discovery. “This is our observational data,” he says of the movie’s visualizations. “That’s the way nature behaves. Period.” Thorne says he can get at least two published articles out of it.

The video is remarkable. Seriously. Go watch it.
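
A quick aside of my own, not from the article: for the simplest case of a non-rotating black hole, general relativity puts the photon sphere (where light can circle the hole) and the edge of the visible “shadow” at

\[ r_{\text{photon}} = \frac{3GM}{c^{2}}, \qquad b_{\text{shadow}} = 3\sqrt{3}\,\frac{GM}{c^{2}} \approx 2.6\,r_{s}, \quad \text{where } r_{s} = \frac{2GM}{c^{2}}. \]

Light that skims the photon sphere can loop partway (or several times) around the hole before escaping, so the far side of the accretion disk gets lensed into arcs above and below the shadow. That’s the “disk above, below, and in front” effect Thorne describes. The movie’s black hole is a rapidly spinning one, which complicates the details considerably, but the basic geometry is the same.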

And in a nice bit of serendipity, there’s another fantastic bit of astrophysics in the news just now: actual images of planetary genesis from ALMA. Check it out:

A new image from ALMA, the Atacama Large Millimeter/submillimeter Array, reveals extraordinarily fine detail that has never been seen before in the planet-forming disc around a young star. ALMA’s new high-resolution capabilities were achieved by spacing the antennas up to 15 kilometers apart [1]. This new result represents an enormous step forward in the understanding of how protoplanetary discs develop and how planets form.

ALMA has obtained its most detailed image yet showing the structure of the disc around HL Tau [2], a million-year-old Sun-like star located approximately 450 light-years from Earth in the constellation of Taurus. The image exceeds all expectations and reveals a series of concentric and bright rings, separated by gaps.

 

That’s not computer-rendered theory. That’s an actual image, showing the formation of planets around this very young star.

Wow.
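
A back-of-the-envelope aside of my own, not from the ESO release: the sharpness of an interferometer image is set roughly by the diffraction limit over its longest baseline. Assuming a wavelength of about 1 mm,

\[ \theta \approx 1.22\,\frac{\lambda}{B} \approx 1.22 \times \frac{1\ \text{mm}}{15\ \text{km}} \approx 8\times10^{-8}\ \text{rad} \approx 0.02''. \]

At HL Tau’s distance of roughly 450 light-years (about 140 parsecs), 0.02 arcseconds works out to a few astronomical units, which is why gaps carved at planetary-orbit scales are visible at all. Rough numbers only, but that is the point of spacing the antennas 15 kilometers apart.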

 

Jim Downey

*Naturally.



One reality or t’other.

From Chapter 3 of Communion of Dreams:

Apparent Gravity was the third major application of the theories set forth in Hawking’s Conundrum, the great opus of Stephen Hawking which was not published until after his death in the earlier part of the century. He hadn’t released the work because evidently even he couldn’t really believe that it made any sense. It was, essentially, both too simple and too complex. And since he had died just shortly before the Fire-flu, with the chaos that brought, there had been a lag in his theory being fully understood and starting to be applied.

But it did account for all the established data, including much of the stuff that seemed valid but didn’t fit inside the previous paradigms. Using his theories, scientists and engineers learned that the structure of space itself could be manipulated.

Of course, that is the reality of St. Cybi’s Well, not our own. In our reality, there’s been no fire-flu (at least yet), Stephen Hawking is still alive, and the laws of physics are still the same.

Well, maybe …

Black holes are in crisis. Well, not them, but the people who think about them, theoretical physicists who try to understand the relationship between the two pillars of modern physics, general relativity and quantum physics. Judging from the current discussions, one of the two must go, at least in their present formulation. On January 22nd, Stephen Hawking posted a paper where he bluntly stated that black holes, in the sense of being objects that can trap light and everything else indefinitely, are no more. And that’s a big deal.

Sometimes I wonder what reality I am actually plugged into, since it seems that I keep getting leaks from the other one.

 

Jim Downey



Whither SF?

I’ve mentioned Charlie Stross several times here. As I’ve said previously: smart guy, good writer. I disagree with his belief in mundane science fiction, because I think that it is too limited in imagination. Which leads almost inevitably to this formulation on his blog today (and yes, you should go read the whole thing):

We people of the SF-reading ghetto have stumbled blinking into the future, and our dirty little secret is that we don’t much like it. And so we retreat into the comfort zones of brass goggles and zeppelins (hey, weren’t airships big in the 1910s-1930s? Why, then, are they such a powerful signifier for Victorian-era alternate fictions?), of sexy vampire-run nightclubs and starship-riding knights-errant. Opening the pages of a modern near-future SF novel now invites a neck-chillingly cold draft of wind from the world we’re trying to escape, rather than a warm narcotic vision of a better place and time.

And so I conclude: we will not inspire anyone with grand visions of a viable future through the medium of escapism. If we want to write inspirational literature with grand visions we need to dive into the literary mainstream (which is finally rediscovering fabulism) and, adding a light admixture of Enlightenment ideology along the way, start writing the equivalent of those earnest and plausible hyper-realistic tales of Progress through cotton-planting on the shores of the Aral sea.

But do you really want us to do that? I don’t think so. In fact, the traditional response of traditional-minded SF readers to the rigorous exercise of extrapolative vision tends to be denial, disorientation, and distaste. So let me pose for you a different question, which has been exercising me for some time: If SF’s core message (to the extent that it ever had one) is obsolete, what do we do next?

Well, I dunno about Charlie, but I plan on writing a couple of prequels to Communion of Dreams, a book which I understand has touched something of a nerve in people precisely *because* it is hopeful in the face of a harsh reality.

Jim Downey

(PS: sometime today we should break through the level of 500 total sales/loans of CoD so far this month. Which is almost twice the previous month’s tally. Thanks for affirming my vision, folks!)



This ‘n that.

Several things of interest, some personal, some news, some related to the book . . .

* * *

I am struck by how powerful just random chance is in determining the course of events. Whether you agree with the Administration’s handling of it or not, just consider how the BP oil leak in the Gulf has come to dominate the attention and course of politics. Who could have predicted that, of all the things happening in the world, this would be the one to dominate? It’s like getting in a car crash – it sort of shuts out every other factor in your life.

* * *

A couple of people have sent me a link to the NYT item “Merely Human? That’s So Yesterday.” It’s a long piece, and worth reading.

I’ve written about Ray Kurzweil and the Singularity previously. Simply put, I find the idea interesting but unconvincing. Kurzweil and the others involved in this ‘Singularity University’ are smart people, and I like that they are pushing for research and the development of technology which will benefit all, but it strikes me mostly as the technological equivalent of the ‘afterlife’ of most religions – more hope than reality. This quote from the article sums up my thoughts pretty well:

William S. Bainbridge, who has spent the last two decades evaluating grant proposals for the National Science Foundation, also sides with the skeptics.

“We are not seeing exponential results from the exponential gains in computing power,” he says. “I think we are at a time where progress will be increasingly difficult in many fields.

“We should not base ideas of the world on simplistic extrapolations of what has happened in the past,” he adds.

It’s called the Law of Diminishing Returns.

* * *

Which isn’t to say that there cannot be revolutionary breakthroughs which could radically change our lives. I’ve also written about how hydrogen sulfide (H2S) seems to be connected to hibernation, and now comes a fairly breathtaking bit of news that is related:

Mystery Explained: How Frozen Humans Are Brought Back

Yeast and worms can survive hypothermia if they are first subjected to extreme oxygen deprivation, a new study finds.

The results could explain a long-held mystery as to how humans can be brought back to life after “freezing to death,” the scientists say.

The study uncovered a previously unknown ability of organisms to survive lethal cold by temporarily slowing the biological processes that maintain life.

But the really interesting bit was this:

Documented cases of humans successfully revived after spending hours or days without a pulse in extremely cold conditions first inspired Roth to study the relationship between human hypothermia and his own research in forced hibernation.

In the winter of 2001, the body temperature of Canadian toddler Erica Norby plunged to 61 degrees Fahrenheit (16 degrees Celsius) as she lay for hours in below-freezing weather after wandering outside wearing only a diaper. Apparently dead, she recovered completely after being re-warmed and resuscitated.

The same curious fate befell Japanese mountain climber Mitsutaka Uchikoshi in 2006, who was discovered with a core body temperature of 71 degrees F (22 degrees C) 23 days after falling asleep on a snowy mountain.

23 DAYS? Holy shit, I hadn’t been aware of that.

* * *

And lastly, you probably heard about this:

KABUL, Afghanistan – U.S. geologists have discovered vast mineral wealth in Afghanistan, possibly amounting to $1 trillion, President Hamid Karzai’s spokesman said Monday.

Waheed Omar told reporters the findings were made by the U.S. Geological Survey under contract to the Afghan government.

* * *

Americans discovered nearly $1 trillion in untapped mineral deposits in Afghanistan, including iron, copper, cobalt, gold and critical industrial metals like lithium, according to the report. The Times quoted a Pentagon memo as saying Afghanistan could become the “Saudi Arabia of lithium,” a key raw material in the manufacture of batteries for laptops and cell phones.

Sounds like a brilliant bit of good news? Think about it again. As someone on MetaFilter commented:

Oh man, I wish I could feel optimistic about this… but a homeless guy with no hope and no prospects, who finds a gold watch, still has no hope and no prospects, but now he’s in for a beating too.

Did you ever read The Prize? Same thing. The ore sources for some of these minerals are very rare, they are critical for many high-tech products, and there is going to be a scramble over who winds up in control of them.

* * *

Random chance rules our lives.

Jim Downey



The memory remains.

Just now, my good lady wife came through to tell me that she’s off to take a bit of a nap. Both of us are getting over a touch of something (which I had mentioned last weekend), and on a deeper level still recovering from the profound exhaustion of having been care-givers for her mom.

Anyway, as she was preparing to head off, one of our cats insisted on going through the door which leads from my office into my bindery. This is where the cat food is.

“She wants through.”

“She wants owwwwt.”

“Any door leads out, as far as a cat is concerned.”

“Well, that door did once actually lead out, decades ago.”

“She remembers.”

“She can’t remember.”

“Nonetheless, the memory lingers.”

* * * * * * *

Via TDG, a fascinating interview with Douglas Richard Hofstadter last year, now translated into English. I’d read his GEB some 25 years ago, and have more or less kept tabs on his work since. The interview was about his most recent book, and touched on a number of subjects of interest to me, including the nature of consciousness, writing, Artificial Intelligence, and the Singularity. It’s long, but well worth the effort.

In discussing consciousness (which Hofstadter calls ‘the soul’ for reasons he explains), and the survival of shards of a given ‘soul’, the topic of writing and music comes up. Discussing how Chopin’s music has enabled shards of the composer’s soul to persist, Hofstadter makes this comment about his own desire to write:

I am not shooting at immortality through my books, no. Nor do I think Chopin was shooting at immortality through his music. That strikes me as a very selfish goal, and I don’t think Chopin was particularly selfish. I would also say that I think that music comes much closer to capturing the essence of a composer’s soul than do a writer’s ideas capture the writer’s soul. Perhaps some very emotional ideas that I express in my books can get across a bit of the essence of my soul to some readers, but I think that Chopin’s music probably does a lot better job (and the same holds, of course, for many composers).

I personally don’t have any thoughts about “shooting for immortality” when I write. I try to write simply in order to get ideas out there that I believe in and find fascinating, because I’d like to let other people be able to share those ideas. But intellectual ideas alone, no matter how fascinating they are, are not enough to transmit a soul across brains. Perhaps, as I say, my autobiographical passages — at least some of them — get tiny shards of my soul across to some people.

Exactly.

* * * * * * *

In April, I wrote this:

I’ve written only briefly about my thoughts on the so-called Singularity – that moment when our technological abilities converge to create a new transcendent artificial intelligence which encompasses humanity in a collective awareness. As envisioned by the Singularity Institute and a number of Science Fiction authors, I think that it is too simple – too utopian. Life is more complex than that. Society develops and copes with change in odd and unpredictable ways, with good and bad and a whole lot in the middle.

Here’s Hofstadter’s take from the interview, responding to a question about Ray Kurzweil’s notion of achieving effective immortality by ‘uploading’ a personality into machine hardware:

Well, the problem is that a soul by itself would go crazy; it has to live in a vastly complex world, and it has to cohabit that world with many other souls, commingling with them just as we do here on earth. To be sure, Kurzweil sees those things as no problem, either — we’ll have virtual worlds galore, “up there” in Cyberheaven, and of course there will be souls by the barrelful all running on the same hardware. And Kurzweil sees the new software souls as intermingling in all sorts of unanticipated and unimaginable ways.

Well, to me, this “glorious” new world would be the end of humanity as we know it. If such a vision comes to pass, it certainly would spell the end of human life. Once again, I don’t want to be there if such a vision should ever come to pass. But I doubt that it will come to pass for a very long time. How long? I just don’t know. Centuries, at least. But I don’t know. I’m not a futurologist in the least. But Kurzweil is far more “optimistic” (i.e., depressingly pessimistic, from my perspective) about the pace at which all these world-shaking changes will take place.

Interesting.

* * * * * * *

Lastly, the interview is about the central theme of I Am a Strange Loop: that consciousness is an emergent phenomenon which stems from vast and subtle physical mechanisms in the brain. This is also the core ‘meaning’ of GEB, though that was often missed by readers and reviewers who got hung up on the ostensible themes, topics, and playfulness of that book. Hofstadter calls this emergent consciousness a self-referential hallucination, and it reflects much of his interest in cognitive science over the years.

[Mild spoilers ahead.]

In Communion of Dreams I played with this idea and a number of related ones, particularly pertaining to the character of Seth. It is also why I decided that I needed to introduce a whole new technology – based on the superfluid tholin-gel found on Titan – as the basis for the AI systems at the heart of the story. Because the gel is not human-manufactured, but rather something a bit mysterious. Likewise, the use of this material requires another sophisticated computer to ‘boot it up’, and then the gel itself is responsible for sustaining the energy matrix necessary for continued operation. At the culmination of the story, this ‘self-referential hallucination’ frees itself from its initial containment.

Why did I do this?

Partly in homage to Hofstadter (though you will find no mention of him in the book, as far as I recall). Partly because it plays with other ideas I have about the nature of reality. If we (conscious beings) are an emergent phenomenon, arising from physical activity, then it seems to me that physical things can be impressed with our consciousness. This is why I find his comments about shards of a soul existing beyond the life of the body of the person to be so intriguing.

So I spent some 130,000 words exploring that idea in Communion. Not overtly – not often anyway – but that is part of the subtext of what is going on in that book.

* * * * * * *

“Any door leads out, as far as a cat is concerned.”

“Well, that door did once actually lead out, decades ago.”

“She remembers.”

“She can’t remember.”

“Nonetheless, the memory lingers,” I said, “impressed on the door itself. Maybe the cat understands that at a level we don’t.”

Jim Downey

(Related post at UTI.)



Convergence.

When I went away to college in 1976, I took with me the small black & white television I had received for my eighth birthday. Mostly my roommates and I would watch The Muppet Show before going off to dinner. Otherwise, I really didn’t have the time for television – there was studying to do, drugs and alcohol to abuse, sex to have.

Post college I had a massive old console color TV I had inherited. But given that I lived in Montezuma, Iowa, reception was dismal. I found other things to do with my time, mostly SCA-related activities and gaming. I took that console set with me to graduate school in Iowa City, but it never really worked right, and besides I was still busy with SCA stuff and again with schoolwork.

For most of the ’90s I did watch some TV as it was being broadcast, but even then my wife and I preferred to time-shift using a VCR, skipping commercials and seeing the things we were interested in at times when it was convenient for us.

This century, living here and caring for someone with Alzheimer’s, we had to be somewhat more careful about selecting shows that wouldn’t contribute to Martha Sr’s confusion and agitation. Meaning mostly stuff we rented or movies/series we liked well enough to buy on DVD. I would now and then flip on the cable and skip around a bit after we got Martha Sr. to bed, see if there was anything interesting, but for the most part I relied on friends recommending stuff. And besides, I was busy working on Communion of Dreams, or blogging here or there, or writing a newspaper column or whatever.

Nowadays we don’t even have cable. There’s just no reason to pay for it. I’d much rather get my news and information online. So, basically, I have missed most every television show and special event in the last thirty years. There are vast swaths of cultural reference I only know by inference, television shows that “define” American values I’ve never seen. I don’t miss it.

And you know what? You are becoming like me, more and more all the time.

* * * * * * *

Via Cory Doctorow at BoingBoing, this very interesting piece by Clay Shirky:

Gin, Television, and Social Surplus

* * *

If I had to pick the critical technology for the 20th century, the bit of social lubricant without which the wheels would’ve come off the whole enterprise, I’d say it was the sitcom. Starting with the Second World War a whole series of things happened–rising GDP per capita, rising educational attainment, rising life expectancy and, critically, a rising number of people who were working five-day work weeks. For the first time, society forced onto an enormous number of its citizens the requirement to manage something they had never had to manage before–free time.

And what did we do with that free time? Well, mostly we spent it watching TV.

We did that for decades. We watched I Love Lucy. We watched Gilligan’s Island. We watch Malcolm in the Middle. We watch Desperate Housewives. Desperate Housewives essentially functioned as a kind of cognitive heat sink, dissipating thinking that might otherwise have built up and caused society to overheat.

And it’s only now, as we’re waking up from that collective bender, that we’re starting to see the cognitive surplus as an asset rather than as a crisis. We’re seeing things being designed to take advantage of that surplus, to deploy it in ways more engaging than just having a TV in everybody’s basement.

OK, I try and be very careful about “fair use” of other people’s work, limiting myself to just a couple of paragraphs from a given article or blog post in order to make a point. But while I say that you should go read his whole post, I’m going to use another passage from Shirky here:

Did you ever see that episode of Gilligan’s Island where they almost get off the island and then Gilligan messes up and then they don’t? I saw that one. I saw that one a lot when I was growing up. And every half-hour that I watched that was a half an hour I wasn’t posting at my blog or editing Wikipedia or contributing to a mailing list. Now I had an ironclad excuse for not doing those things, which is none of those things existed then. I was forced into the channel of media the way it was because it was the only option. Now it’s not, and that’s the big surprise. However lousy it is to sit in your basement and pretend to be an elf, I can tell you from personal experience it’s worse to sit in your basement and try to figure if Ginger or Mary Ann is cuter.

And I’m willing to raise that to a general principle. It’s better to do something than to do nothing. Even lolcats, even cute pictures of kittens made even cuter with the addition of cute captions, hold out an invitation to participation. When you see a lolcat, one of the things it says to the viewer is, “If you have some fancy sans-serif fonts on your computer, you can play this game, too.” And that message–I can do that, too–is a big change.

It is a huge change. It is the difference between passively standing/sitting by and watching, and doing the same thing yourself. Whether it is sports, or sex, or politics, or art – doing it yourself means making better use of the limited time you have in this life.

* * * * * * *

And now, the next component of my little puzzle this morning.

Via MeFi, this NYT essay about the explosion of authorship:

You’re an Author? Me Too!

It’s well established that Americans are reading fewer books than they used to. A recent report by the National Endowment for the Arts found that 53 percent of Americans surveyed hadn’t read a book in the previous year — a state of affairs that has prompted much soul-searching by anyone with an affection for (or business interest in) turning pages. But even as more people choose the phantasmagoria of the screen over the contemplative pleasures of the page, there’s a parallel phenomenon sweeping the country: collective graphomania.

In 2007, a whopping 400,000 books were published or distributed in the United States, up from 300,000 in 2006, according to the industry tracker Bowker, which attributed the sharp rise to the number of print-on-demand books and reprints of out-of-print titles. University writing programs are thriving, while writers’ conferences abound, offering aspiring authors a chance to network and “workshop” their work. The blog tracker Technorati estimates that 175,000 new blogs are created worldwide each day (with a lucky few bloggers getting book deals). And the same N.E.A. study found that 7 percent of adults polled, or 15 million people, did creative writing, mostly “for personal fulfillment.”

* * *

Mark McGurl, an associate professor of English at the University of California, Los Angeles, and the author of a forthcoming book on the impact of creative writing programs on postwar American literature, agrees that writing programs have helped expand the literary universe. “American literature has never been deeper and stronger and more various than it is now,” McGurl said in an e-mail message. Still, he added, “one could put that more pessimistically: given the manifold distractions of modern life, we now have more great writers working in the United States than anyone has the time or inclination to read.”

There’s an interesting discussion about this in that thread at MetaFilter. John Scalzi, no stranger at all to the world of blogging and online publishing, says this there:

I see nothing but upside in people writing and self-publishing, especially now that companies like Lulu make it easy for them to do so without falling prey to avaricious vanity presses. People who self-publish are in love with the idea of writing, and in love with the idea of books. Both are good for me personally, and good for the idea of a literate society moving forward.

Indeed. And it is pretty clearly a manifestation of what Shirky is talking about above.

I’ve written only briefly about my thoughts on the so-called Singularity – that moment when our technological abilities converge to create a new transcendent artificial intelligence which encompasses humanity in a collective awareness. As envisioned by the Singularity Institute and a number of Science Fiction authors, I think that it is too simple – too utopian. Life is more complex than that. Society develops and copes with change in odd and unpredictable ways, with good and bad and a whole lot in the middle.

For years, people have bemoaned how the developing culture of the internet is changing aspects of life for the worse. Newspapers are struggling. There’s the whole “Cult of the Amateur” nonsense. Just this morning on NPR there was a comment from a listener about how “blogs are just gossip”, in reaction to the new Sunday Soapbox political blog WESun has launched. And there is a certain truth to the complaints and hand-wringing. Maybe we just need to see this in context, though – that the internet is just one aspect of our changing culture, something which is shifting us away from being purely observers of the complex and confusing world around us, to being participants to a greater degree.

Sure, a lot of what passes for participation is fairly pointless, time-consuming crap in its own right. I am reminded of this brilliant xkcd strip. The activity itself is little better than just watching reruns of Gilligan’s Island or Seinfeld or whatever. But the *act* of participating is empowering, and instructive, and just plain good exercise – preparing the participant for being more involved, more in control of their own life and world.

We learn by doing. And if, by doing, we escape the numbing effects of being force-fed pablum from the television set for even a little while, that’s good. What if our Singularity is not a technological one, but a social one? What if, as people become more active, less passive, we actually learn to tap into the collective intelligence of humankind – not as a hive mind, but as something akin to an ideal Jeffersonian Democracy, updated to reflect the reality of modern culture?

I think we could do worse.

Jim Downey