Communion Of Dreams


More than skin deep.

Good article at Scientific American about the coming generation of medical-monitoring technology. Excerpt:

“Why don’t we have a similar vision for our bodies?” wonders Gustafsson, an engineer whose team at the Swedish electronics company Acreo, based in Kista, is one of many around the world trying to make such a vision possible. Instead of letting health problems go undetected until a person ends up in hospital—the medical equivalent of a roadside breakdown—these teams foresee a future in which humans are wired up like cars, with sensors that form a similar early-warning system.

Working with researchers at Linköping University in Sweden, Gustafsson’s team has developed skin-surface and implanted sensors, as well as an in-body intranet that can link devices while keeping them private.

 

Gee, that sounds familiar. Here’s a passage from Chapter 15 of Communion of Dreams about the remote monitoring of a ship’s crew through their cyberware, documenting medical conditions during a crisis:

“Main drive has been disengaged, transit rotation to new heading begun. All human crew members of the ship are now experiencing severe physiological stress. Attempting to identify source of this event . . .”

“My god,” gasped someone.

“. . . expert Stepan has become unresponsive. Experts Rurik and Rika attempting to establish control of transit. Several human crew members have expired. Medical telemetry indicates cerebral hemorrhage in most cases. Other crew members experiencing symptoms of shock and heart attack. PC systems attempting to cope. All human crew members seem to be affected. None of the standard emergency protocols sufficient to counteract whatever is occurring. Transit has been stopped. Expert Stepan remains unresponsive. Source of event is indeterminate. There have been no detectable changes to any ship systems, nothing abnormal in environmental controls. Only eight human crew members remain alive, all are critical and unconscious. PC systems reporting imminent death of five of those crew members. Prognosis for remaining three is not good, death is expected within an hour. All medical telemetry will be compiled and transmitted on second channel. ”

 

Another excerpt from the SA article:

To get around that, Strano’s lab has developed synthetic, long-lived detector materials that can be mixed with a water-based gel and injected under the skin like a tattoo. The ‘ink’ for this tattoo consists of carbon nanotubes coated with dangling polymer strands, which have a lock-and-key chemical structure that recognizes biomarkers by dictating which molecules can dock with them. When biomarkers bind to the polymer, they subtly change the optical properties of the nanotube: shine a light on the tattoo, and a glow reveals the presence of the biomarker.

 

Again, from Communion of Dreams:

She nodded. “You know how the palmkey is installed and works, right?”

“Yeah, sure. It’s a thin film injected just under the skin, forms a fluid web across the palm that is programmed to function as a close-range transceiver. Simple enough.”

 

Predictions, predictions …

 

Jim Downey

Thanks to Tim for the heads-up!



“You remember the spider that lived in a bush outside your window? Orange body, green legs.”

Of late, as I have been slowly getting over the rather nasty bout of parainfluenza I mentioned previously, shedding the more annoying and disgusting symptoms, I’ve also come to realize that just now I am pulling out of the depressive trough of one of my long-term bipolar cycles.  It wasn’t a particularly bad trough, and was somewhat mitigated by the success of the Kickstarter back in the fall. Nonetheless, it was there, as I can see in hindsight.

I am frequently struck by just how much of our life doesn’t make sense until seen from a distance. Just recently I was surprised at the revelation of *why* the failure of Her Final Year to be more successful bothered me as much as it did: it was because I had seen the book as a way to create something positive (for the world) out of the experience of being a long-term care provider. To have the book reach only a limited audience was, in my mind, saying that our roles as caregivers didn’t matter.

Which isn’t true, of course, but that was the emotional reality which I had been dealing with. The “narrative truth”, if you will. A term I borrow from a very interesting meditation by Oliver Sacks at the New York Review of Books website titled Speak, Memory. From the article:

There is, it seems, no mechanism in the mind or the brain for ensuring the truth, or at least the veridical character, of our recollections. We have no direct access to historical truth, and what we feel or assert to be true (as Helen Keller was in a very good position to note) depends as much on our imagination as our senses. There is no way by which the events of the world can be directly transmitted or recorded in our brains; they are experienced and constructed in a highly subjective way, which is different in every individual to begin with, and differently reinterpreted or reexperienced whenever they are recollected. (The neuroscientist Gerald M. Edelman often speaks of perceiving as “creating,” and remembering as “recreating” or “recategorizing.”) Frequently, our only truth is narrative truth, the stories we tell each other, and ourselves—the stories we continually recategorize and refine. Such subjectivity is built into the very nature of memory, and follows from its basis and mechanisms in the human brain. The wonder is that aberrations of a gross sort are relatively rare, and that, for the most part, our memories are relatively solid and reliable.

Let me repeat one bit of that: “Frequently, our only truth is narrative truth, the stories we tell each other, and ourselves.”

I think this is at the very heart of why fiction has such power, and appeal. I also think that it explains the well-documented phenomenon of people believing things which are clearly and demonstrably false, if their facts come from a trusted source.

Little surprise that writers of fiction are aware of this very human trait, and have explored it in all manner of ways. I have a note here on my desk, a scrawl written on a scrap of paper some months ago as I was thinking through character motivations in St. Cybi’s Well, which says simply: “We take our truths from the people we trust.”

And here’s another example, from one of my favorite movies, exploring a favorite theme of Philip K. Dick’s:

 

That theme? The nature of reality.  And this is how the Sacks essay closes:

Indifference to source allows us to assimilate what we read, what we are told, what others say and think and write and paint, as intensely and richly as if they were primary experiences. It allows us to see and hear with other eyes and ears, to enter into other minds, to assimilate the art and science and religion of the whole culture, to enter into and contribute to the common mind, the general commonwealth of knowledge. This sort of sharing and participation, this communion, would not be possible if all our knowledge, our memories, were tagged and identified, seen as private, exclusively ours. Memory is dialogic and arises not only from direct experience but from the intercourse of many minds.

In other words, reality is a shared construct. A Communion of Dreams, if you will.

Time for me to get back to work.

 

Jim Downey



Emit fo worra.*
December 16, 2009, 11:27 am
Filed under: Art, Cosmic Variance, Science, Scientific American, YouTube

Nice – here’s another show for “The Explosions Channel”:

Jim Downey

(Via MeFi. *Apologies to Sean Carroll.)



Clever monkeys, part II.

OK, this was kicking around in the back of my head when I wrote the post the other day, because I have had a page from the June 6th Economist sitting on my bench for the last several weeks, waiting for me to get around to writing about it.

About what? Us clever monkeys. Well, more accurately, our genes, but for purposes of discussion here I will say the two are functionally the same over the time span I wish to address. (Which, when you think about it, is a rather profound notion. No, this is not my idea.)

The idea discussed in the article is this: that the development of modern human culture was dependent not on intelligence, but on something more basic – survival in sufficient numbers. Specifically, on population density:

In their model, Dr Thomas and his colleagues divided a simulated world into regions with different densities of human groups. Individuals in these groups had certain “skills”, each with an associated degree of complexity. Such skills could be passed on, more or less faithfully, thus yielding an average level of skills that could vary over time. The groups could also exchange skills.

The model suggested that once more than about 50 groups were in contact with one another, the complexity of skills that could be maintained did not increase as the number of groups increased. Rather, it was population density that turned out to be the key to cultural sophistication. The more people there were, the more exchange there was between groups and the richer the culture of each group became.

Dr Thomas therefore suggests that the reason there is so little sign of culture until 90,000 years ago is that there were not enough people to support it. It is at this point that a couple of places in Africa—one in the southernmost tip of the continent and one in eastern Congo—yield signs of jewellery, art and modern weapons. But then they go away again. That, Dr Thomas suggests, corresponds with a period when human numbers shrank. Climate data provides evidence this shrinkage did happen.
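
Out of curiosity, here’s a toy version of what that kind of simulation looks like. To be clear, this is just my own minimal sketch of the mechanism described above – not the researchers’ actual model or code, and the parameter values are arbitrary. Skills get copied imperfectly from one generation to the next, so an isolated group slowly loses them; a group in contact with more neighbors has better odds of copying from someone who has kept (or improved) a skill.

import random

def run_model(n_groups, contact_prob, generations=500,
              fidelity=0.95, innovation=0.02, seed=1):
    """Mean skill level after simulating noisy cultural transmission.

    n_groups is a stand-in for population density: the number of
    groups within contact range of one another.
    """
    rng = random.Random(seed)
    skills = [1.0] * n_groups
    for _ in range(generations):
        new_skills = []
        for i in range(n_groups):
            # Learn from the best model available: yourself, or (with
            # probability contact_prob) a randomly contacted group.
            teacher = skills[i]
            if n_groups > 1 and rng.random() < contact_prob:
                other = rng.choice([s for j, s in enumerate(skills) if j != i])
                teacher = max(teacher, other)
            # Copying is imperfect, so skill erodes each generation;
            # rare innovation occasionally adds some back.
            skill = teacher * fidelity
            if rng.random() < innovation:
                skill += 0.5
            new_skills.append(skill)
        skills = new_skills
    return sum(skills) / n_groups

for n in (2, 10, 50):
    print(n, "groups in contact:", round(run_model(n, contact_prob=0.5), 3))

The point is the mechanism: the denser, better-connected populations end up maintaining higher average skill levels than the sparse ones, even though no individual in the model is any “smarter” than any other.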

Now, this is a fairly old trope in Science Fiction: that some cataclysm can result in the complete collapse of society, to the extent that most if not all knowledge and technology is lost. Just look at The Time Machine to see how far back this idea goes – and it has been used countless times since. I play off this trope for Communion of Dreams in a couple of ways, of course, using it both as backstory for the novel and as part of the eventual revelation at the end of the book.

It is interesting to see this intuitive idea borne out by some science (though it sounds to me like there’s still a fair amount of work to be done to establish that the theory is correct). And not just because it addresses some curious discontinuities in the archeological record. Rather, it says that intelligence has considerable staying power, at least in our species. Sure, it may not be a sufficient factor in supporting true civilization, but knowing that at least in our case it can last some 100,000 years gives one hope for it lasting a while elsewhere, even if the civilizations it builds do not.

Jim Downey



The choices we make.
July 27, 2008, 9:50 am
Filed under: Alzheimer's, Health, Hospice, Science, Scientific American, Society

The human mind is a remarkable device. Nevertheless, it is not without limits. Recently, a growing body of research has focused on a particular mental limitation, which has to do with our ability to use a mental trait known as executive function. When you focus on a specific task for an extended period of time or choose to eat a salad instead of a piece of cake, you are flexing your executive function muscles. Both thought processes require conscious effort—you have to resist the temptation to let your mind wander or to indulge in the sweet dessert. It turns out, however, that use of executive function—a talent we all rely on throughout the day—draws upon a single resource of limited capacity in the brain. When this resource is exhausted by one activity, our mental capacity may be severely hindered in another, seemingly unrelated activity. (See here and here.)

Imagine, for a moment, that you are facing a very difficult decision about which of two job offers to accept. One position offers good pay and job security, but is pretty mundane, whereas the other job is really interesting and offers reasonable pay, but has questionable job security. Clearly you can go about resolving this dilemma in many ways. Few people, however, would say that your decision should be affected or influenced by whether or not you resisted the urge to eat cookies prior to contemplating the job offers. A decade of psychology research suggests otherwise. Unrelated activities that tax the executive function have important lingering effects, and may disrupt your ability to make such an important decision. In other words, you might choose the wrong job because you didn’t eat a cookie.

Read the whole thing.

* * * * * * *

Almost a year ago I wrote this:

There’s a phenomenon familiar to those who deal with Alzheimer’s. It’s called “sundowning”. There are a lot of theories about why it happens; my own pet one is that someone with this disease works damned hard all day long to try and make sense of the world around them (which is scrambled to their perceptions and understanding), and by late in the afternoon or early evening, they’re just worn out. You know how you feel at the end of a long day at work? Same thing.

* * * * * * *

We cared for Martha Sr for about four years.  Well, we were here helping her for a couple of years prior to that.  But the nearly constant caregiving lasted for about four, growing in intensity during that time, culminating with nearly six months of actual hospice care.

That was a long time.  But my wife and I had each other, and it could have been longer.

That same day, a hospice patient named Michelle passed away. She was only 50 years old. She’d been battling MS for over 20 years. Debra is dispatched to her home.

The little brown house is shrouded by trees. Stray cats eat free food on the rusted red porch. Inside, Michelle lies in her hospital bed with her eyes slightly open. Debra’s there to help Michelle’s husband Ross. He quit his job in 2000 to take care of his wife.

“So eight years,” Debra says.

“She was permanently bedridden,” Ross replies. “This is the way it’s been. But like everything in life, it all comes to an end I guess.”

His voice sounds steady when he speaks, but his eyes are full of tears as he remembers his wife.

“I’ve never seen a woman fight something like she did,” Ross says. “She spent years on that walker because she knew when she got in a chair she’d never get out. The pain it caused her.”

Ross talks for more than an hour. Debra listens and commiserates. It’s at these moments, even more than when she’s providing medical care, that Debra feels her work is appreciated.

Appreciated, indeed.

* * * * * * *

Jim Downey



Reality is what happens to you while you’re busy coming up with other theories.*

*Apologies to both John Lennon and Philip K. Dick.

Last Saturday, my sister and her husband came to town, and we celebrated Thanksgiving.  Yes, about six months late.

* * * * * * *

About two weeks ago Sean Carroll of Cosmic Variance had a teaser post up about a new article of his in Scientific American.  Carroll has long been one of my favorite reads in cosmology, and his discussion of the cosmological basis for time’s arrow was delightful.  From the opening of the article:

Among the unnatural aspects of the universe, one stands out: time asymmetry. The microscopic laws of physics that underlie the behavior of the universe do not distinguish between past and future, yet the early universe—hot, dense, homogeneous—is completely different from today’s—cool, dilute, lumpy. The universe started off orderly and has been getting increasingly disorderly ever since. The asymmetry of time, the arrow that points from past to future, plays an unmistakable role in our everyday lives: it accounts for why we cannot turn an omelet into an egg, why ice cubes never spontaneously unmelt in a glass of water, and why we remember the past but not the future. And the origin of the asymmetry we experience can be traced all the way back to the orderliness of the universe near the big bang. Every time you break an egg, you are doing observational cosmology.

The arrow of time is arguably the most blatant feature of the universe that cosmologists are currently at an utter loss to explain. Increasingly, however, this puzzle about the universe we observe hints at the existence of a much larger spacetime we do not observe. It adds support to the notion that we are part of a multiverse whose dynamics help to explain the seemingly unnatural features of our local vicinity.

Carroll goes on to explore what those hints (and the implications of same) are in some detail, though all of it is suitable for a non-scientist.  The basic idea of how to reconcile the evident asymmetry is to consider our universe, as vast and ancient as it is, as only one small part of a greater whole.  We are living, as it were, in a quantum flux of the froth of spacetime of a larger multiverse:

This scenario, proposed in 2004 by Jennifer Chen of the University of Chicago and me, provides a provocative solution to the origin of time asymmetry in our observable universe: we see only a tiny patch of the big picture, and this larger arena is fully time-symmetric. Entropy can increase without limit through the creation of new baby universes.

Best of all, this story can be told backward and forward in time. Imagine that we start with empty space at some particular moment and watch it evolve into the future and into the past. (It goes both ways because we are not presuming a unidirectional arrow of time.) Baby universes fluctuate into existence in both directions of time, eventually emptying out and giving birth to babies of their own. On ultralarge scales, such a multiverse would look statistically symmetric with respect to time—both the past and the future would feature new universes fluctuating into life and proliferating without bound. Each of them would experience an arrow of time, but half would have an arrow that was reversed with respect to that in the others.

A tantalizing hint of a larger picture, indeed.

* * * * * * *

Philip K. Dick, tormented mad genius that he was, said something that has become a touchstone for me:  “Reality is that which, when you stop believing in it, doesn’t go away.”

It is, in fact, a large part of the basis for my skeptical attitude towards life.  But it also leaves open the idea of examining and incorporating new information which might be contrary to my beliefs.  It is this idea which I explored over the 132,000 words of Communion of Dreams, though not everyone realizes this at first reading.

But what if reality only exists if you believe in it?

That’s a question discussed in another longish piece of science writing in the current issue of Seed Magazine, titled The Reality Tests:

Most of us would agree that there exists a world outside our minds. At the classical level of our perceptions, this belief is almost certainly correct. If your couch is blue, you will observe it as such whether drunk, in high spirits, or depressed; the color is surely independent of the majority of your mental states. If you discovered your couch were suddenly red, you could be sure there was a cause. The classical world is real, and not only in your head. Solipsism hasn’t really been a viable philosophical doctrine for decades, if not centuries.

But that reality goes right up against one of the basic notions of quantum mechanics: the Heisenberg Uncertainty Principle.  Or does it?  For decades, the understanding was that quantum effects applied only at the atomic-and-smaller level.  Only in such rare phenomena as a Bose-Einstein Condensate (which in Communion is the basis for some of the long-range sensors being used to search for habitable planets outside our solar system) were quantum effects seen at a macroscopic scale.  But in theory, maybe our whole reality operates at a quantum level, regardless of scale:

Brukner and Kofler had a simple idea. They wanted to find out what would happen if they assumed that a reality similar to the one we experience is true—every large object has only one value for each measurable property that does not change. In other words, you know your couch is blue, and you don’t expect to be able to alter it just by looking. This form of realism, “macrorealism,” was first posited by Leggett in the 1980s.

Late last year Brukner and Kofler showed that it does not matter how many particles are around, or how large an object is, quantum mechanics always holds true. The reason we see our world as we do is because of what we use to observe it. The human body is a just barely adequate measuring device. Quantum mechanics does not always wash itself out, but to observe its effects for larger and larger objects we would need more and more accurate measurement devices. We just do not have the sensitivity to observe the quantum effects around us. In essence we do create the classical world we perceive, and as Brukner said, “There could be other classical worlds completely different from ours.”

Indeed.
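
An aside of my own, since the Seed piece doesn’t spell it out: Leggett’s “macrorealism” is not just a philosophical stance, it makes a quantitative prediction. The standard form is the Leggett-Garg inequality. For a property Q that can only take the values +1 or -1, measured at three successive times t1 < t2 < t3,

\[ K \equiv C_{21} + C_{32} - C_{31} \le 1, \qquad C_{ij} = \langle Q(t_i)\, Q(t_j) \rangle . \]

Any world in which Q always has a definite value (and measuring it doesn’t disturb it) obeys that bound; quantum mechanics allows K as large as 3/2 for a two-level system. A measured K > 1 is what rules the macrorealist picture out.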

* * * * * * *

Last Saturday, my sister and her husband came to town, and we celebrated Thanksgiving.  Yes, about six months late.  Because last year, going into the usual Thanksgiving holiday, we had our hands full caring for Martha Sr and didn’t want to subject her to the disconcerting effect of having ‘strangers’ in the house.  Following Martha Sr’s death in February, other aspects of life had kept either my sister or us busy and unable to schedule a time to get together.

Until last weekend.  And that’s OK.  Because life is what we make of it.  Whether that applies to cosmology or not I’ll leave up to the scientists and philosophers for now (though I have weighed in on the matter as mentioned above and reserve the right to do so again in other books).  This I can tell you – it was good to see my sister and her husband, and the turkey dinner we ate was delicious.

Jim Downey



I admit it. I’m an addict.

Hello, my name is Jim. I’ve got a writing problem.

Via PZ and Evolutionblog, news that blogging (and writing in general) is actually a therapeutic form of self-medication:

Blogging–It’s Good for You

Self-medication may be the reason the blogosphere has taken off. Scientists (and writers) have long known about the therapeutic benefits of writing about personal experiences, thoughts and feelings. But besides serving as a stress-coping mechanism, expressive writing produces many physiological benefits. Research shows that it improves memory and sleep, boosts immune cell activity and reduces viral load in AIDS patients, and even speeds healing after surgery. A study in the February issue of the Oncologist reports that cancer patients who engaged in expressive writing just before treatment felt markedly better, mentally and physically, as compared with patients who did not.

Scientists now hope to explore the neurological underpinnings at play, especially considering the explosion of blogs. According to Alice Flaherty, a neuroscientist at Harvard University and Massachusetts General Hospital, the placebo theory of suffering is one window through which to view blogging. As social creatures, humans have a range of pain-related behaviors, such as complaining, which acts as a “placebo for getting satisfied,” Flaherty says. Blogging about stressful experiences might work similarly.

Flaherty, who studies conditions such as hypergraphia (an uncontrollable urge to write) and writer’s block, also looks to disease models to explain the drive behind this mode of communication. For example, people with mania often talk too much. “We believe something in the brain’s limbic system is boosting their desire to communicate,” Flaherty explains. Located mainly in the midbrain, the limbic system controls our drives, whether they are related to food, sex, appetite, or problem solving. “You know that drives are involved [in blogging] because a lot of people do it compulsively,” Flaherty notes. Also, blogging might trigger dopamine release, similar to stimulants like music, running and looking at art.

OK, I don’t know about doing it ‘compulsively’, but I do know that writing has always been a way for me to cope with stressful events in my life, and I can honestly say that writing about caring for Martha Sr for the last year of her life with Alzheimer’s helped me keep some hold on my sanity.

Likewise, writing at UTI about the absurdities of modern life, with a particular emphasis on the effect of religion and politics, allows me to blow off a little steam and keep things in perspective. Some dialog with others, getting feedback and another perspective, also helps, and is the appeal (to me) of blogging over just writing for myself. This blog has a different emphasis, though there is some overlap (which is why I cross-post a fair amount between the two). I tend to be more personal here, and to tie things more often to the vision of the future portrayed in Communion of Dreams.

And as addictions go, it’s a lot less self-destructive than many options.

Jim Downey

(A slightly different version of this is at UTI.)



It’s a little weird . . .
April 24, 2008, 11:06 am
Filed under: Science, Scientific American, tech

I’ll turn 50 in a couple of months. It’s a little weird to realize that barely more time than that separates my date of birth from the first powered flight of the Wright brothers.

But, via TDG, this delightful bit from Scientific American:

100 Years Ago in Scientific American:

The Wright Brothers’ First Flight

An article from the May 1908 issue of Scientific American

Complete with the text and cover from that issue.

Wild.

Jim Downey



Fun with quantum weirdness.

Scientific American has a great piece about making a quantum eraser at home, with complete explanations of the science behind the experiment. Even if you don’t want to try the experiment yourself, read the article and be sure to go through the slide show of the experiment. Wonderful stuff, and explained very well.
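
If you don’t follow the link, here’s the essence of the trick in a few lines of math. This is my own quick gloss – the article’s tabletop setup differs in its details, but the logic is the same. Mark the two paths A and B with orthogonal polarizations H and V, and the light carries “which-path” information:

\[ |\psi\rangle = \tfrac{1}{\sqrt{2}}\bigl( |A\rangle|H\rangle + |B\rangle|V\rangle \bigr) \quad\Longrightarrow\quad P(x) \propto |\psi_A(x)|^2 + |\psi_B(x)|^2 . \]

No cross term survives (because \langle H|V\rangle = 0), hence no interference fringes. Add one more polarizer at 45 degrees in front of the screen, projecting both paths onto |D\rangle = (|H\rangle + |V\rangle)/\sqrt{2}, and the which-path tag is erased:

\[ P_D(x) \propto |\psi_A(x) + \psi_B(x)|^2 . \]

The fringes come back. It’s the erasing of the information, not any touching of the paths themselves, that restores the interference.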

I love physics, and when I was young (up into junior-senior high school) wanted to go into some branch of physics as a career. Alas, I discovered that my aptitude for math wasn’t sufficient, and followed other interests. But I have continued to read and keep up with the advance of physics on a ‘popular science’ sort of level. And I have enough respect for the things that science has provided us that I tried to stay true to known science when writing Communion.

[Spoiler alert.] The single biggest leap in Communion is the bit in chapter three where I talk about Hawking’s Conundrum – the supposed breakthrough treatise that Stephen Hawking writes, but leaves to be published after his death. I only describe the revolution in physics that it creates and the technology that it enables; I don’t actually try to explain how it works. Because I’m just not that smart, nor even knowledgeable enough to fake being that smart. Fortunately, the standards of science fiction writing are such that we don’t actually have to come up with complete explanations for everything.

Still, if you grant the supposition that such a breakthrough in theory does occur – one which somehow resolves some of the glitches in both quantum mechanics and string theory, the same way that the theory of relativity resolved some of the problems with Newtonian physics – and then add in sufficient time for the implications of the theory to be understood and applied, most of what follows in the book should be accurate. No, really.

I am reminded of a half-remembered anecdote (and if anyone can remember it more completely, please drop me a note or leave a comment – I’d be much obliged). I believe that it was Jerry Siegel, one of the creators of Superman, who once was challenged by a reader asking how Superman was able to do some particular thing. Siegel replied that he could say that it was due to this, or that, but that basically it was because he created the character and said so.

I’ve always loved that. Yeah, sure, you want to have enough plausibility to allow the reader to suspend their disbelief, but when fans get so wrapped up in all the insane details of some piece of fiction (whether it is Superman, Star Trek, or Harry Potter), then I can’t help but feel that they’ve lost a bit of perspective, and can no longer appreciate the forest for being focused on which particular lichen is growing on a rock at the foot of one of the trees. Don’t get me wrong – I would *love* to have the kind of fan base that would so get into my book that they would get sucked into the minutiae of my universe – but the larger story in each of those cases is more important, just as I like to think that the larger story of Communion is more important than the details of the tech used.

Jim Downey