Filed under: Genetic Testing, Preparedness, Science, Science Fiction, Synesthesia, Titan, Writing stuff
I’ve written previously about the emergence of consciousness and the role that the biochemical stew in our heads plays in awareness and cognition. But let’s get a little more basic in our analysis. Let’s consider fire. No, not the slow fire of chemical reactions in our bodies, but the actual burning of wood, and the role it may have played in the development of human intelligence.
* * * * * * *
The Greek myth of Prometheus bringing the holy fire of Zeus to mankind, and thereby enabling civilization, has usually been understood as an explanation of the role technology plays in human development. After all, fire allows us to control our environment, from living in colder climates to clearing land for farming to metallurgy. And, of course, to cooking food, making a wider range of nutrients available.
But what if fire made human thought itself possible?
Cooking and Cognition: How Humans Got So Smart
After two tremendous growth spurts — one in size, followed by an even more important one in cognitive ability — the human brain is now a lot like a teenage boy.
It consumes huge amounts of calories, is rather temperamental and, when harnessed just right, exhibits incredible prowess. The brain’s roaring metabolism, possibly stimulated by early man’s invention of cooking, may be the main factor behind our most critical cognitive leap, new research suggests.
OK, that article is a little light on actual information. So I went to check the research paper. It’s a bit thick, but the basic idea was to study the rise of human cognition via two methods:
In this study, we attempted to identify molecular mechanisms involved in the evolution of human-specific cognitive abilities by combining biological data from two research directions: evolutionary and medical. Firstly, we identify the molecular changes that took place on the human evolutionary lineage, presumably due to positive selection. Secondly, we consider molecular changes observed in schizophrenia, a psychiatric disorder believed to affect such human cognitive functions as the capacity for complex social relations and language [6–12]. Combining the two datasets, we test the following prediction: if a cognitive disorder, such as schizophrenia, affects recently evolved biological processes underlying human-specific cognitive abilities, we anticipate finding a significant overlap between the recent evolutionary and the pathological changes. Furthermore, if such significant overlap is observed, the overlapping biological processes may provide insights into molecular changes important for the evolution and maintenance of human-specific cognitive abilities.
Got that? First, you determine the differences due to evolution (specifically, the DNA/mRNA divergence between us and chimpanzees), then you see where the brains of people with schizophrenia differ from ‘normal’ brains. That can give you some indication of how cognition works, since schizophrenia is known to primarily affect cognition.
What did the researchers find?
In order to select human-specific evolutionary changes, we used the published list of 22 biological processes showing evidence of positive selection in terms of their mRNA expression levels in brain during recent human evolution [13]. Next, we tested whether expression of genes contained in these functional categories is altered in schizophrenia to a greater extent than expected by chance. To do this, we ranked 16,815 genes expressed in brain in order of probability of differential expression in schizophrenia, using data from a meta-analysis of 105 individuals profiled on 4 different microarray platforms in 6 independent studies [14]. We found that 6 of the 22 positively selected biological processes are significantly enriched in genes differentially expressed in schizophrenia (Wilcoxon rank sum test, p < 0.03, false discovery rate (FDR) = 11%), while only 0.7 would be expected to show such an enrichment by chance (Figure 1; Table S2 in Additional data file 1; Materials and methods). Strikingly, all six of these biological processes are related to energy metabolism. This is highly unexpected, given that there were only 7 biological processes containing genes involved in energy metabolism among the 22 positively selected categories (Figure 1; Table S2 in Additional data file 1). The mRNA expression changes observed in schizophrenia appear to be distributed approximately equally in respect to the direction of change, pointing towards a general dysregulation of these processes in the disease rather than a coordinated change (Table S3 in Additional data file 1).
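For those who want to see the mechanics of that test, here is a minimal sketch in Python of a Wilcoxon rank-sum enrichment test of the kind the paper describes. The gene count comes from the quote above; the scores and the category membership are invented stand-ins for the real meta-analysis data, so this illustrates only the shape of the method, not the actual result.

```python
# A minimal sketch of the enrichment test described above, using
# invented data. The real inputs would be the differential-expression
# rankings from the meta-analysis of the six microarray studies.
import numpy as np
from scipy.stats import ranksums

rng = np.random.default_rng(42)

n_genes = 16815  # genes expressed in brain, per the paper

# Hypothetical scores: higher = stronger evidence of differential
# expression in schizophrenia.
scores = rng.random(n_genes)

# Hypothetical positively selected category of 200 genes, nudged
# upward so the test has something to detect.
category = rng.choice(n_genes, size=200, replace=False)
scores[category] += 0.15

mask = np.zeros(n_genes, dtype=bool)
mask[category] = True

# Wilcoxon rank-sum test: do the category's genes rank higher for
# differential expression in schizophrenia than the rest of the genome?
stat, p = ranksums(scores[mask], scores[~mask])
print(f"rank-sum statistic = {stat:.2f}, p = {p:.2g}")
# The paper found 6 of 22 categories significant at p < 0.03
# (FDR = 11%), versus ~0.7 expected by chance.
```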
Simply put: it’s metabolism. The brain eats up a lot of energy, about 20% of all the energy you take in as food. That’s a lot – for chimps the number is about 13%, and for other vertebrates it runs 2–8%. The conclusion:
In this study we find a disproportionately large overlap between processes that have changed during human evolution and biological processes affected in schizophrenia. Genes relating to energy metabolism are particularly implicated for both the evolution and maintenance of human-specific cognitive abilities.
Using 1H NMR spectroscopy, we find evidence that metabolites significantly altered in schizophrenia have changed more on the human lineage than those that are unaltered. Furthermore, genes related to the significantly altered metabolites show greater sequence and mRNA expression divergence between humans and chimpanzees, as well as indications of positive selection in humans, compared to genes related to the unaltered metabolites.
Taken together, these findings indicate that changes in human brain metabolism may have been an important step in the evolution of human cognitive abilities. Our results are consistent with the theory that schizophrenia is a costly by-product of human brain evolution [11,37].
When did this take place? From the LiveScience article first cited:
The extra calories may not have come from more food, but rather from the emergence of pre-historic “Iron Chefs;” the first hearths also arose about 200,000 years ago.
In most animals, the gut needs a lot of energy to grind out nourishment from food sources. But cooking, by breaking down fibers and making nutrients more readily available, is a way of processing food outside the body. Eating (mostly) cooked meals would have lessened the energy needs of our digestion systems, Khaitovich explained, thereby freeing up calories for our brains.
* * * * * * *
In Communion of Dreams, I posit the use of “auggies” – drugs designed to maximize the utilization of neurotransmitters in the brain. When combined with increased sensory information thanks to technology, an artificial kind of synesthesia occurs, allowing for insights (artistic, cognitive) otherwise beyond human ability. But it is a cheat – you ‘burn up’ the available neurotransmitters quickly, accelerating brain function, but are then left less capable for a period of days afterward as the body replenishes them. This is by and large a metabolic function – the same way an athlete can burn through the energy stored in muscle in one brief effort, but then needs time to recover.
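To make the metabolic analogy concrete, here is a toy model of that trade-off in Python. Every number in it is invented – it mimics the burn-fast, replenish-slow shape of the mechanic in the novel, not any real neurochemistry.

```python
# A toy model of the "auggie" trade-off: a reserve (standing in for
# available neurotransmitters) is burned quickly for a short boost,
# then replenishes slowly over days. All numbers are invented.
RESERVE_FULL = 1.0
REPLENISH_PER_DAY = 0.15  # hypothetical slow recovery rate

def use_auggie(reserve, burn=0.7):
    """Burn most of the reserve for a brief boost; return (boost, remaining)."""
    boost = min(burn, reserve)
    return boost, reserve - boost

reserve = RESERVE_FULL
boost, reserve = use_auggie(reserve)
print(f"boost: {boost:.2f}, reserve left: {reserve:.2f}")

# Recovery: days of reduced capability while the body replenishes.
days = 0
while reserve < RESERVE_FULL:
    reserve = min(RESERVE_FULL, reserve + REPLENISH_PER_DAY)
    days += 1
print(f"back to baseline after {days} days")
```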
I wrote this with an instinctive understanding of the mechanism involved – we’ve all experienced something akin to this phenomenon, pushing ourselves mentally for a short period and then being tired and less able to think clearly afterwards. It’s a bit surprising to find that it may literally have been the same mechanism which led to the rise of human intelligence in the first place.
And as for the alien artifact on Titan, which causes a similar phenomenon? Just coincidence that Prometheus was one of the Titans in Greek mythology.
No, really – just coincidence.
Jim Downey
*and yes, I realize that this isn’t quite what The Doors meant.
Filed under: Art, Artificial Intelligence, Augmented Reality, CNET, Expert systems, Heinlein, Kromofons, Predictions, Psychic abilities, Science Fiction, Society, Synesthesia, tech, Writing stuff
A friend sent me a link to a CNET news item from last week about how a new ‘color alphabet’ was going to revolutionize communications. From the article:
Lee Freedman has waited a long time, but he thinks the moment is finally right to spring on the world the color alphabet he invented as a 19-year-old at Mardi Gras in 1972.
For 35 years, between stints as a doctor, a real estate agent and a pizza maker at the Woodstock concert in 1994, Freedman has been working on Kromofons–an innovative alphabet in which the 26 English letters are represented solely by individual colors–waiting for technology to catch up with him.
And now, thanks to the Internet, the ubiquity of color monitors, Microsoft Word plug-ins and his being able to launch a Kromofons-based e-mail system, Freedman thinks he is finally ready.
Well, maybe.
Science fiction authors have used various tricks for evolving language and written communication, one of the most memorable for me being Heinlein’s Speedtalk from the novella Gulf. And working in other senses is a common tactic, up to and including extra-sensory perception (such as telepathy). This is part of the way I use synesthesia in Communion of Dreams: as a method by which the human brain can layer meaning and information in new ways, expanding the potential for understanding the world. It is noteworthy that many synesthetes will associate colors with a given word or even letter – it’s possible that Lee Freedman drew upon such an experience to create his color alphabet.
(An aside – I have experienced mild episodes of synesthesia on several occasions. Sometimes these episodes have been induced by drugs, sometimes by intense concentration, sometimes they’ve come of their own accord. I think this is a latent ability everyone has, but not something we usually access, because it is poorly understood by the general populace.)
Anyway, while Kromofons or something similar is certainly possible in the context of computer displays (of almost any variety, including nano-tech paint), there are some real limitations that I can see. First off, you wouldn’t want to have to carry a full set of colored pencils or markers and keep switching between them just to write something down in the ‘real world’. Printed material of whatever variety would also be subject to degradation from light-fading: some pigments fade more quickly than others, some inks are more fragile than others, and some colors react to different lighting conditions in different ways. (Those are all problems I’ve experienced as a book & document conservator, and as the owner of an art gallery.) Even in the world of computer displays, variations in lighting and equipment could render some colors ‘untrue’. Not to mention the problems people experience as they age and their color perception skews, or the small but real percentage of the population with one form or another of color blindness. Sure, a good AI or expert system could ‘translate’ for people with such limitations in the context of augmented reality, but that tech is currently in its very infancy.
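To make that fragility concrete, here is a small hypothetical sketch. The CNET piece doesn’t publish Freedman’s actual letter-to-color mapping, so this version simply spreads the 26 letters evenly around the hue wheel and decodes by nearest hue – which is enough to show how little color drift (faded ink, a miscalibrated monitor) it takes to turn one letter into another.

```python
# A hypothetical Kromofons-style codec: 26 letters spread evenly
# around the hue wheel. The real mapping isn't published; this is
# an assumed stand-in to illustrate the decoding problem.
import colorsys
import string

LETTERS = string.ascii_lowercase

def letter_to_rgb(ch):
    """Encode a letter as a fully saturated RGB color (assumed mapping)."""
    hue = LETTERS.index(ch) / len(LETTERS)
    return colorsys.hsv_to_rgb(hue, 1.0, 1.0)

def rgb_to_letter(rgb):
    """Decode by nearest hue -- the step where 'untrue' colors cause misreads."""
    hue = colorsys.rgb_to_hsv(*rgb)[0]
    return LETTERS[round(hue * len(LETTERS)) % len(LETTERS)]

encoded = [letter_to_rgb(c) for c in "fire"]
print("".join(rgb_to_letter(rgb) for rgb in encoded))  # -> fire

# Drift the hue of 'f' by just 2% of the wheel (about half the gap
# between adjacent letters) and it decodes as the wrong letter:
drifted = colorsys.hsv_to_rgb(LETTERS.index("f") / 26 + 0.02, 1.0, 1.0)
print(rgb_to_letter(drifted))  # -> g
```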
So, while I enjoy a slightly nutty idea as much as the next person, and can see some ways that Kromofons could be used for fun, I don’t really see the idea going very far.
Jim Downey
