Razib Khan One-stop-shopping for all of my content

September 17, 2011

Out of Africa’s end?

The BBC has a news report up gathering reactions to a new PLoS ONE paper, The Later Stone Age Calvaria from Iwo Eleru, Nigeria: Morphology and Chronology. This paper reports on remains found in Nigeria which date to ~13,000 years B.P. that exhibit a very archaic morphology. In other words, they may not be anatomically modern humans. A few years ago this would have been laughed out of the room, but science moves. Here is Chris Stringer in the BBC piece:

“[The skull] has got a much more primitive appearance, even though it is only 13,000 years old,” said Chris Stringer, from London’s Natural History Museum, who was part of the team of researchers.

“This suggests that human evolution in Africa was more complex… the transition to modern humans was not a straight transition and then a cut off.”

Prof Stringer thinks that ancient humans did not die away once they had given rise to modern humans.

They may have continued to live alongside their descendants in Africa, perhaps exchanging genes with them, until more recently than had been thought.

In the broad outlines most people still seem to hold that within the last ~100,000 years there was a major demographic pulse which swept out of Africa and populated the rest of the world. Something special did happen. Oceania and the New World were settled by the descendants of anatomically modern humans, whom we can trace back to Africa. The key modifications to the old model seem to be two-fold:

1) The possibility of admixture with other lineages on the way out

2) The sublocalization of the “Out of Africa” scenario, and further admixture with lineages within Africa

There have long been debates about an East or South African ur-heimat for the first anatomically modern humans. Others are now even positing a North African origin! To a great extent I wonder if a West or Central African origin is foreclosed in part by the paucity of fossil remains, a consequence of the unfavorable conditions for preservation.

However the details shake out, the story seems to be getting more, not less, complicated. This makes for less pithy one-liners for the media, but also more work for scientists. Figuring out stuff can be fun!

July 13, 2011

What one (or more) genomes can tell us

The Pith: We are now moving from the human genome project to the human genomes project. As more and more full genomes of various populations come online, new methods will arise to take advantage of the surfeit of data. In this paper the authors crunch through the genomes of half a dozen individuals to make sweeping inferences about the history of the human species over the past few hundred thousand years.

Since the integration of evolution and genetics in the early years of the 20th century there have been several revolutions in our ability to perceive the underlying variation which is the raw material and result of evolutionary genetics. The understanding that DNA was the concrete substrate of Mendelian genetics, and the rise to prominence of molecular genetic techniques in understanding evolution in the 1970s and 1980s, was one key transition. No longer were geneticists simply tracking the coat colors of mice or the visible mutations of fruit flies. In the 1990s the uniparental loci, the maternal and paternal lineages as inferred from mtDNA and the Y chromosome, came into their own. Finally, the 2000s saw the post-genomic era, and researchers routinely began analyzing ...

February 3, 2011

Dragon Bone Hill: An Ice-Age Saga of Homo erectus

Link to review: Dragon’s Battles

January 27, 2011

The scions of Shem?

The media is reporting rather breathlessly a new find out of Arabia which seems to push the presence of anatomically modern humans in this region much further back (more accurately, the archaeology was so sparse that assessments of human habitation seem to have been made in a vacuum, from absence of evidence). Here is the major objection:

This idea is at odds with a proposal advanced by Richard Klein, a paleoanthropologist at Stanford University, that the emergence of some social or behavioral advantage — like the perfection of the faculty for language — was required for modern humans to overcome the surrounding human groups. Some kind of barrier had to be surmounted, it seems, or modern humans could have walked out of Africa 200,000 years ago.

Dr. Klein said that the Uerpmann team’s case for an earlier out-of-Africa expansion was “provocative, but in the absence of human remains, it’s not compelling.”

The stone tools of this era are all much alike, and it is hard to tell whether early modern humans or Neanderthals made them. At the sites of Skhul and Qafzeh in what is now Israel, early modern humans were present around 100,000 years ago and Neanderthals at 60,000 years, ...

After the evolutionary revolution


My post The paradigm is dead, long live the paradigm! expressed to some extent my befuddlement at the current state of human evolutionary genetics and paleoanthropology. After the review of the paper on possible elevated admixture with Neandertals at the dystrophin locus a friend emailed, “Remember when we thought everything would be so simple once we could finally see this stuff?” Indeed I do remember. The fact that things aren’t simple is very exhilarating, but it is also a major check on theoretical clarity. Science is after all not a collection of facts; it is in part facts which one can sieve through an analytic framework.

In hindsight with the relative robustness of ancient DNA results we can make some assessments about the role of human bias within particular heuristic frameworks over the past generation. From the mid-1980s up until 2000 it was victory after victory for the Out-of-Africa with total replacement model. The rise of mtDNA and Y chromosomal lineage studies seemed to buttress the idea of common descent from neo-Africans within the last 100-200,000 years for all human populations. There wasn’t much ...

December 28, 2010

The rise of the skulls!

Filed under: Anthropology,Human Evolution,Paleoanthropology — Razib Khan @ 1:55 pm

Neanderthal, La Chapelle-aux-Saints

Fossils matter. Fossils are evidence. That was Milford Wolpoff’s refrain in the 1987 NOVA documentary which heralded the long cresting of mitochondrial Eve and Out of Africa. Fossils remain highly relevant and important when it comes to deeper time phylogenetic relationships, but it does seem that they have only served to supplement the genetic data when it comes to recent human origins (e.g., to calibrate and fine-tune molecular clocks). The paleoanthropologist Tim White, whose own position on human origins stands in some contradiction to Milford Wolpoff’s, nevertheless felt the need to reiterate the relevance of fossils at a conference several years ago where most of the participants were geneticists (we received a preview of Ardi). Chris Stringer, who advocated for an Out of Africa model before Allan Wilson and his students roiled the academic waters, often seems to have been relegated to nothing more than an adjunct to the molecular biologists in the public mind despite his priority. I think we are at a turning point, and must acknowledge that recent human origins can no longer remain a one-horse buggy. Genetics itself, in the form of ancient DNA research as well as more powerful analytic techniques utilizing larger autosomal data sets, has overturned and challenged the old conventional wisdom gleaned from trusting inferences derived from the patterns of variation of extant populations.
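The "calibrate and fine-tune molecular clocks" role of fossils comes down to simple arithmetic: given a fossil-calibrated substitution rate, pairwise sequence divergence converts directly into a time back to common ancestry. A back-of-the-envelope sketch, where the rate and divergence values are illustrative placeholders rather than figures from any particular study:

```python
# Back-of-the-envelope molecular clock: time to the common ancestor of
# two lineages is T = d / (2 * mu), since both lineages accumulate
# substitutions independently after the split.

def coalescence_time(divergence_per_site, rate_per_site_per_year):
    """Years back to the common ancestor, assuming a strict clock."""
    return divergence_per_site / (2.0 * rate_per_site_per_year)

# Hypothetical inputs: 0.4% pairwise mtDNA divergence and a calibrated
# rate of 2e-8 substitutions per site per year.
t = coalescence_time(0.004, 2e-8)
print(f"estimated coalescence time: {t:,.0f} years")  # 100,000 years
```

The fragility of such estimates, from the rate calibration itself to the single-locus nature of mtDNA, is part of why the fossil record remains indispensable.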

Consider two books from the early 2000s, The Seven Daughters of Eve: The Science That Reveals Our Genetic Ancestry, and The Real Eve: Modern Man’s Journey Out of Africa. Both were ambitious works drawing bold lines between the patterns of history and genes, disproportionately maternally transmitted mtDNA (ergo, the references to Eve). Even at the time they were works of hubris: mtDNA is one locus, tracing one long uninterrupted line of foremothers, and it misses the total genome variance and may be subject to various biases. A decade on though we now have grounds to suspect that much of the story told in both works is false. Europeans may have a more complicated history than we could have imagined. One of the assumptions behind the second book, that most of today’s genetic variation crystallized during the Last Glacial Maximum ~20,000 years ago, also seems likely to be false.

The genetic data no longer cohere together in a plausible integrated whole. And that is of course the beauty of science, that it is subject to revision and revolution, that it eats away at its own foundations on occasion when those foundations are wanting. The very tools of modern genetics have undercut the confidence in genetics as a whole to answer broad expansive historical questions on its own. This is not a flaw in genetic science, it is the strength of science generally. Unlike some systems of thought science does not rest upon timeless creeds and formulas.

So where now? We need to do more than give lip service to a multidisciplinary perspective. We need to embrace it. This means a new relevance and importance to those who know and can interpret the fossil record. It also means more attention to anthropological and historical patterns, which may indicate the probable sample space of genetic outcomes. It will be harder and more uncertain work than what has come before, but the results will hopefully exhibit closer fidelity to the reality that was. False certainty is worse than an admission of ignorance.

Image Credit: Wikimedia Commons

October 26, 2010

‘dem bones tell strange tales

Filed under: Evolution,Genetics,Genomics,Human Evolution,Paleoanthropology — Razib Khan @ 12:16 am

There is a new paper in PNAS on remains from China which re-order and muddle our understanding of the emergence of anatomical and behavioral modernity in Eurasia. Human remains from Zhirendong, South China, and modern human emergence in East Asia:

The 2007 discovery of fragmentary human remains (two molars and an anterior mandible) at Zhirendong (Zhiren Cave) in South China provides insight in the processes involved in the establishment of modern humans in eastern Eurasia. The human remains are securely dated by U-series on overlying flowstones and a rich associated faunal sample to the initial Late Pleistocene, >100 kya. As such, they are the oldest modern human fossils in East Asia and predate by >60,000 y the oldest previously known modern human remains in the region. The Zhiren 3 mandible in particular presents derived modern human anterior symphyseal morphology, with a projecting tuber symphyseos, distinct mental fossae, modest lateral tubercles, and a vertical symphysis; it is separate from any known late archaic human mandible. However, it also exhibits a lingual symphyseal morphology and corpus robustness that place it close to later Pleistocene archaic humans. The age and morphology of the Zhiren Cave human remains support a modern human emergence scenario for East Asia involving dispersal with assimilation or populational continuity with gene flow. It also places the Late Pleistocene Asian emergence of modern humans in a pre-Upper Paleolithic context and raises issues concerning the long-term Late Pleistocene coexistence of late archaic and early modern humans across Eurasia.

I read the paper, and I really didn’t understand anything between the introduction and discussion. Mostly because it was a detailed exploration of anatomical details, and I’ve never taken an anatomy class. I basically rely on people like John Hawks to tell me what’s going on in that domain. He hasn’t blogged the paper (well, as of this writing), but he did give an assessment to National Geographic:

Still, the jaw and three molars were the only human remains retrieved from the Chinese cave, and the jaw is “within the range” of Neanderthal chins as well as those of modern humans, added paleoanthropologist John Hawks of the University of Wisconsin, Madison.

“If this holds up, we have to reevaluate” the human migration time line, he said.

“Basically, I think they’re right, [but] I want to see more evidence,” Hawks added. “I really, really hope that there can be some sort of genetic extraction from this [fossil].”

The issue of why this is relevant is covered well in the early portion of the paper:

…In eastern Eurasia, the dearth of diagnostic and well-dated fossil remains…has inhibited more than general statements for that region. Fully modern human morphology was established close to the Pacific rim by ∼40 kya, as is indicated by the fossils from Niah Cave in Sarawak…and especially Tianyuan Cave in northern China…The actual time of the transition has remained elusive, because the age of the latest known archaic humans in the region is substantially earlier…The eastern Eurasian age of the transition has been generally assumed to approximate that of western Eurasia (∼50–40 kya), although there have been claims supporting earlier dates for modern human presence in East Asia….

This scenario implies a long term (>100,000 y) restriction of early modern humans to portions of Africa with a brief ∼90 kya expansion into extreme southwestern Asia, followed by a relatively rapid expansion throughout Eurasia after ∼50 kya…The scenario also implies some form of adaptive threshold, roughly contemporaneous with the emergence of the Upper Paleolithic (sensu lato), and a marked behavioral difference between those expanding modern human populations and regional populations of late archaic humans (14).

It is in this context that three fragmentary human remains were discovered in 2007 at Zhirendong, South China…Because it is only well-dated diagnostic human remains that can document the timing and nature of human evolution and dispersal patterns (as opposed to archeological proxies for human biology or imprecise inferences from extant genetic diversity), the Zhirendong remains have the potential to shed light on these ongoing paleoanthropological issues.

OK, so the stylized orthodox model would be that anatomically modern humans arose in Africa ~200,000 years ago, and expanded out of Africa ~100-50,000 years ago. Full behavioral modernity emerged a bit later. In the broad outlines I think we can still go with this, but we have reached a level of fine-grained understanding of the evolutionary history of the human past that we need to consider adding detail to the margins of the story. The Denisova hominin, and perhaps H. floresiensis, are a good clue that the human family tree really was bushy, and that the past was filled with players unknown to us. The likelihood of Neandertal admixture also suggests that the other branches of the bush can’t be ignored, or considered without consequence for modern humans. There may be traces of other lineages in other human populations as well; there have long been claims based on inferences from some genetic data of modern populations, but the ability to compare to the Neandertal sequence gave those results from last spring particular credibility. But if the Neandertal admixture results become part of the consensus we should recalculate our probabilities of the other inferences.

So how does it change things? One of the authors of the PNAS paper, Erik Trinkaus, has long made claims of hybridization from the fossil record, and this work falls in line with that tradition. The key here is that the fossils seem to exhibit derived features, not ancestral ones. Derived features imply common ancestry of the populations which share the derived traits. If so, Trinkaus and company seem to be pointing to Alan Templeton’s “Out of Africa again and again.” On the other hand, I can’t help but think of Luke Jostins’ plot of hominin cranial capacities as a function of time: separate lineages all seemed to be going on the same general path in terms of direction. I don’t make the claim here that H. sapiens sapiens was inevitable, but perhaps a common suite of traits which we associate with advanced hominin lineages, in particular the branch of which we are the terminus, were being selected for across the whole clade. In other words, perhaps anatomical modernity exhibits some element of convergent evolution, while behavioral modernity is the true hallmark of H. sapiens sapiens. It sounds crazy, and I don’t really believe it necessarily, but we live in crazy times. We had a neat and tidy story for 20 years between 1985 and 2005, but all good things have to end.

October 19, 2010

Did cavemen eat bread?


Food is a fraught topic. In How Pleasure Works Paul Bloom alludes to the thesis that while conservatives fixate on sexual purity, liberals fixate on culinary purity. For example, is it organic? What is the sourcing? Is it “authentic”? Obviously one can take issue with this characterization, especially its general class inflection (large swaths of the population buy what they can afford). Additionally, I doubt Hindus, Muslims and Jews who take a deep interest in the provenance, preparation, and substance of their food are liberals. What Bloom is noticing is actually a general human preoccupation which somehow has taken on a strange political valence in the United States. Somehow being conservative in this country has become aligned with a satisfaction with the mass-produced goods of the agricultural-industrial complex.* Some conservatives such as Rod Dreher have pushed back against this connotation, lengthily in his book Crunchy Cons.

Stepping away from politics, we are broadly a diet-obsessed culture. Apparently Christina Hendricks is going on a diet, her aim being to lose 30 pounds. Diet fads come and go. The Atkins approach has faded of late, with the Paleolithic diet coming into fashion. A totally separate market segment, that of raw food, remains robustly popular. This was obvious when Richard Wrangham came out with Catching Fire: How Cooking Made Us Human; raw food enthusiasts would call in to talk shows where he was a guest, sometimes irritated that Wrangham was claiming that cooking was central to the emergence of modern humanity. His contention that raw food practitioners are healthy precisely because they extract less from their nutritional intake, owing to the relatively coarse character of what they consume, was clearly discomfiting to many of them, because it is at variance with some of the rationale for their diet: they avoid cooking in part because they believe that it removes a great deal of nutritive value.

I was thinking about this while reading What is Global History? Offhand the author mentions bread-making as early as 20,000 years ago in the process of asserting that many of the preconditions for an agricultural mode of production were already in existence before the end of the last Ice Age. I was surprised by this fact, having never encountered it before. Unfortunately there wasn’t a footnote which I could follow up on, so I thought no more of it. Imagine my curiosity when I stumble upon this paper in PNAS, Thirty thousand-year-old evidence of plant food processing:

European Paleolithic subsistence is assumed to have been largely based on animal protein and fat, whereas evidence for plant consumption is rare. We present evidence of starch grains from various wild plants on the surfaces of grinding tools at the sites of Bilancino II (Italy), Kostenki 16–Uglyanka (Russia), and Pavlov VI (Czech Republic). The samples originate from a variety of geographical and environmental contexts, ranging from northeastern Europe to the central Mediterranean, and dated to the Mid-Upper Paleolithic (Gravettian and Gorodtsovian). The three sites suggest that vegetal food processing, and possibly the production of flour, was a common practice, widespread across Europe from at least ~30,000 y ago. It is likely that high energy content plant foods were available and were used as components of the food economy of these mobile hunter–gatherers.

One of the researchers on the team gave a good quote to Reuters:

“It’s like a flatbread, like a pancake with just water and flour,” said Laura Longo, a researcher on the team, from the Italian Institute of Prehistory and Early History.

“You make a kind of pita and cook it on the hot stone,” she said, describing how the team replicated the cooking process. The end product was “crispy like a cracker but not very tasty,” she added.

The contents of the paper are somewhat dry and opaque to me. The crux of the matter is that there are obviously important reasons why plant materials which may have been present in prehistoric camps aren’t preserved, so there has long been a bias in this area. It seems that the authors found a primitive system of pestle grinders, as well as starch grains. Below are the important figures which show their results:

The Reuters piece takes a shot at the Paleolithic diet:

The researchers said their findings throw humankind’s first known use of flour back some 10,000 years, the previously oldest evidence having been found in Israel on 20,000-year-old grinding stones.

The findings may also upset fans of the so-called Paleolithic diet, which follows earlier research that assumes early humans ate a meat-centered diet.

I don’t know how the Paleo enthusiasts will react to this. I’m actually a guarded fan of Gary Taubes’s 2002 article in The New York Times Magazine, What if It’s All Been a Big Fat Lie? I believe that a strong bias toward refined carbohydrates in your diet is bad for you. I generally don’t go as far as the Paleo enthusiasts in my own diet, but I have many friends who believe in the diet, and it works for them. That being said, some of the Paleo people have an evangelistic aspect that probably is the source of shots like the ones above in the article. I am 5′8″ and in the 140–150 pound range, usually 140–145. I’m not fat, and I’m not Paleo. My blood sugar levels are good. It can be done. Just because you were fat or unfit and a particular diet works for you doesn’t mean everyone else has to follow the exact same prescription to a T. Human variation matters. South Asians have much higher propensities toward type 2 diabetes than other groups. It does not follow that everyone must adopt the nutritional and lifestyle guidelines suited to South Asians in order to face the same odds of developing type 2 diabetes. How guilty you should feel about dessert is a function of the biological cards you bring to the table.
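To make the odds-ratio language concrete, here is the arithmetic with invented numbers; the prevalences below are hypothetical, not real epidemiological estimates for any group:

```python
# Odds ratio between two groups, from (hypothetical) disease prevalences.
# The odds of an outcome with probability p are p / (1 - p); the odds
# ratio is simply the ratio of the two groups' odds.

def odds(p):
    return p / (1.0 - p)

def odds_ratio(p_a, p_b):
    return odds(p_a) / odds(p_b)

# Hypothetical: 12% prevalence in group A vs 5% in group B.
print(round(odds_ratio(0.12, 0.05), 2))  # 2.59
```

The point in the text is the converse of this calculation: when baseline odds differ between groups, the same lifestyle yields different absolute risks, so dietary prescriptions need not be universal.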

The importance of human variation, genetically and culturally, is relevant to this paper. What exactly does the likely presence of flour in three sites in Europe ~30,000 years ago tell us? Granting the validity of these results we can reject a strong form of the model of a Paleolithic diet which excluded processed starches. Does this now mean that Paleolithic humans were toasting pitas constantly around the fire? I don’t necessarily think so. We don’t know how pervasive this practice was. Human societies vary. Just because they were ancient, and “primitive,” does not mean that all Paleolithic populations were the same. Second, it seems plausible that Paleolithic man was a generalist with a more diversified diet, all things equal, than his peasant successors. It may be that during the Paleolithic era there were no staples in the way we’d understand it today; rather, they subsisted on what was available at any given time. Perhaps these ancient pitas were reserve sources of sustenance which preserved well when other gathering and hunting had little or no yield. The difference between Paleo-man and the peasant may then be thought of as the latter making what was once an emergency ration, a good source of calories in a pinch, into the staff of life.

A more general moral may be that we need to rethink our model of a Neolithic Revolution. It may have been a Neolithic Evolution. After the last Ice Age there were at least two independent, and likely more than two, domestication events and shifts toward an agricultural lifestyle. In The Long Summer and After the Ice Brian Fagan and Steven Mithen both imply that the emergence of behavioral modernity during the last Ice Age set the stage for the inevitable shift toward agriculture with the climatic change. So was it (warm weather) + (modern cognitive capacity) → agriculture? Perhaps. But almost certainly humans were developing skills and competencies over time up until the end of the Ice Age, and with the warmer conditions the switch toward more proactive and intensive cultivation of grains may have been a gradual process of escalation. As population densities began to rise it seems a model could be posited whereby a positive feedback loop was generated; non-agricultural sidelines became less and less effective as larger populations supported by semi-agricultural lifestyles made greater demands on the local ecology. This may have meant that the shift toward obligate agriculture was inevitable once it became the only viable option. Once the ratchet moved forward there was no going back, and humans had entered a new epoch.
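The ratchet described above can be caricatured in a few lines of code. This is a toy model with invented parameters, meant only to show how slowly accumulating farming know-how plus density-dependent foraging returns can tip a population into obligate agriculture; nothing here is calibrated to real Neolithic data:

```python
# Toy forager-to-farmer ratchet. W is the fixed wild-food yield the local
# ecology supports; farming know-how improves slowly each generation.
# The farming share f drifts toward whichever mode currently yields more.

W = 1000.0             # total wild-food yield (foraging carrying capacity)
farm_yield = 0.5       # per-capita farming yield; grows with know-how
N, f = 800.0, 0.05     # population and fraction of subsistence from farming

for generation in range(200):
    forage_pc = W / N                              # per-capita foraging return
    f = min(1.0, max(0.0, f + 0.05 * (farm_yield - forage_pc)))
    food_pc = f * farm_yield + (1 - f) * forage_pc
    N *= 1 + 0.02 * (food_pc - 1)                  # grow when food per head > 1
    farm_yield *= 1.01                             # skills accumulate over time

print(f"farming fraction {f:.2f}, population {N:.0f}")
```

Run it and foraging dominates for dozens of generations; once farming out-yields foraging, the population grows, per-head foraging returns collapse, and the farming share locks in, with no way back short of a fall in farm yields.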

Today we live in a consumer age of plenitude. Or at least you live in a consumer age of plenitude if you’re reading this weblog on a computer. We have great choice in goods and services, and can have a wide range of experiences. The past was truly a strange and exotic place; as evidenced by the reality that pre-modern folk took high infant mortality for granted as an unfortunate fact of life, while we today see the death of an infant as a tragedy of the highest caliber. But we mustn’t oversimplify the past. In one episode of the 1989 television series Alien Nation the human protagonist is learning about the religious customs of his partner, an alien. Offhand he mentions his growing awareness to another acquaintance who is also an alien. He then asks if she will be celebrating a holiday he has just learned of, and she responds that she does not believe in that religion. The detective expresses great surprise that the aliens have different religions, to which she quips, “Don’t you?” The point is that the aliens are perceived as an amorphous mass, profoundly different from humanity, and their own internal distinctions are elided in the minds of humankind. And so it is, I believe, with human societies whose modes of production are fundamentally different from our own. “Paleolithic humanity” becomes a type, all difference and variation removed from our conception. “Hunter-gatherer” is distilled down to an image of N!xau from The Gods Must Be Crazy. To get a better handle on how the world is and how it was we need to be careful of this cognitive bias.

Citation: Anna Revedin, Biancamaria Aranguren, Roberto Becattini, Laura Longo, Emanuele Marconi, Marta Mariotti Lippi, Natalia Skakun, Andrey Sinitsyn, Elena Spiridonova, & Jiří Svoboda (2010). Thirty thousand-year-old evidence of plant food processing. PNAS. DOI: 10.1073/pnas.1006993107

* Just to be clear, I am not personally an unalloyed enthusiast for “natural” methods, whether it be organic or small-scale farming. Rather, I am pointing to the fact that agricultural subsidies have distorted and reshaped the nature of food production, distribution, and consumption. I see nothing fundamentally conservative about being sanguine about the power and influence of the agricultural-industrial lobby, and the corporations which exist in symbiosis with government largesse.

Image Credit: Wikimedia Commons, Serious Eats

Note: Since this post mentions diet I may get some crazy unhinged comments because I know that some people take their diets very seriously, and react harshly to deviationists from the Truth Path. If you have commenting privileges and lose control and post something inappropriate, I will delete it.

October 17, 2010

Völkerwanderung back with a vengeance


The German magazine Der Spiegel has a rather thick new article out reviewing the latest research which is starting to reintroduce the concept of mass folk wanderings into archaeology. The title is How Middle Eastern Milk Drinkers Conquered Europe. In the story you get a good sense of the recent revision of the null model once dominant within archaeology: that the motive forces of history manifested through the flow of pots, not people. This viewpoint came to ascendancy after World War II, and succeeded an older method of interpretation which presumed a tight correlation between race and culture. It repudiated the idea that the flow and change of pottery styles and extant patterns of linguistic dialects may have been markers for the waxing and waning of peoples.

Obviously a pots-not-people model had some major exceptions even during its heyday. The demographic explosion of European peoples after 1492, and especially the Anglo peoples after 1700, occurred within the light of history. Even if it hadn’t, it would be ludicrous on the face of it to assert that the modern American population derived from the indigenous populations, and that it had simply adopted the language, religion and folkways of the British conquerors of North America. But outside the presumed aberration of the European imperialist and colonial venture of the modern era the details on the ground were obscure enough that a model could be imposed from without.

No longer. In non-European societies such as China with extensive records it is clear that demographic increase and colonization were driving forces of the expansion of a given cultural domain. A neglect of this reality could only occur via ignorance of the primary documents, which was plausible only if one did not know Chinese. As in the case of the Spanish conquest of the New World, the demographic wave was not total; biological and cultural amalgamation with the native substrate south of the Yangtze did occur to produce a new synthesis. But the revision does not occur just through space, but through time as well. The methods of genetics, whereby samples from ancient burials may be retrieved and compared to modern populations, have allowed us to reject the post-World War II assumption that the Etruscans were an indigenous Italian culture. Due to the lack of copious records a theoretical presupposition was able to interject itself into the data. When the story about Etruscan genetic relationships to Near Eastern groups broke in early 2007 I actually proceeded to skim the latest archaeological monographs on this group, and most of them had erudite expositions on exactly how the myriad distinctive aspects of Etruscan culture which suggested an exogenous origin were in fact derived from the antecedent Bronze Age societies of Tuscany.

Ancient DNA extraction is now allowing scholars to map the change in frequencies of genetic markers of archaeologically known groups as a function of both space and time more broadly. I think one can safely see that there are more perturbations, fluctuations, and turnovers, than any pots-not-people model could predict. On the specific issue of lactase persistence it is almost certainly a genetic novelty in Eurasia which arose over the last 10,000 years in a co-evolutionary fashion with animal husbandry. The genomes of modern Europeans suggest this, but we have confirmation from ancient DNA extraction as well. A rise in frequency of a particular allele does not entail a replacement of one population with another; lactase persistence today can exhibit a great deal of variance as a function of geography and ecology because its frequency no doubt responds to local selective pressures. But, in concert with other DNA data (mostly maternal lineages) as well as a fresh look at evidence of cultural discontinuity and rapid pulses of colonization in late Paleolithic and early Neolithic Europe, one must be open to the possibility that the spread of animal husbandry, copious raw milk consumption, and an aggressive and fecund population, were all concurrent processes which were tightly interlocked in some causal sense.
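The rise in frequency of the lactase persistence allele can be sketched with the standard one-locus selection recursion, treating persistence as dominant. The selection coefficient and starting frequency below are illustrative choices, not empirical estimates for the LCT region:

```python
# One-locus selection recursion for a dominant advantageous allele.
# Genotype fitnesses: AA and Aa get 1+s (persistence dominant), aa gets 1.

def next_freq(p, s):
    q = 1.0 - p
    w_bar = (p * p + 2 * p * q) * (1 + s) + q * q   # population mean fitness
    return (p * p + p * q) * (1 + s) / w_bar        # allele A's share next gen

p = 0.01                  # rare new variant
for _ in range(300):      # ~300 generations, roughly 7,500 years at 25 y/gen
    p = next_freq(p, 0.05)
print(f"frequency after 300 generations: {p:.2f}")
```

Even a modest 5% heterozygote advantage takes a variant from 1% to a large majority of the population within a Holocene-scale window, which is why lactase persistence is a favorite example of strong recent selection.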

An aspect of this story which I am fuzzy and weak on is the archaeology. You likely know nearly as much about the Linear Pottery Culture as I do. It seems though that this cultural complex brought agriculture deep into the heart of Central Europe ~7,000 years ago, and there are clear signs that its origin was to the southeast, in the Eastern Mediterranean region. L. L. Cavalli-Sforza’s ‘demic diffusion’ model, which argued for the expansion of Neolithic farmers from the Middle East into Paleolithic Europe, seemed to suggest that it occurred through a ‘wave of advance’ impelled by endogenous population growth and gradual migration. The model as reported by Der Spiegel seems at some variance with this. Instead of a gradual advance it seems that there may have been periodic pulses and explosions of demographic advance. Using the historical examples we have, this should not be particularly surprising. Overlain atop the reality of an inexorable push across the ‘frontier’ in both North America and China by the colonizing peoples I alluded to earlier, it is important to remember that there were periodic punctuations of gradualism by bursts of mass colonization, displacement, and relocation. Consider the migration out of overpopulated New England to the Great Lakes and Upper Midwest in the early 19th century, or the retreat to the south by the Han Chinese after the collapse of their first dynasty and the conquest of the north by barbarians. Both of these are examples of the explosive process in demographics and migration which can revolutionize the cultural landscape within a generation or two. From the data that the archaeologists have collected this seems to have occurred with the expansion of agricultural society in Central Europe as well, as the Linear Pottery Culture and its descendants proceeded in fits & starts.
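Cavalli-Sforza’s wave of advance is usually formalized with Fisher’s reaction-diffusion result: the front moves at v = 2√(rD), where r is the intrinsic population growth rate and D the dispersal constant. A sketch with made-up but conventional parameter values (the 2% growth rate and 300 km² dispersal variance are my illustrative assumptions, not numbers from the model’s authors):

```python
import math

def wave_speed(r, d):
    """Fisher wave-of-advance front speed: v = 2 * sqrt(r * D)."""
    return 2.0 * math.sqrt(r * d)

# Illustrative guesses: 2% annual growth; dispersal variance of
# ~300 km^2 per 25-year generation, i.e. D = 12 km^2 per year.
v = wave_speed(0.02, 300.0 / 25.0)
print(f"{v:.2f} km/yr")
```

That works out to roughly 1 km per year, the slow steady creep the classic model predicts, which is exactly what punctuated pulses of colonization would violate.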

Unfortunately, as this is the domain of prehistory we won’t ever know in concrete terms exactly what occurred. In the Der Spiegel piece they report that the agriculturalists had a 500 year pause. Why? If demic diffusion was the primary dynamic through which they expanded there should be no pause. But we can think of a host of scenarios. Perhaps the Middle Eastern cultural toolkit had reached its natural outer boundary, and it was here that the tide turned to the indigenous Paleolithic societies of Europe, which maintained an advantage in the north because the southerners lacked adaptability to new conditions. Humans can be stubbornly conservative in their ways. It is famously asserted that the Norse of Greenland did not adapt to a more inclement regime, and so went extinct (or possibly evacuated to Iceland). The adoption of potatoes and other productive and useful crops among European peasants was retarded by their instinctive conservatism. We need not imagine a scenario where Paleolithic hunters and gatherers would naturally wish to take up the hoe. Nor is it plausible that the agriculturalists would wish to refashion their tried & tested traditions so as to push the outer boundary of their limes.

But all things must change. Something happened, and the agriculturalists shifted the modus vivendi, and the hunter-gatherers gave way. There are detailed historical processes which can give us insight as to how such long-held boundaries could rapidly collapse. Europeans had circumnavigated Africa by 1500, and had had factories and trading posts around the continent’s fringe for over 400 years by the late 19th century. But deep into the 1800s the European presence in Africa was marginal. By 1914 the continent was divided into European zones of control. What happened in the space of a few decades? Quinine and machine guns. The biological barrier to Europeans fell away, and the military superiority was amplified by orders of magnitude. What could have occurred in an analogous fashion in Central Europe? The combination of a mutation for lactase persistence and animal husbandry may have resulted in rapid population growth, leading to densities which precipitated the outbreak of an epidemic. A reduction in population may have had much greater impact on the less numerous and resistant hunter-gatherers. Additionally the economic changes wrought by animal husbandry may have allowed for a scaling up of the organization of war. The Mongol Empire exploded onto the scene in mere decades to sweep across most of Eurasia. Why couldn’t something similar plausibly have occurred in Central Europe on a much smaller scale 7,000 years ago?

Ultimately we’ll never know the details. But in constructing our plausible scenarios for prehistory I suspect that we moderns have a bias toward viewing pre-literate societies as usually small-scale, at best simple chiefdoms. I believe this is a false model, and that there was a non-trivial level of scalability possible among pre-literate societies. Going back to the Mongol example, I see no reason why the initial existence of their polity necessitated literacy, though it may have been essential for its administration and perpetuation. Cultural forms likely marched with confederacies and were driven forward by warlords. This would easily explain the punctuated pattern of the spread of agriculture, as the rise and fall of states has a somewhat spasmodic and periodic character.

Finally, I want to emphasize that the Der Spiegel piece verges on a maximalist position which I am not comfortable with. There is much we don’t know, and I am in no hurry to replace one tired and dogmatic orthodoxy with another. Because the article was translated from German into English, translation may be responsible for the artlessness of some of the assertions. But phrases such as “There was no interbreeding between the intruders and the original population” from a German magazine really make me think they’re suggesting that there were Stone Age Nuremberg Laws. Ethnic separation and differentiation was a reality among many ancient peoples, but so was intermarriage and assimilation. I am aware of the starkness of some of the DNA analyses, which suggest disjoint frequencies across the two populations, but the results are far too spotty at this point to make definitive assertions.

Note: The accompanying map is worth perusing.

(referral credit, Steve Sailer)

Image Credit: Wikimedia Commons

September 9, 2010

The naked years

When I talk about sexual selection I usually make sure to have an accompanying visual of a peacock to go with the post. But really I could have used a dandy as an illustration, or perhaps in our day & age “The Situation”. Unlike the peacock much of what passes for human “plumage” is not a result of native biological processes, but rather refashioning the materials of other organisms or synthetics into a sort of second skin (or skins with all our layers). In other words, clothes. These artificialities are so essential to our own identity as individuals that they often mark out our tribal affiliations, in pre-modern and post-modern contexts. Whole industries exist to cater to both our utilitarian needs and aesthetic sensibilities in regards to how we dress ourselves. The definition of a cyborg usually connotes a synthesis of the biological with the electronic. Perhaps that is because our artificial extensions in the form of clothes have so seamlessly merged with our self-images that it would seem ludicrous to perceive ourselves as merged entities. If you encountered many of your acquaintances or friends naked not only would embarrassment ensue, but I suspect one might initially not recognize them. A naked physique without the distinctive aspects of clothing one associates with someone strips away individual identity.

But clothing has not been the eternal condition of man; recall that Eve met the fig leaf after an unfortunate sequence of events. In all likelihood our common ancestor with bonobos and chimpanzees was predominantly hirsute, as are most mammals. A mammal without fur is like a fish without scales or a bird without feathers. Not impossible, but atypical. But at some point we did lose our fur. When? A 2004 paper offered up an intriguing possibility: that ~1.2 million years ago our lineage became hairless. How did they come to this inference? The authors noted that the consensus sequence of the MC1R locus among dark-skinned peoples coalesced back to this period (i.e., the last common ancestor of the MC1R genes which exhibit the ancestral type, which confers dark skin). Once our ancestors lost their fur they would have been exposed to solar radiation, and hence the necessity of dark skin. When did this naked dark ape cover his shame? (Yes, I censored one of the images above to make this post “work safe” above the fold.) A new paper in Molecular Biology & Evolution offers up a precise date using another proxy, Origin of clothing lice indicates early clothing use by anatomically modern humans in Africa:

Clothing use is an important modern behavior that contributed to the successful expansion of humans into higher latitudes and cold climates. Previous research suggests that clothing use originated anywhere between 40,000 and 3 million years ago, though there is little direct archaeological, fossil, or genetic evidence to support more specific estimates. Since clothing lice evolved from head louse ancestors once humans adopted clothing, dating the emergence of clothing lice may provide more specific estimates of the origin of clothing use. Here, we use a Bayesian coalescent modeling approach to estimate that clothing lice diverged from head louse ancestors at least by 83,000 and possibly as early as 170,000 years ago. Our analysis suggests that the use of clothing likely originated with anatomically modern humans in Africa and reinforces a broad trend of modern human developments in Africa during the Middle to Late Pleistocene.

The evolution-by-lice method has some pretty simple logic behind it. Our parasitic lice have followed us across our journeys. In this case there are two forms being spotlighted: one of the head, and another of the clothes. The genesis of the latter can give us a clue as to the period when our species became obligately clothed. One presumes that the head lice were the ancestral form, and that the clothing lice derived from them (the head lice took refuge there after we lost our fur). Standard phylogenetic methods which are applied across a range of taxa can then be turned to these two lice populations. When the clothing lice diverged from the ancestral line of head lice it stands to reason that humans must have been wearing clothing; a species which emerges to fill a niche must have a niche to fill. Eventually the authors arrive at a figure of ~170,000 years before present.
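The arithmetic behind such dates is, at its crudest, a molecular clock: divergence time T ≈ d / (2μ), where d is the per-site sequence difference between the two lineages and μ the substitution rate per site per year (the factor of 2 because both lineages accumulate changes independently). The numbers below are illustrative stand-ins of mine, not the paper’s Bayesian estimates:

```python
def divergence_time(d, mu):
    """Crude molecular-clock divergence time: T = d / (2 * mu)."""
    return d / (2.0 * mu)

# e.g. 0.4% divergence at a rate of 1.2e-8 substitutions/site/year
t = divergence_time(0.004, 1.2e-8)
print(f"~{t:,.0f} years before present")
```

With these hand-picked inputs the answer lands in the neighborhood of ~170,000 years; the paper’s coalescent machinery layers population structure, migration, and uncertainty on top of this skeleton.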

To get from here to there, they looked at four loci: three nuclear genes, 18S rRNA, nuclear elongation factor 1-α (EF-1α), and RNA polymerase II (RPII), and the mitochondrial gene cytochrome c oxidase subunit I (COI). In the age of SNP-chips with hundreds of thousands of markers this may seem paltry, but the questions are far coarser in this case. The authors weren’t looking to generate PCA plots showing the relationship of various disparate head lice lineages or anything so fine-grained. Rather, they were focusing on the speciation of the two lineages from a common ancestor, an archaic head lice population. This is a picture which can be painted in broad strokes, concretely, four genetic strokes.

They found that the ancestral lice went through some sort of bottleneck, likely mirroring their hosts. That was followed by a demographic expansion. Again, mirroring the hosts. Surprisingly the gene flow seems to have gone predominantly from the clothing lice to the head lice! Previous studies using a set of different markers, specifically microsatellites, did not find that the two populations evidenced gene flow after their separation. Additionally, as noted the direction of gene flow is somewhat surprising as the clothing lice population clearly derive from the head lice. This peculiarity makes me think of Sewall Wright’s shifting balance model, whereby population structure and historical events are critical in shaping contingent arcs of evolution.

Figure 1 illustrates their model in the context of prior data:

Using the genetic data and a Bayesian coalescent model with assumptions about migration, population structure, and mutation rates, they generated a distribution of the probable time when head and clothing lice went their (generally) separate ways. The gray line follows the arc of the distribution, with ~83,000 years before present as the mode and ~170,000 years before present as the median. Such a probability density distribution seems to imply that the clothing lice lineages which we have today emerged before the expansion of modern humans out of Africa. As anatomically modern humans emerged ~200,000 years ago in Africa, there is a strong probability that clothing in Africa dates back to the archaic lineages.
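The mode-versus-median gap is just what a right-skewed posterior looks like: most of the probability mass piles up at younger dates, with a long tail toward older ones. A lognormal stand-in (parameters hand-tuned for illustration; this is not the paper’s actual posterior) shows how a mode near 83,000 and a median near 170,000 can coexist in one distribution:

```python
import math
import random
import statistics

random.seed(42)

# Lognormal with median 170,000; sigma chosen so the mode lands near 83,000.
mu, sigma = math.log(170_000), 0.847
draws = [random.lognormvariate(mu, sigma) for _ in range(100_000)]

mode = math.exp(mu - sigma ** 2)   # analytic lognormal mode
median = statistics.median(draws)
print(f"mode ~{mode:,.0f}, median ~{median:,.0f}")
```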

If the paper cited above in regards to MC1R coalescence is valid, these results indicate a period of “naked years” between the loss of fur by H. erectus (I believe H. ergaster in Africa) and the adoption of clothing by modern humans or some archaic group. The authors correctly note that they cannot rule out that other human lineages, such as Neandertals, wore clothing and had their own lice lineages. Despite the evidence of Neandertal admixture, it seems that the lice results are in alignment with the mtDNA, and imply total replacement. Just as Gallo-Roman nobles began donning Frankish trousers, perhaps the Neandertals assimilated into modern human bands tossed aside their barbaric capes and took up the clothes of civilized folk with vaulted arches.

Despite their claims of more sophisticated methodology we probably should be a touch cautious about these results. Some of the findings are weird (the gene flow from clothing to head lice) and conflict with earlier work (finding gene flow in the first place). The parallels between lice & men in terms of evolutionary history are both striking and suggestive, but lice are lice, and they may have their own wily ways. And let’s not forget the pubic lice, which tell a different set of stories.

Citation: Melissa A. Toups, Andrew Kitchen, Jessica E. Light, & David L. Reed (2010). Origin of clothing lice indicates early clothing use by anatomically modern humans in Africa. Mol. Biol. Evol. doi:10.1093/molbev/msq234

Acknowledgements: Thanks to Dienekes for the paper pointer.

Image Credit: Wikimedia – Australopithecus & Kemal Ataturk, Anthropologyinfo.com – H. erectus

August 31, 2010

The New World in three easy steps

Filed under: Anthropology,Archaeology,Human Expansion,New World,Paleoanthropology — Razib Khan @ 9:28 am

One aspect of human demographic expansions seems to be the fact that we often model them as a constant diffusion process, when in reality there were likely pulses (economic historians can conceive of this as the periodic gaps between land and labor factor inputs). I don’t know much about the human movements prior to H. sapiens sapiens, and from what I can gather the fossil remains are too sparse to be too wedded to a specific model, but it seems clear that anatomically modern human expansion occurred through a series of rapid outward sweeps which would periodically reach a “natural barrier.” Modern humans reached the Solomon Islands ~30,000 years ago, after which there was stasis for ~25,000 years. Only with the Austronesian expansion did humanity push past the Solomons. And this was no baby-step, ultimately the Austronesians went as far as the Hawaiian islands and Easter Island.

The New World is similar. The initial migration out of Africa by modern humans resulted in the range expansion of the human lineage into a region which had been untouched by earlier hominins, Australasia. But after that point tens of thousands of years passed before our species pushed into virgin territory, in this case North America. The when and the how of this though are still up for debate. A new paper in PLoS ONE attempts to construct a plausible scenario by taking archaeological data points and inputting them into a diffusion model. Archaeological Support for the Three-Stage Expansion of Modern Humans across Northeastern Eurasia and into the Americas:

We use diffusion models…to quantify these dynamics. Our results show the expansion originated in the Altai region of southern Siberia ~46 kBP, and from there expanded across northern Eurasia at an average velocity of 0.16 km per year. However, the movement of the colonizing wave was not continuous but underwent three distinct phases: 1) an initial expansion from 47-32k calBP; 2) a hiatus from ~32-16k calBP, and 3) a second expansion after the LGM ~16k calBP. These results provide archaeological support for the recently proposed three-stage model of the colonization of the Americas….Our results falsify the hypothesis of a pre-LGM terrestrial colonization of the Americas and we discuss the importance of these empirical results in the light of alternative models.
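A quick sanity check on those figures (the distance comparison is my own rough guess at the Altai-to-Beringia span, not a number from the paper):

```python
# ~0.16 km/yr sustained across the ~30,000 years between the start of the
# expansion (~46 kBP) and the post-LGM push (~16 kBP):
speed_km_per_yr = 0.16
years = 46_000 - 16_000
distance_km = speed_km_per_yr * years
print(distance_km)  # ~4,800 km, on the order of the Altai-to-Beringia span
```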

It’s an interesting paper because it seems to have been triggered in part by inferences made from the genetic data. I don’t know how confident archaeologists are about their radiometric dates, but I think some of the molecular clock results from the genetics of Amerindians need to be taken with a grain of salt (I don’t see many people repeating some of the really ancient coalescence dates for Amerindian Y lineages at this point).

These data seem to indicate that modern humans made it no further than previous hominin groups for several tens of thousands of years. But something happened within the last 20,000 years, and our species made the leap across Beringia. The bottleneck here is certainly not the Bering Strait, which was spanned by land much of the time in any case. Rather, our species didn’t have the biological or cultural capacity to survive in extremely frigid environments. I’ve read modern humans pushed the boundaries of their range in northern Europe further than Neandertals ever did, indicating our flexibility and plasticity. Since the human lineage had been resident in Eurasia for at least one million years that suggests to me that it was behavioral modernity that was key. In particular, how quickly our cultures evolve and shift. Though that flexibility itself may be a function of our biological competencies.

June 20, 2010

“Here be dragons”

Filed under: Folk,Human Evolution,Myth,Paleoanthropology — Razib Khan @ 8:21 am

I just stumbled onto two amusing articles, Ancient legends once walked among early humans?, and The discovery of material evidence of a distinct hominin lineage in Central Asia as recently as 30,000 years ago is no surprise. The second is a letter from a folklorist:

Sir, The discovery of material evidence of a distinct hominin lineage in Central Asia as recently as 30,000 years ago (report, Mar 25) does not come as a surprise to those who have looked at the historical and anecdotal evidence of “wild people” inhabiting the region. The evidence stretches from Herodotus to the present day. The Russian historian Boris Porshnev suggested that they are relict Neanderthals, although the lack of evidence of material culture suggests a type closer to Homo erectus.

Needless to say many are skeptical of folk memories persisting for 30,000 years, though a standard assumption in paleontology is that the earliest and last fossil find of any given species is going to underestimate their period of origin and overestimate the period of extinction. In other words the Denisova hominin lineage almost certainly persisted more recently than 41,000 years ago. But recently enough to spawn legends of Enkidu? I’m skeptical. Someone with a better grasp of the mutation rate in oral history can clarify, but it seems that tall tales would be so distorted over a few thousand years that the initial kernel of truth would quickly be obscured.
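The “mutation rate in oral history” point can be made concrete with a toy survival model: if a fraction f of a story’s original kernel survives each generational retelling, then after g generations only f^g remains. The fidelity values below are pure guesses, chosen to show how sensitive the outcome is:

```python
GEN_YEARS = 25
generations = 30_000 // GEN_YEARS  # ~1,200 retellings over 30,000 years

for f in (0.999, 0.995, 0.99):
    surviving = f ** generations
    print(f"fidelity {f}: {surviving:.2%} of the kernel survives")
```

Even at 99.9% per-generation fidelity only about 30% of the kernel survives 1,200 retellings; at 99% essentially nothing does, which is the intuition behind the skepticism.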

Here’s my model for why almost all cultures have tales of various semi-human groups: cross-cultural differences are stark enough that it isn’t too hard to dehumanize other populations. More specifically, I think the biggest gap is going to be between groups who practice different modes of production. Many of the “wild people” as perceived by agriculturalists were probably just marginalized hunter-gatherers who hadn’t taken up the ways of “humans.” Consider how many upper middle class white Americans perceive rural people from Appalachia, even in our enlightened age. There are even biological differences, as agricultural populations seem smaller and more gracile in comparison to hunter-gatherers (who consume more fibrous food stuffs, and probably have a more balanced nutritional intake). How hard is it to conceive of a small and malnourished agriculturalist being cowed by a more robust hunter-gatherer group upon first contact?*

Combine real cultural and biological differences with human imagination, and it seems that this is the most likely explanation for the universality of wild people and strange semi-human folk. It is in other words simply an aspect of evoked culture, nothing that needs special triggers in the form of other human lineages. The main exception I can think of would be Flores Hobbits, who may have persisted down to a very recent period.

* The immediate objection to this possibility is that hunter-gatherer groups tend to get sick very quickly with the approach of high density humanity, and have already been pushed to less productive land by the time they confront the agriculturalists on a daily basis. So they are less likely to be robust.

March 29, 2010

Thomas Malthus was right. Mostly

John Hawks has an excellent post rebutting some misinformation and confusion on the part of Colin Blakemore, an Oxford neurobiologist. Blakemore asserts that:

* There was a sharp spike in cranial capacity ~200,000 years ago, on the order of 30%

* And, that the large brain was not deleterious despite its large caloric footprint (25% of our calories service the brain) because the “environment of early humans was so clement and rich in resources”

Hawks refutes the first by simply reposting the chart above (x axis = years before present, y axis = cranial capacity). It’s rather straightforward. I don’t know the paleoanthropology in any great depth, but the gradual rise in hominin cranial capacity has always been a “mystery” waiting to be solved (see Grooming, Gossip, and the Evolution of Language and The Mating Mind: How Sexual Choice Shaped the Evolution of Human Nature). Blakemore may have new data, but as they say, “bring it.” Until then the consensus is what it is (the hominins with the greatest cranial capacities, for what it’s worth, were Neandertals, and even anatomically modern humans have tended toward smaller cranial capacities since the end of the last Ice Age, along with a general trend toward smaller size).

But the second issue is particularly confusing, as Blakemore should have taken an ecology course at some point in his education if he’s a biologist (though perhaps not). One of the problems that I often have with biologists is that they are exceedingly Malthusian in their thinking, and so have a difficult time internalizing the contemporary realities of post-Malthusian economics (see Knowledge and the Wealth of Nations: A Story of Economic Discovery). Innovation and economic growth combined with declining population growth have changed the game in fundamental ways. And yet the biological predisposition to think in Malthusian terms is correct for our species for almost the whole of its history.*

A “tropical paradise” is only a tropical paradise if you have a modicum of affluence, leisure, and modern medicine. Easter Island is to a great extent a reductio ad absurdum of pre-modern man, even when gifted with a clement regime. Easter Island’s weather is mild: the monthly low is 18 °C (65 °F) and the monthly high is 28 °C (82 °F), with annual rainfall of 1,118 mm (44 in). But constrained on an island the original Polynesians famously transformed it into a Malthusian case study. We literally breed up to the limits of growth, squeezing ourselves against the margins of subsistence.
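The “breeding up to the limits of growth” dynamic is the logistic model in miniature: rapid early growth that flattens against carrying capacity. A sketch with invented numbers (a founding population of 50, 2% annual growth, and a carrying capacity of 7,000 are all illustrative, not estimates for Easter Island):

```python
def logistic_step(n, r=0.02, k=7_000):
    """One year of logistic growth toward carrying capacity k."""
    return n + r * n * (1.0 - n / k)

n = 50.0  # hypothetical founding population
for _ in range(1_000):
    n = logistic_step(n)
print(round(n))  # the population ends up pinned at carrying capacity
```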

I can think of only one way in which Blakemore’s thesis that the environment of early humans was rich in resources might hold, at least on a per capita basis: the anatomically modern humans of Africa exhibited bourgeois values and had low time preference. In other words, their population was always kept below ecological carrying capacity through forethought and social planning, since there is no evidence for much technological innovation which would have resulted in economic growth to generate surplus. My main qualm with this thesis is that it seems to put the cart before the horse, since one presupposes that a robust modern cognitive capacity is usually necessary for this sort of behavior.

* Malthus’ biggest mistake was probably that he did not anticipate the demographic transition, whereby gains in economic growth were not absorbed by gains in population.

March 27, 2010

The Mysterious Other

Last week Nature published a paper which may have found a new ‘branch’ of the hominin evolutionary bush, one which may have been coexistent with modern humans and Neandertals. I recommend The Atavism, Carl Zimmer, and John Hawks on this story. Interesting times.

March 24, 2010

Others in Siberia?

Filed under: Human Evolution,Paleoanthropology — Razib @ 11:07 am

The complete mitochondrial DNA genome of an unknown hominin from southern Siberia:

With the exception of Neanderthals, from which DNA sequences of numerous individuals have now been determined…the number and genetic relationships of other hominin lineages are largely unknown. Here we report a complete mitochondrial (mt) DNA sequence retrieved from a bone excavated in 2008 in Denisova Cave in the Altai Mountains in southern Siberia. It represents a hitherto unknown type of hominin mtDNA that shares a common ancestor with anatomically modern human and Neanderthal mtDNAs about 1.0 million years ago. This indicates that it derives from a hominin migration out of Africa distinct from that of the ancestors of Neanderthals and of modern humans. The stratigraphy of the cave where the bone was found suggests that the Denisova hominin lived close in time and space with Neanderthals as well as with modern humans….

The tree gets bushier? Just see John Hawks and Carl Zimmer.
