In “Ten Billion” Stephen Emmott refers at least once to a dramatic increase in the pace at which species are becoming extinct, mentioning “a rate one thousand times faster than the normal evolutionary rate as we consume the planet’s resources”, and attributing it to human population growth (via habitat loss) and to climate change.
His message of impending catastrophe for life on Earth is consistent with much of what we routinely hear nowadays from scientific bodies and political entities such as UNEP, the United Nations Environment Programme. An announcement in 2012 from experts gathered at the University of Copenhagen to discuss the formation of a new UN Intergovernmental Platform on Biodiversity and Ecosystem Services (IPBES) is typical in that regard. Biodiversity, they tell us, is declining rapidly throughout the world and the pace of species extinction is “accelerating”.
“The world is losing species at a rate that is 100 to 1000 times faster than the natural extinction rate.”
“Now we are in the 6th mass extinction event, which is a result of a competition for resources between one species on the planet – humans – and all others. The process towards extinction is mainly caused by habitat degradation, whose effect on biodiversity is worsened by the ongoing human-induced climate change.”
If the extinction rate is accelerating in the way these experts say it is, then this is bad news indeed. However, trying to establish exactly how many and which species have gone extinct since the start of the 21st century, for example, can be a confusing exercise. A website called “The Sixth Extinction” lists seven animal species that have vanished since 2000: four mammals, one fish, one mollusc – and one bird. Birdlife International, on the other hand, lists three species of birds going extinct during this period (Spix’s Macaw, the Hawaiian Crow and the Po’ouli). The authority in these matters is the IUCN (International Union for Conservation of Nature), which maintains the Red List of endangered species; in 2004 it published a Global Species Assessment listing 10 animals and 5 plants recorded as having become extinct between 1984 and 2004, and a further 5 animals and 7 plants that became extinct in the wild during that period (it lists three birds as extinct – the Kauai O’o, the Kama’o and the Atitlan Grebe – and three others as extinct in the wild, including the Hawaiian Crow).
The IUCN itself admits that it is often difficult to state whether a species is extinct or not: “Even where assessments have been conducted it can take years or decades to prove that a species is truly Extinct. The basic paradox of “documenting” extinctions is that absence of evidence is not necessarily evidence of absence (Stine and Wagner in press).” So arriving at hard and fast numbers is not an easy task at all – maybe even an impossible one. Even so, given that known animal extinctions since 2000 are probably in single figures, it is difficult for the layperson not to wonder whether there should have been many more of them during this period, if extinctions have indeed accelerated to a rate up to 1,000 times faster than before.
One way that biologists measure the rate of extinction from habitat loss is by reversing something called the species-area relationship or species-area curve, first proposed by Arrhenius in 1920. It can be expressed as a mathematical formula, S = cA^z, where S is the number of species, A is the habitat area, c is a constant and z is an exponent (typically somewhere around 0.25 to 0.35). The formula describes the number of species we can expect to encounter in a given area; a straightforward way to envisage it is to imagine that a tenfold increase in habitat area will result in about a doubling of the number of species, and thus when a habitat shrinks to one tenth of its original area its plants and animals will decline by roughly a half.
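As a rough numerical sketch of this reversed calculation (the exponent z = 0.301 here is an illustrative value, chosen because it reproduces the “tenfold area, double the species” rule of thumb; real studies fit z from survey data):

```python
# Species-area relationship: S = c * A**z.
# Reversed, it predicts the fraction of species surviving when a
# habitat shrinks; the constant c cancels out of the ratio.
# z = 0.301 (log10 of 2) is an illustrative exponent giving the
# "tenfold area ~ doubling of species" rule of thumb.

def surviving_fraction(area_fraction, z=0.301):
    """Fraction of species expected to survive when habitat shrinks
    to area_fraction of its original size: S_new / S_old = f**z."""
    return area_fraction ** z

# Habitat reduced to one tenth of its original area:
print(round(surviving_fraction(0.1), 2))  # -> 0.5, i.e. about half survive
```

Note that the prediction depends only on the *ratio* of areas, not on the absolute size of the habitat.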
Actually this is one of three ways of measuring extinctions listed by Harvard biologist E.O. Wilson in his 2002 book “The Future of Life”. In addition to the reversed species-area relationship (which he describes as the most commonly employed measure), Wilson describes a second method, which involves tracking the status of individual species through the IUCN’s Red List as they descend from “vulnerable” through “endangered” and “critically endangered” to “extinct”. A third method is to apply something called Population Viability Analysis to the threatened species on the Red List – in effect, to work out each species’ probability of survival.
Wilson states that although these are independent measures, they all nevertheless arrive at a similar rate of extinction – one thousand to ten thousand extinctions per million species per year – and that they are “persuasively consistent”.
Wilson also describes the rate of extinction that occurred in Edenic times (the period from 545 million years ago up to 50,000–10,000 years before the present day) as averaging one extinction per million species per year. This is actually a very imprecise figure, as the rates for different kinds of creatures (for example mammals and echinoderms) vary a lot, and over many millions of years there have also been a number of mass extinctions followed by periods of rapid evolution, as species quickly filled vacant ecological niches. Nevertheless, one species per million per year is the generally accepted average natural rate of extinction.
To make things even more complicated, it does not help that we still have a great deal of uncertainty about how many species there currently are in the world anyway. As of 2011 there were 2.1 million known species catalogued in a central database, but according to one article published in the scientific journal PLoS Biology that year, there might be up to an estimated 8.7 million species in total. We also constantly keep discovering new animals and plants – the International Institute for Species Exploration in Arizona states that over 18,000 new species are officially described each year.
However, even taking the very conservative total figure of 2.1 million species in the world, if the measuring techniques listed by Wilson are accurate it would mean that – very approximately – 2,100 to 21,000 of them are disappearing every single year. To the layperson these large numbers might seem incredible and perplexing, especially when set against the handful of documented species we know to have vanished over the last decade or so. The discrepancy is striking.
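The arithmetic behind those figures is straightforward – Wilson’s range of 1,000 to 10,000 extinctions per million species per year, applied to the 2.1 million catalogued species:

```python
# Wilson's estimated rate: 1,000 to 10,000 extinctions per million
# species per year, applied to the 2.1 million catalogued species.
known_species = 2_100_000

low  = known_species * 1_000 // 1_000_000    # 2,100 per year
high = known_species * 10_000 // 1_000_000   # 21,000 per year

print(low, high)  # -> 2100 21000
```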
Given the puzzling lack of actual, recorded extinctions, can the current orthodoxy be questioned – challenged, even? It appears the answer is yes: it can.
In 2011, Craig Loehle, an ecologist, and Willis Eschenbach, a writer and polymath, published a paper, “Historical bird and terrestrial mammal extinction rates and causes”, in Diversity and Distributions, a journal of conservation biogeography. In it, they examined the historical extinction rates for birds and mammals, and contrasted island and continental extinctions (Australia was counted as an island because of its isolation and because its extinctions largely shared the same causes as island extinctions).
They counted extinctions of full species rather than sub-species, and measured absolute extinctions rather than local extinctions (“extirpations”) or cases where a species is listed as extinct in the wild. They focussed on mammals and birds, as these taxa are better studied and catalogued than others (E.O. Wilson had also written in 1995: “Some groups, like the larger birds and mammals, are more susceptible to extinction than most”, which, if accurate, would make them ideal subjects for this exercise).
What they found is that when island and continental species were considered separately, standard databases only recorded six continental mammals and three continental birds as having become extinct since the year 1500, as opposed to 123 bird species and 58 mammal species on islands. In addition, the main cause of island extinctions (and therefore of most extinctions) is not habitat loss but predation by humans and by alien species introduced by humans. These findings have some important implications.
1) If Loehle and Eschenbach are right, the reversed species-area relationship is not an accurate measurement of extinctions (except in the case of islands), which means that the relationship is not an accurate predictor of extinctions on continents, and that habitat loss (e.g., deforestation) generally leads to fewer – maybe far fewer – extinctions than previously thought.
2) Most extinctions have occurred on islands and have been due to predation by humans and other interlopers. These extinctions are actually becoming fewer, simply because we have run out of remote islands to discover – large numbers of island species were wiped out in the wake of Polynesian and European colonisation of the Pacific, for example, and this special set of circumstances will not be repeated.
3) If habitat loss causes fewer extinctions than first thought, then the effects on extinction rates of climate change and human population growth via habitat degradation will be correspondingly less than first thought.
4) Loehle and Eschenbach found that on islands, the mammal extinction rate was between 82 and 702 times the background rate, and for birds it was between 98 and 844 times the background rate; this corresponds roughly to the received wisdom of “100 to 1000 times faster than the natural extinction rate”. However, they also found that on continents the mammal extinction rate was only between 0.89 and 7.4 times the background rate, and for birds it was only between 0.69 and 5.9 times the background rate.
In other words, aside from island habitats, where mammals and birds have been especially vulnerable, they found that historic and current extinction rates are not significantly higher than the background rate, and thus a statement like “The world is losing species at a rate that is 100 to 1000 times faster than the natural extinction rate” is an exaggeration.
Another study critical of the species-area relationship was published in Nature, also in 2011. In their paper “Species–area relationships always overestimate extinction rates from habitat loss”, ecologists Fangliang He and Stephen P. Hubbell argued that the relationship is a very inaccurate guide to extinctions, overestimating them by as much as 160% or more. Simply put, the area needed to encounter the first individual of a species is usually much smaller than the area that must be destroyed to eliminate its last individual, and so reversing the relationship to estimate extinctions will not provide accurate results – in fact, it will grossly overestimate them.
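He and Hubbell’s point can be illustrated with a toy simulation. All the numbers here – 1,000 species, 20 individuals each, uniform random placement – are illustrative assumptions, not taken from the paper; real populations are clustered, which puts reality somewhere between the two figures, but the direction of the bias is the same:

```python
import random

random.seed(42)

# Toy illustration of He & Hubbell's argument: the backward
# species-area prediction counts a species as "lost" in proportion to
# shrinking area, but an actual extinction requires the destroyed
# area to contain EVERY individual of that species.
n_species, n_individuals = 1000, 20
remaining_fraction = 0.5   # half the habitat is destroyed
z = 0.301                  # illustrative species-area exponent

# Backward species-area prediction of the fraction of species lost:
sar_predicted_loss = 1 - remaining_fraction ** z   # about 19%

# Simulated extinctions: each individual independently falls in the
# surviving half with probability 0.5; a species goes extinct only
# if all 20 of its individuals land in the destroyed half.
extinct = 0
for _ in range(n_species):
    if all(random.random() >= remaining_fraction
           for _ in range(n_individuals)):
        extinct += 1

print(f"SAR predicts {sar_predicted_loss:.0%} lost; "
      f"simulated extinctions: {extinct / n_species:.1%}")
```

Under these (deliberately extreme) uniform-placement assumptions the simulated extinction rate is close to zero, while the reversed species-area calculation predicts roughly a fifth of species lost.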
The Sinking Ark
There is another oft-repeated claim that has historically been linked with the “100 to 1000 times faster than the natural extinction rate” statement: the assertion that between a fifth and a half of all species will shortly become extinct, due to habitat destruction, human overpopulation, climate change or some combination of these.
In recent years, the blame for this threat to biodiversity has been squarely placed at the door of man-made climate change; Dr R. K. Pachauri of the IPCC mentioned it, for instance, in a presentation to the UN in 2007, where a bullet point states tersely: “20-30% of plant and animal species at risk of extinction”.
This estimate relies heavily on a 2004 study published in Nature called “Extinction risk from climate change” by a number of authors led by biologist Chris D. Thomas, who predict, “on the basis of mid-range climate-warming scenarios for 2050, that 15–37% of species in our sample of regions and taxa will be ‘committed to extinction’”.
The BBC reported it like this at the time: “Climate change could drive a million of the world’s species to extinction as soon as 2050, a scientific study says. The authors say in the journal Nature a study of six world regions suggested a quarter of animals and plants living on the land could be forced into oblivion”.
This study has since been heavily criticised by other scientists for a number of reasons, including its use of unproven methodology, its circular mathematical reasoning, basic assumptions that could well turn out to be wrong, and a sample restricted to species with small geographic ranges. Donna Laframboise has written a very good article called “Extinction Fiction” describing and summarising the Thomas paper and its critics.
However, the idea that an alarmingly large percentage of species will have disappeared in a few decades’ time goes back long before the IPCC and the Thomas paper.
In 1989 the National Science Foundation published a report called “Loss of Biological Diversity: A Global Crisis Requiring International Solutions”. The following is from the report’s prologue, which reads as though it could just as well have been written in 2012:
“The extinction event that we are witnessing is the most catastrophic loss of species in the last 65 million years. Most importantly, it is the first major extinction event that has been caused by a single species, one that we hope will act in its own self interest to stem the tide.
Unless the international community can, indeed, reverse the trend, the rate of extinction over the next few decades is likely to rise to at least 1000 times the normal background rate of extinction, and will ultimately result in the loss of a quarter or more of the species on earth.”
Thus we find the “1000 times the normal background rate” idea and the “loss of a quarter or more of the species on earth” idea in the very same sentence. The report refers back to several sources – a book by Paul and Anne Ehrlich, another book edited by B. J. Norton, and a reference to “Lovejoy, 1980”. “Lovejoy, 1980” is part of a study by Thomas Lovejoy that was referenced by the Global 2000 Report to the President, a major environmental and developmental study presented to US President Jimmy Carter in 1980.
The Global 2000 Report states:
“An estimate prepared for the Global 2000 Study suggests that between half a million and 2 million species – 15 to 20 percent of all species on earth – could be extinguished by 2000, mainly because of loss of wild habitat but also in part because of pollution. Extinction of species on this scale is without precedent in human history”.
The late economist Julian Simon traced this reference, via Thomas Lovejoy, back to a 1979 book by Norman Myers called The Sinking Ark. Myers, an influential environmental scientist, is of course also known for his estimate that by the year 2010 there could be 50 million environmental refugees, a figure he later admitted was arrived at through some “heroic extrapolation”.
In his book, Myers describes a conference in 1974, where scientists “hazarded a guess” that the rate of species extinction had risen to 100 per year. Myers then writes:
“Yet even this figure seems low… Let us suppose that, as a consequence of this man-handling of natural environments, the final one-quarter of this century witnesses the elimination of 1 million species – a far from unlikely prospect. This would work out, during the course of 25 years, at an average extinction rate of 40,000 species per year, or rather over 100 species per day.”
For further reference, I recommend reading Julian Simon, and also a fine article called “The Teflon doomsayers” by writer and historian Stephen Budiansky, who very eloquently describes the haphazard and arbitrary nature of these 30-year-old estimates.
To sum up, when Stephen Emmott refers to an extinction rate “one thousand times faster than the normal evolutionary rate as we consume the planet’s resources”, there are a few things to be borne in mind which may help the reader to be healthily skeptical of this claim.
1) We have no exact idea how many species there are on Earth – estimates of the total range from 2.1 to 8.7 million, and thousands of new ones are described each year.
2) The official “background rate of extinctions” – one species per million per year – is itself actually a huge approximation.
3) It is unclear exactly how many animal and plant species have definitely become extinct in recent years, although the evidence points to fewer than 10 animal species since the start of the 21st century.
4) The main method for calculating extinction rates from habitat loss is probably highly inexact and is best suited to island habitats, where the vast majority of historical extinctions have occurred. However, these extinctions have mostly been caused by human and animal predation, not habitat loss per se. Professor Emmott argues that human overpopulation and man-made climate change (rather than hunting, for instance) will lead to extinctions occurring at a thousand times the normal rate; the historical evidence, however, suggests otherwise.
5) Recent predictions of accelerated extinction rates and large percentages of the world’s species vanishing are a striking match to similar predictions made over 30 years ago and which appear to have been based on little more than conjecture at the time. It would be tempting, although perhaps rather cynical, to surmise that the alarming statements came first – clarion calls, as it were, for action and funding – and that the inherent uncertainties of the science are sufficiently large to have made it easy for studies to validate them ever since.
Last year (2012) saw the formation of a new organisation, the UN Intergovernmental Platform on Biodiversity and Ecosystem Services (IPBES). Professor Stephen Emmott and his team will be instrumental in providing the first large-scale computer models of the Earth’s environment (GEMs or “General Ecosystem Models”, similar to the existing GCMs or General Circulation Models used by climate scientists), whose output will presumably be used by IPBES to inform government environmental policy across the world.
It will be interesting to learn what assumptions will be fed into these new computer models, and also to find out in due course whether they too will tell us that extinctions are occurring at 1,000 times the background rate and that up to 20% of all species might go extinct during the next 30 years. It would perhaps not be much of a surprise if this turned out to be the case. Whether or not the models’ output will accurately reflect reality, however, is another matter entirely.
1. Microsoft’s Stephen Emmott Sounds Alarm On Population Surge In Theatrical Lecture. Article by Chris Rhodes:
2. The biodiversity crisis: Worse than climate change:
3. The Sixth Extinction: The Most Recent Animal Extinctions:
4. Birdlife International: Birds on the IUCN Red List:
5. IUCN: Extinctions in Recent Time:
6. Edward O. Wilson. The Future of Life. Knopf, 2002, pp 100-102.
7. Edward O. Wilson. The Future of Life. Knopf, 2002, p 99.
8. IUCN: How many species are there? We need to know:
9. How Many Species Are There on Earth and in the Ocean? Camilo Mora et al, PLoS Biology, 2011
10. ASU: International Institute for Species Exploration:
11. Loehle, Craig, and Willis Eschenbach. Historical bird and terrestrial mammal extinction rates and causes. Diversity and Distributions, 2011:
12. E.O. Wilson: Only Humans Can Halt the Worst Wave of Extinction Since the Dinosaurs Died:
13. Species–area relationships always overestimate extinction rates from habitat loss. Fangliang He & Stephen P. Hubbell: Nature, 2011:
14. Dr. R K Pachauri: The IPCC Fourth Assessment Working Group Reports: Key findings:
15. Extinction Risk from Climate Change: Chris D Thomas et al. Nature, 2004:
16. BBC News: Climate risk ‘to million species’:
17. Donna Laframboise: Extinction Fiction:
18. NSF: Loss of Biological Diversity: A Global Crisis Requiring International Solutions:
19. The Global 2000 Report to the President, Vol 1:
20. The Global 2000 Report to the President, Vol 1, p37:
21. Julian Simon: The Ultimate Resource II: People, Materials, and Environment:
22. Norman Myers: The Sinking Ark: Pergamon, 1979.
23. Stephen Budiansky: The teflon doomsayers:
24. Intergovernmental Platform on Biodiversity and Ecosystem Services:
25. FT Magazine: This rare GEM can model our world: