Category Archives: Conservation

The Return of Lahontan Cutthroat Trout

The New York Times had a terrific story on Wednesday on the recovery of an endemic trout previously believed to have been extinct since the 1940s in Pyramid Lake, Nevada. As I am currently teaching my class, Ecology, Evolution, and Human Health, with its emphasis on adaptation as a local process and on human-environment interaction, I was happy to see such an excellent story about local adaptation. In a nutshell, the trout was over-fished and also suffered devastating population declines in Pyramid Lake because of predation from introduced brook trout (and other exotic salmonids) and hybridization with introduced rainbows. This is, alas, an all too common story for the trout endemics of western North America. A remnant population of Lahontan cutthroats that was genetically very similar to the original Pyramid stock was found in a Pilot Peak stream near the Utah border, and samples from this population were brought to a USFWS breeding facility in cooperation with the Paiute Nation. It sounds like the breeding/stocking program has been a tremendous success and the Lahontan cutties have now returned to Pyramid Lake. A big part of the story appears to be the intensive management of the main prey item of Lahontan cutties, the cui-ui sucker, which was devastated following the construction of the Derby Dam in 1905.

This was all great news, but the thing that really caught my attention (because I’m currently teaching this class that focuses on adaptation) was the fact that the re-introduced Lahontan cutties have thrived so rapidly:

Since November, dozens of anglers have reported catching Pilot Peak cutthroats weighing 15 pounds or more. Biologists are astounded because inside Pyramid Lake these powerful fish, now adolescents, grew five times as fast as other trout species and are only a third of the way through their expected life span.

Can you say adaptation?! There is something about the interaction between this particular cutthroat species and the environment of Pyramid Lake that makes for giant fish as long as the juveniles can escape predation by exotic salmonids and adults can prey on their preferred species. Great news for anglers, great news for the Paiute Nation, great news for ecology.

Ecology and Evolution of Infectious Disease, 2013

I am recently back from the Ecology and Evolution of Infectious Disease (EEID) Principal Investigators’ Meeting hosted by the Odum School of Ecology at the University of Georgia in lovely Athens. This is a remarkable event, and a remarkable field, and I can’t remember ever being so energized after returning from a professional conference (professional conferences often leave me dismayed or even depressed about my field). EEID is an innovative, highly interdisciplinary funding program jointly managed by the National Science Foundation and the National Institutes of Health. I have been lucky enough to be involved with this program for the last six years. I’ve served on the scientific review panel a couple of times and am now a Co-PI on two projects.

We had a big turn-out for our Uganda team in Athens and team members presented no fewer than four posters. The Stanford social networks/human dimensions team (including Laura Bloomfield, Shannon Randolph and Lucie Clech) presented a poster (“Multiplex Social Relations and Retroviral Transmission Risk in Rural Western Uganda”) on our preliminary analysis of the social network data. Simon Frost’s student at Cambridge, James Lester, presented a poster (“Networks, Disease, and the Kibale Forest”) analyzing our syndromic surveillance data. Sarah Paige from Wisconsin presented a poster on the socio-economic predictors of high-risk animal contact (“Beyond Bushmeat: Animal contact, injury, and zoonotic disease risk in western Uganda”) and Maria Ruiz-López, who works with Nelson Ting at Oregon, presented a poster on their work on developing the resources to do some serious population genetics on the Kibale red colobus monkeys (“Use of RNA-seq and nextRAD for the development of red colobus monkey genomic resource”).

Parviez Hosseini, from the EcoHealth Alliance, also presented a poster for our joint work on comparative spillover dynamics of avian influenza (“Comparative Spillover Dynamics of Avian Influenza in Endemic Countries”). I’m excited to get more work done on this project which is possible now that new post-doc Ashley Hazel has arrived from Michigan. Ashley will oversee the collection of relational data in Bangladesh and help us get this project into high gear.

The EEID conference has a unique take on poster presentations that makes it much more enjoyable than the typical professional meeting. In general, I hate poster sessions. Now, don’t get me wrong: I see lots of scientific value in them and they can be a great way for people to have extended conversations about their work. They can be an especially great forum for students to showcase their work and begin the long process of building professional networks. However, there is an awkwardness to poster sessions that can be painful for the hapless conference attendee who might want, say, to walk through the room in which a poster session is being held. These rooms tend to be heavy with the smell of desperation, and one has to negotiate a gauntlet of suit-clad, doe-eyed graduate students desperate to talk to anyone who will listen about their work. “Please talk to me; I’m so lonely” is what I imagine them all saying as I briskly walk through, trying to look busy and purposeful (while keeping half an eye out for something really interesting!).

The scene at EEID is much different. All posters go up at the same time, and the site-fidelity of poster presenters is the lowest I have ever seen. It has to be, since if everyone stuck by their poster, there wouldn’t be anyone to see any of them! This allowed far more mixing than I normally see at such sessions and avoided much of the inherent social awkwardness of a poster session. Posters also stayed up long past the official poster session; I continued to read them for at least a day after the official session ended. Of course, it helps that there was all manner of great work being presented.

There were lots of great podium talks too. I was particularly impressed with the talks by Charlie King of Case Western on polyparasitism in Kenya, Maria Diuk-Wasser of Yale on the emergence of babesiosis in the Northeast, Jean Tsao (Michigan State) and Graham Hickling‘s (Tennessee) joint talk on Lyme disease in the Southeast, and Bethany Krebs’s talk on the role of robin social behavior in West Nile Virus outbreaks. Laura Pomeroy, from Ohio State, represented one of the few other teams with a substantial anthropological component extremely well, talking about the transmission dynamics of foot-and-mouth disease in Cameroon. Probably my favorite talk of the weekend was the last one, by Penn State’s Matt Thomas. His group has done awesome work elucidating the role of temperature variability in the transmission dynamics of malaria.

It turns out that this was the last EEID PI conference. Next year, it will be combined with the other EEID conference, which was originally organized at Penn State (and is there again this May). This combining of forces is, I’m sure, a good thing: it will reduce confusion and make it more likely that all the people I want to see will show up. I just hope that the new, larger conference retains the charms of the EEID PI conference.

EEID is a new, interdisciplinary field that has grown thanks to the disproportionately large contributions of a few highly energetic people. One of the principals in this realm is definitely Sam Scheiner, the EEID program officer at NSF. The EEID PI meeting has basically been Sam’s baby for the past 10 years. Sam has done an amazing job creating a community of interdisciplinary scholars, and I’m sure I speak for every researcher who has been heavily involved with EEID when I express my gratitude for all his efforts.

On The Dilution Effect

A new paper written by Dan Salkeld (formerly of Stanford), Kerry Padgett (California Department of Public Health), and me just came out in the journal Ecology Letters this week.

One of the most important ideas in disease ecology is a hypothesis known as the “dilution effect”. The basic idea behind the dilution effect hypothesis is that biodiversity — typically measured by species richness, the number of different species present in a particular spatially defined locality — is protective against infection with zoonotic pathogens (i.e., pathogens transmitted to humans through animal reservoirs). The hypothesis emerged from analysis of Lyme disease ecology in the American Northeast by Richard Ostfeld and his colleagues and students from the Cary Institute of Ecosystem Studies in Millbrook, New York. Lyme disease ecology is incredibly complicated, and there are a couple of different ways that the dilution effect can come into play even in this one disease system, but I will try to render it down to something easily digestible.

Lyme disease is caused by the spirochete bacterium Borrelia burgdorferi. It is a vector-borne disease transmitted by hard-bodied ticks of the genus Ixodes. These ticks are what is known as hemimetabolous, meaning that they experience incomplete metamorphosis involving larval and nymphal stages. Rather than forming a pupa, these larvae and nymphs resemble little bitty adults. An Ixodes tick takes three blood meals in its lifetime: once as a larva, once as a nymph, and once as an adult. At different life-cycle stages, the ticks have different preferences for hosts. Larval ticks generally favor the white-footed mouse (Peromyscus leucopus) for their blood meal, and this is where the catch lies. It turns out that white-footed mice are extremely efficient reservoirs for Lyme disease. In fact, an infected mouse has as much as a 90% chance of transmitting infection to a larva feeding on it. The larvae then molt into nymphs and overwinter on the forest floor. Then, in spring or early summer a year after they first hatch from eggs, nymphs seek vertebrate hosts. If an individual tick acquired infection as a larva, it can now transmit to its next host. Nymphs are less particular about their choice of host and are happy to feed on humans (or just about any other available vertebrate host).

This is where the dilution effect comes in. The basic idea is that if there are more potential hosts such as chipmunks, shrews, squirrels, or skunks, there are fewer chances that an infected nymph will take its blood meal on a person. Furthermore, most of these hosts are much less efficient at transmitting the Lyme spirochete than are white-footed mice. This lowers the prevalence of infection and makes it more likely that the pathogen will go extinct locally. It’s not difficult to imagine the dilution effect working at the larval-stage blood meal too: if there are more species present (and the larvae are not picky about their blood meal), the chance that a larva feeds on a competent reservoir, and thus acquires infection in the first place, is also diluted.
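The arithmetic behind this dilution is simple enough to sketch. In the toy calculation below, nymphal infection prevalence is just a weighted average of each host’s “reservoir competence” (the probability that a feeding larva acquires infection). The mouse’s 0.90 figure comes from the 90% transmission chance mentioned above; the competence values and blood-meal shares for the other hosts are entirely made up for illustration:

```python
# Toy sketch of the dilution effect: nymphal infection prevalence as a
# weighted average of host "reservoir competence" (probability a larva
# feeding on that host acquires infection). The mouse value (0.90) comes
# from the post; all other numbers are hypothetical.

def nymphal_prevalence(hosts):
    """hosts: list of (share_of_larval_blood_meals, competence) pairs."""
    assert abs(sum(share for share, _ in hosts) - 1.0) < 1e-9
    return sum(share * comp for share, comp in hosts)

# Species-poor fragment: white-footed mice get all larval blood meals.
poor = [(1.0, 0.90)]

# Species-rich fragment: blood meals diluted across less competent hosts.
rich = [(0.40, 0.90),   # white-footed mouse
        (0.30, 0.25),   # chipmunk (hypothetical competence)
        (0.20, 0.10),   # squirrel (hypothetical)
        (0.10, 0.02)]   # skunk (hypothetical)

print(round(nymphal_prevalence(poor), 3))  # 0.9
print(round(nymphal_prevalence(rich), 3))  # 0.457
```

Adding less competent hosts roughly halves the expected prevalence in this toy example, even though mice are still present in the rich fragment.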

In the highly fragmented landscape of northeastern temperate woodlands, when there is only one small-mammal species in a forest fragment, it is quite likely to be the white-footed mouse. These mice are adaptable generalists that occur in a wide range of habitats, from pristine woodland to degraded forest. Species-poor habitats therefore tend to have mice but few other species. The idea behind the dilution effect is that adding species to this baseline of a highly depauperate assemblage of just white-footed mice will lower the prevalence of nymphal infection and reduce the risk of zoonotic infection in people.

It is not an exaggeration to say that the dilution-effect hypothesis is one of the two or three most important ideas in disease ecology, and much of the explosion of interest in the field can be attributed in part to such ideas. The dilution effect is also a nice idea. Wouldn’t it be great if every dollar we invested in the conservation of biodiversity potentially paid a dividend in reduced disease risk? However, neither its importance to the field nor the beauty of the idea guarantees that it is actually scientifically correct.

One major issue with the dilution effect hypothesis is the problem of scale, arguably the central question in ecology. Numerous studies have shown that pathogen diversity is positively related to overall biodiversity at larger spatial scales. For example, in an analysis of the global risk of emerging infectious diseases, Kate Jones and her colleagues from the Zoological Society of London showed that, globally, mammalian biodiversity is positively associated with the odds of an emerging disease. Work by Pete Hudson and colleagues at the Center for Infectious Disease Dynamics at Penn State showed that healthy ecosystems may actually be richer in parasite diversity than degraded ones. Given these quite robust findings, how is it that diversity at a smaller scale is protective?

We use a family of statistical tools known as “meta-analysis” to aggregate the results of a number of previous studies into a single synthetic test of the dilution-effect hypothesis. It is well known that inferences drawn from small samples generally have lower precision (i.e., the estimates carry more uncertainty) than inferences drawn from larger samples. A nice demonstration of this comes from classical asymptotic statistics. The expected value of a sample mean is the true mean of the underlying distribution, and the standard deviation of the sampling distribution of the mean is the standard error, defined as the standard deviation of the data divided by the square root of the sample size. Say that two studies both estimate the standard deviation of the data to be 10. In the first study, the mean is based on a single observation, whereas in the second it is based on a sample of 100 observations. The estimate of the mean in the second study is 10 times more precise than the estimate from the first because 10/\sqrt{1} = 10 while 10/\sqrt{100} = 1.
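That square-root scaling is easy to verify directly. Here is a minimal sketch: the formula itself, plus a small Monte Carlo check on simulated data (all numbers invented) showing that the spread of sample means really does shrink like one over the square root of the sample size:

```python
import math
import random

def standard_error(sd, n):
    """Standard error of the mean: sd of the data over sqrt(sample size)."""
    return sd / math.sqrt(n)

print(standard_error(10, 1))    # 10.0
print(standard_error(10, 100))  # 1.0

# Monte Carlo check: draw many samples of size n from a normal
# distribution with sd=10 and measure the spread of their means.
random.seed(1)

def sd_of_sample_means(n, reps=2000, mu=0.0, sigma=10.0):
    means = [sum(random.gauss(mu, sigma) for _ in range(n)) / n
             for _ in range(reps)]
    grand = sum(means) / reps
    return math.sqrt(sum((m - grand) ** 2 for m in means) / reps)

# The empirical spread for n=100 comes out near 1.0, roughly a tenth
# of the spread for n=1, matching the analytic standard errors.
```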

Meta-analysis allows us to pool estimates from a number of different studies to increase our sample size and, therefore, our precision. One of the primary goals of meta-analysis is to estimate the overall effect size and its corresponding uncertainty. The simplest way to think of effect size in our case is the difference in disease risk (e.g., as measured by the prevalence of infected hosts) between a species-rich area and a species-poor area. Unfortunately, a surprising number of studies don’t publish this seemingly basic result. For such studies, we have to calculate a surrogate of effect size based on the test statistics that the authors report. This is not ideal, since we would much rather calculate effect sizes directly, but to paraphrase a dubious source, you do a meta-analysis with the statistics that have been published, not with the statistics you wish had been published. On this note, one of our key recommendations is that disease ecologists do a better job of reporting effect sizes to facilitate future meta-analyses.
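To give a feel for the pooling step, here is a sketch of a generic fixed-effect inverse-variance model (a standard textbook approach, not necessarily the exact estimator in our paper): each study’s effect size is weighted by the reciprocal of its variance, so precise studies count for more. The study numbers below are entirely hypothetical:

```python
import math

def pool_fixed_effect(effects, ses):
    """Fixed-effect meta-analysis by inverse-variance weighting.

    effects: per-study effect sizes; ses: their standard errors.
    Returns (pooled effect, pooled standard error). Studies with
    small standard errors receive proportionally more weight.
    """
    weights = [1.0 / se ** 2 for se in ses]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    return pooled, pooled_se

# Hypothetical studies (negative effect = biodiversity protective).
effects = [-0.5, -0.1, 0.2]
ses = [0.4, 0.2, 0.3]

est, se = pool_fixed_effect(effects, ses)
ci_low, ci_high = est - 1.96 * se, est + 1.96 * se  # 95% CI
```

With these made-up numbers the pooled effect comes out slightly negative but the 95% confidence interval straddles zero, which happens to be the qualitative pattern described later in the post for the real studies.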

In addition to allowing us to estimate the mean effect size across studies and its associated uncertainty, another goal of meta-analysis is to test for the existence of publication bias. Stanford’s own John Ioannidis has written on the ubiquity of publication bias in medical research. The term “bias” has a general meaning that is not quite the same as its technical meaning. By “publication bias”, there is generally no implication of nefarious motives on the part of the authors. Rather, it typically arises through a process of selection at both the level of individual authors and the institutional level of the journals to which authors submit their papers. An author, who is under pressure to be productive by her home institution and funding agencies, is not going to waste her time submitting a paper that she thinks has a low chance of being accepted. This means that there is a filter at the level of the author against publishing negative results. This is known as the “file-drawer effect”, referring to the hypothetical 19 studies with negative results that never make it out of the authors’ desks for every one paper publishing positive results. Of course, journals, editors, and reviewers prefer papers with positive results as well. These very sensible responses to incentives in scientific publication unfortunately aggregate into systematic biases at the level of the broader literature in a field.

We use a couple of methods for detecting publication bias. The first is a graphical device known as a funnel plot. We expect studies done on large samples to have estimates of the effect size that are close to the overall mean effect because estimates based on large samples have higher precision. On the other hand, smaller studies will have effect-size estimates that are more dispersed because random error can have a bigger influence in small samples. If we plot the precision (e.g., measured by the standard error) against the effect size, we would expect to see an inverted triangle shape — or a funnel — to the scatter plot. Note — and this is important — that we expect the scatter around the mean effect size to be symmetrical. Random variation that causes effect-size estimates to deviate from the mean is just as likely to push them above as below the mean. However, if there is a tendency to not publish studies that fail to support the hypothesis, we should see an asymmetry to our funnel. In particular, there should be a deficit of studies that have low power and effect-size estimates that are opposite of the hypothesis. This is exactly what we found. Only studies supporting the dilution-effect hypothesis are published when they have very small samples. Here is what our funnel plot looked like.
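The mechanism behind that asymmetry is easy to simulate. The sketch below (all numbers invented) generates studies of varying size around a true effect of zero, then “publishes” the large studies unconditionally but the small studies only when they support the hypothesis (a negative effect). The low-power, wrong-sign corner of the funnel empties out, the same qualitative pattern as the missing quadrant just described:

```python
import math
import random

random.seed(42)

def simulate_study(n, true_effect=0.0, sigma=1.0):
    """One study: estimated effect (a sample mean) and its standard error."""
    draws = [random.gauss(true_effect, sigma) for _ in range(n)]
    return sum(draws) / n, sigma / math.sqrt(n)

# 300 studies with sample sizes from 5 to 200.
studies = [simulate_study(random.randint(5, 200)) for _ in range(300)]

# Selection filter: big studies (small SE) always get published; small
# studies get published only if they support the dilution hypothesis
# (i.e., a negative effect size).
published = [(e, se) for e, se in studies if se < 0.2 or e < 0]

# The low-power, wrong-sign corner of the funnel plot is now empty.
missing_corner = [e for e, se in published if se >= 0.2 and e > 0]
print(len(missing_corner))  # 0
```

Plotting `published` as standard error against effect size would give a funnel with one lower corner missing, even though the underlying studies are perfectly symmetric.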

Note that there are no points in the lower right quadrant of the plot (where species richness and disease risk would be positively related).

While the graphical approach is great and provides an intuitive feel for what is happening, it is nice to have a more formal way of evaluating the effect of publication bias on our estimates of effect size. Note that if there is publication bias, we will over-estimate our precision because the studies that are missing are far away from the mean (and on the wrong side of it). The method we use to measure the impact of publication bias on our estimate of uncertainty formalizes this idea. Known as “trim-and-fill“, it uses an algorithm to find the most divergent asymmetric observations. These are removed and the precision of the mean effect size is calculated. This sub-sample is known as the “truncated” sample. Then a sample of missing values is imputed (i.e., simulated from the implied distribution) and added to the base sample. This is known as the “augmented” sample. The precision is then re-calculated. If there is no publication bias, these estimates should not differ much. In our sample, we find that estimates of precision differ quite a bit between the truncated and augmented samples. We estimate that between 4 and 7 studies are missing from the sample.
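To make the logic concrete, here is a deliberately simplified cartoon of the fill step, not the actual trim-and-fill estimator: assume the k most extreme one-sided studies are missing their mirror-image counterparts, impute those by reflecting across the pooled mean, and re-pool on the augmented sample. All study numbers are invented:

```python
import math

def pool(effects, ses):
    """Inverse-variance pooled effect size and its standard error."""
    w = [1.0 / s ** 2 for s in ses]
    est = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    return est, math.sqrt(1.0 / sum(w))

def mirror_impute(effects, ses, k):
    """Cartoon of the 'fill' step: treat the k most negative effects as
    lacking mirror-image counterparts, impute those counterparts by
    reflecting across the pooled mean, and re-pool."""
    est, _ = pool(effects, ses)
    lowest = sorted(range(len(effects)), key=lambda i: effects[i])[:k]
    aug_effects = effects + [2 * est - effects[i] for i in lowest]
    aug_ses = ses + [ses[i] for i in lowest]
    return pool(aug_effects, aug_ses)

# Hypothetical asymmetric sample: the small studies are all strongly negative.
effects = [-0.9, -0.7, -0.6, -0.1, 0.0]
ses     = [0.50, 0.45, 0.40, 0.10, 0.12]

published = pool(effects, ses)               # estimate from observed studies
augmented = mirror_impute(effects, ses, k=3) # after imputing "missing" studies
```

On this toy data the augmented estimate moves toward zero and its standard error shrinks (more studies means more total weight), so the published sample had overstated both the effect and its precision, which is exactly the comparison the method relies on.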

Most importantly, we find that the 95% confidence interval for our estimated mean effect size crosses zero. That is, while the mean effect size is slightly negative (suggesting that biodiversity is protective against disease risk), we can’t confidently say that it is actually different from zero. Essentially, our large sample suggests that there is no simple relationship between disease risk and biodiversity.

On Ecological Mechanisms

One of the main conclusions of our paper is that we need to move beyond simple correlations between species richness and disease risk and focus instead on ecological mechanisms. I have no doubt that there are specific cases where the negative correlation between species richness and disease risk is real (note our title says that we think this link is idiosyncratic). However, I suspect where we see a significant negative correlation, what is really happening is that some specific ecological mechanism is being aliased by species richness. For example, a forest fragment with a more intact fauna is probably more likely to contain predators and these predators may be keeping the population of efficient reservoir species in check.

I don’t think that this is an especially controversial idea. In fact, some of the biggest advocates for the dilution effect hypothesis have done seminal work advancing our understanding of the ecological mechanisms underlying biodiversity-disease risk relationships. Ostfeld and Holt (2004) note the importance of predators of rodents for regulating disease. They also make the very important point that not all predators are created equal when it comes to the suppression of disease. A hallmark of simple models of predation is the cycling of abundances of predators and prey. A specialist predator that induces boom-bust cycles in a disease reservoir is probably not optimal for infection control. Indeed, it may exacerbate disease risk if, for example, rodents become more aggressive and are more frequently infected in agonistic encounters with conspecifics during steep growth phases of their population cycle. This phenomenon has been cited in connection with the risk of zoonotic transmission of Sin Nombre virus in the American Southwest.

I have a lot more to write on this, so, in the interest of time, I will end this post now but with the expectation that I will write more in the near future!


Three Questions About Norms

Well, it certainly has been a while since I’ve written anything here. Life has gotten busy with new projects, new responsibilities, etc. Yesterday, I participated in a workshop on campus sponsored by the Woods Institute for the Environment, the Young Environmental Scholars Conference. I was asked to stand in for a faculty member who had to cancel at the last minute. I threw together some rather hastily written notes and figured I’d share them here (especially since I spoke quite a bit about the importance of public communication!).

The theme of the conference was “Environmental Policy, Behavior, and Norms” and we were asked to answer three questions: (1) What does doing normative research mean to you? (2) How do your own norms and values influence your research? (3) What room and role do you see for normative research in your field? So, in order, here are my answers.

What does doing normative research mean to you?

I actually don’t particularly like the term “normative research” because it sounds a little too much like imposing one’s values on other people. I am skeptical of the imposition of norms that have more to do with (often unrecognized) ideology and less to do with empirical truth – an idea that was later reinforced by a terrific concluding talk by Debra Satz. If I can define “normative” to mean with the intent to improve people’s lives, then OK. Otherwise, I prefer to do “positive” research.

For me, normative research is about doing good science. As a biosocial scientist with broad interests, I wear a lot of hats. I have always been interested in questions about the natural world, and (deep) human history in particular. However, I find that the types of questions that really hold my interest these days are more and more engaged with the substantial challenges we face in the world with inequality and sustainability. In keeping with my deep pragmatist sympathies, I increasingly identify with Charles Sanders Peirce‘s idea that given the “great ocean of truth” that can potentially be uncovered by science, there is a moral burden to do things that have social value. (As an aside, I think that there is social value in understanding the natural world, so I don’t mean to imply a crude instrumentalism here.) In effect, there is a lot of cool science to be done; one may as well do something of relevance. I personally have little patience for people who pursue racist or otherwise socially divisive agendas and cloak their work in a veil of free scientific inquiry. That said, I worry when advocacy interferes with intellectual fairness or produces an unwillingness to accept that one’s position may not actually be true.

I think that we are fooling ourselves if we believe that our norms somehow don’t have an effect on our research. Recognizing the norms that shape your research – whether implicit or explicit – helps you manage your bias. Yes, I said manage. I’m not sure we can ever completely eliminate it. I see this as the management of a necessary trade-off, drawing an analogy between the practice of science and the classic statistical trade-off between bias and variance. The more biased one is, the less variance there is in the outcome of one’s investigation. The less biased, the greater the likelihood that results will differ from one’s expectations (or wishes). Recognizing how norms shape our research also helps with that murky area of pre-science: where do our ideas for what to study come from?

How do your own norms and values influence your research?

Some of the norms that shape my own research and teaching include:

transparency: science works best when it is open. This places a premium on sharing data, methods, and communicating results in a manner that maximizes access to information. As a simple example, this norm shapes my belief that we should not train students from poor countries in the use of proprietary software (and other technologies) that they won’t be able to afford when they return to their home countries when there are free or otherwise open-source alternatives.

fairness: this naturally includes a sense of social justice, of people playing on a level playing field, but it also includes fairness to different ideas, alternative hypotheses, and the possibility that one is wrong. This type of fairness is essential for one’s credibility as a public intellectual in science (particularly when supporting policy), as noted eloquently in this interview with Dick Lewontin.

respect for people’s ultimate rationality: trying to understand the social, ecological, and economic context of people’s decision-making, even when it violates our own normative – particularly market-based economic – expectations.

flexibility: solving real problems means that we need to be flexible in our approach, willing to go where the solutions lead us, learning new tools and collaborating. Flexibility also means a willingness to give up on a research program that is doing harm.

good-faith communication: I believe that there is no room for obscurantism in the academy of the 21st century. This includes public communication. There are, of course, complexities here with regard to the professional development of young scholars. One of the key trade-offs for young scholars is between the need for professional advancement (which comes from academic production) and activism, policy, and public communication. Within elite universities, the reality is that neither public communication nor activism counts for much toward tenure. However, as Jon Krosnick noted, tenure is a remarkable privilege and, while it may seem impossibly far away for a student just finishing a Ph.D., it’s not really. Once you prove that you have the requisite disciplinary chops, you have plenty of time to use tenure for what it is designed for (i.e., protecting intellectual freedom) and to engage in critical public debate and communication.

humility: solving problems (in science and society) means caring more about the answer to a problem than one’s own pet theory. Humility is intimately related to respect for others’ rationality. It also means recognizing the inherently collaborative nature of contemporary science: giving credit where it is due, seeking help when one is in over one’s head, etc. John DeGioia, President of Georgetown University, quoted St. Augustine in his letter of support for Georgetown law student Sandra Fluke against the crude attacks by radio personality Rush Limbaugh, and I think those words are quite applicable here as well. Augustine implored his interlocutors to “lay aside arrogance” and to “let neither of us assert that he has found the truth; let us seek it as if it were unknown to both.” This is not a bad description of the way that science really should work.

What room and role do you see for normative research in your field?

I believe that there is actually an enormous amount of room for normative research, if by “normative research,” we mean research that has the potential to have a positive effect on people’s lives. If instead we mean imposing values on people, then I am less sure of its role.

Anthropology is often criticized from outside the field, and, to a lesser extent, from within it, for being overly politicized. You can see this in Nicholas Wade’s critical pieces in the New York Times Science Times section following the American Anthropological Association executive committee’s excision of the word “science” from the field’s long-range planning document. Wade writes,

The decision [to remove the word ‘science’ from the long-range planning document] has reopened a long-simmering tension between researchers in science-based anthropological disciplines — including archaeologists, physical anthropologists and some cultural anthropologists — and members of the profession who study race, ethnicity and gender and see themselves as advocates for native peoples or human rights.

This is a common sentiment. And it is a complete misunderstanding. It suggests that scientists can’t be advocates for native peoples or human rights. It also suggests that one can’t study race, ethnicity, or gender from a scientific perspective. Both ideas are complete nonsense. For all the leftist rhetoric, I am not impressed with the actual political practice of what I see in contemporary anthropology. There is plenty of posturing about power asymmetries and identity politics, but it is always done in such mind-numbingly opaque language and with no apparent practical tie-in to policies that make people’s lives better. And, of course, there is the outright disdain for “applied” work one sees in elite anthropology departments.

Writing specifically about Foucault, Chomsky captured my take on this whole mode of intellectual production:

The only way to understand [the mode of scholarship] is if you are a graduate student or you are attending a university and have been trained in this particular style of discourse. That’s a way of guaranteeing…that intellectuals will have power, prestige and influence. If something can be said simply, say it simply, so that the carpenter next door can understand you. Anything that is at all well understood about human affairs is pretty simple.

Ultimately, the simple truths about human affairs that I find anyone can relate to are subsistence, health, and the well-being of one’s children. These are the themes at the core of my own research and I hope that the work I do ultimately can effect some good in these areas.

Dan Salkeld on the Radio!

I was thrilled to hear Dan Salkeld‘s excellent (and long!) radio interview on Colorado Public Radio about our recent paper on understanding plague epizootics in prairie dogs.  There is a remarkable amount of information contained in this interview.  If you want to learn about plague ecology, then this is an excellent introduction.

Dan Salkeld with a prairie dog.

The Little Mouse on the Prairie

We have a new paper in the Early Edition of PNAS on the ecology of plague in prairie dogs. The Stanford News Service did a nice little write-up of the paper (and Mark Shwartz’s full version is available on the Woods Institute site) and it has now been picked up by a number of media outlets including USA Today, ScienceDaily, The Register (UK), as well as a couple of radio news shows. This paper has been a real pleasure for me because of my incredible collaborators. Dan Salkeld, who has been a post-doctoral fellow with me and now splits his time between teaching in Human Biology at Stanford and working as an epidemiologist for the California Department of Public Health, is the lead author. Dan is clearly one of the leading young disease ecologists working today, and his understanding of the field and willingness to do the sometimes unglamorous grunt work of ecology in pursuit of important research questions continually impresses me. The paper uses data that he collected while working for co-author Paul Stapp on Paul and collaborators’ plague project in the Pawnee National Grasslands in Colorado. Dan and Paul had the idea that grasshopper mice (see below) might have something to do with the episodic plague outbreaks in prairie dog towns. Apparently, this idea was met with skepticism by their colleagues. When Dan came to Stanford, I suggested that we could probably put together a model to test the hypothesis. While we were waiting for our research permits to come through for a project in Indonesia (also dealing with plague; another long story), we decided to take up the challenge. What really made the whole project come together was the fortuitous office-pairing of Dan with Marcel Salathé, another post-doc with whom I have collaborated extensively on questions of social networks and infectious disease. In addition to being a brilliant theoretical biologist, Marcel is an ace Java programmer.
Following a few white-board sessions in the studio near our offices, Dan and Marcel put together an amazing computer simulation that achieves that perfect balance between simplicity and realism that allows for scientific insight.

I don’t think anyone would have predicted this particular collaboration and this particular outcome.  The results described in this paper come from an incredibly interdisciplinary collaboration. I am really struck at how great science can come from a few simple ingredients: (1) long-term ecological data collection facilitated by a visionary program at the National Science Foundation, (2) a space where people from quite different disciplines and with different scientific sensibilities can get together and brainstorm, (3) flexible funding that permits researchers to explore the interesting – if offbeat – scientific questions that arise from such interactions.  So, I have many debts to acknowledge for this one.  The field data come from the project for which Mike Antolin at Colorado State is the PI (our co-author Paul Stapp is a Co-PI for that as well).  The funding source for this project was the joint NSF/NIH Ecology of Infectious Disease program. This is a cross-cutting program that “supports the development of predictive models and the discovery of principles governing the transmission dynamics of infectious disease agents” (from the EID home page).  The space – both physical and intellectual – that permitted this work to happen was provided by the Woods Institute for the Environment.  This paper literally came into being in the project studio on the third floor of Y2E2 in the Land Use and Conservation area.  Amazingly, this is exactly what these studios were designed to do.  My office in Y2E2 has adjoining office space for grad students and post-docs and this is where Marcel and Dan did most of their hashing. It was always amusing to pop my head in and see them both huddled around a computer, having animated discussions about how best to represent the complex ecology in a computational model that is simple enough to understand and flexible enough to allow us to test hypotheses.  Finally, funding.
Dan was funded by a Woods Environmental Ventures Project grant for which I am the PI.  Marcel was funded by the Branco Weiss Science in Society Fellowship.  My own flexibility was assured by a career grant from the National Institutes of Health. Research funding is almost always important, but the requirements of research funding can sometimes be too constraining to permit exploration of really new ideas.  All three of these mechanisms (Woods EVP, Branco Weiss, NIH K01) provide exactly the type of flexibility that fosters creativity. I wish there were more programs like these.

One of the fundamental questions in disease ecology is how extremely pathogenic infectious agents persist both through time and across landscapes.  Plague is a bacterial disease that affects a wide range of rodents throughout the world and, in North America, particularly afflicts prairie dogs (Cynomys ludovicianus).  Plague epizootics (the animal equivalent of epidemics in humans) are dramatic affairs with almost complete mortality of massive prairie dog ‘towns’ of thousands of animals. If plague is so deadly to prairie dogs, how does it persist?  Is there another reservoir (i.e., another host species that can maintain an infection in the absence of prairie dogs)?  Does plague get into the soil and persist in some sort of suspended state (the way that some Mycobacteria do, for example) waiting to reinfect a re-colonized prairie dog town? Or is plague really enzootic (i.e., persisting at low levels in an animal population) and we just haven’t detected it? This question has wide applicability. Consider diseases of people such as Ebola Hemorrhagic Fever or SARS, or, going back a few hundred years in human history, that nastiest of bacterial diseases, bubonic plague. Yes, the same beastie.  A disease that killed a third of the population of Europe in the fourteenth century exists in prairie dogs in North America today (and sometimes spills over to produce human infections).

Prairie dogs are a keystone species of the grasslands of the American West. They are threatened by various anthropogenic forces, including habitat destruction and human persecution.  But most importantly, prairie dog viability is threatened by plague.

Plague, the disease caused by the bacterium Yersinia pestis and the causative agent of the Black Death, arrived in the USA via San Francisco ca. 1900, and still infects (or threatens to infect) people each year, including in California. Plague killed perhaps a third of Europe’s population in the fourteenth century. It is still important in Africa and Asia.  There have been sizable epidemics as recently as the middle twentieth century in India and China and a substantial outbreak in Surat, India in 1994 that, in addition to deaths, caused widespread panic and social disruption.

Previous modeling and ecological work tended to assume that die-offs occur very rapidly.  But questions dogged this work (as it were): were the apparently rapid die-offs simply an artifact of finally seeing dead dogs dropping all over the place? Prairie dogs do live underground, after all, and they live in enormous towns.  Who would miss a few dead dogs underground in a town of thousands?  Our paper suggests that previous modeling efforts get the story wrong. They fail to account for observed patterns because they missed key elements of the picture.  Previous models that could describe the phenomena lacked an actual mechanism – a magical reservoir? A carnivore? Surely it’s something, somewhere?

While prairie dogs live in enormous towns, they are highly territorial within the towns.  Towns form because of the benefits of predator defense. Prairie dogs live in small family groups known as coteries, and these coteries form a more-or-less regular grid of small defended territories within the towns. Because of this regular structure induced by territoriality, a directly-transmitted infectious disease can only move so quickly through a town, since it can only be transmitted to immediate neighbors and each coterie has only a couple of these.  Plague is not directly transmitted, though.  It is carried by flea vectors, but if the dispersal distance of a flea is less than the diameter of a coterie’s territory, then the transmissibility of this vector-borne disease is effectively similar to that of something directly transmitted.  In short, territoriality limits the rate of disease propagation through prairie dog towns. However, prairie dogs are not alone on their eponymous prairies.

Grasshopper mice – smelly, carnivorous mice, happy to eat through prairie dog carcasses – get swamped by fleas that normally live on prairie dogs. And grasshopper mice have no respect for prairie dog territories. They spread fleas across prairie dog coteries. This is the critical piece of the puzzle provided by our analysis.  Grasshopper mice are the key amplifying hosts for plague in prairie dogs.  Grasshopper mice increase the spread of disease by moving fleas across the landscape, similar to the way that highly promiscuous people may spread HIV or so-called ‘super-spreaders’ transmitted SARS in the global outbreak of 2003.  Of course, there are interesting differences between the plague model and these other diseases. Grasshopper mice are like super-spreaders in that they push the system over the percolation threshold.  They are unlike super-spreaders in that they don’t have that many more contacts than the average – they just connect otherwise unconnected segments of a population already near the threshold of an epidemic.

Without grasshopper mice, plague still kills prairie dog families, one at a time, but it moves very slowly, and it is extremely hard to detect (who misses 5 dead prairie dogs in a colony that stretches for 200 hectares and has upwards of 5000 animals?).  The grasshopper mice take a spatially-organized system that is on the verge of an epizootic and push it over the threshold.  The term ‘percolation threshold’ in the title of our paper relates to a branch of theory from statistical physics that explains how and when a fluid can pass through a porous random medium.  This theory uses random graphs, which are the same mathematical structure that we use to model social networks, to understand when, for example, a medium will let water pass through it – i.e., percolate. When the density of pores in, say, a layer of sandstone passes a critical density, water can pass from the surface through to recharge the aquifer. Similarly, when the density of susceptible prairie dog families crosses a critical threshold, plague can sweep through and wipe out a town of thousands of individuals.  The spatial structure induced by prairie dog territoriality turns out, on average, to be not quite at the percolation threshold (though it’s close).  What the grasshopper mice do is provide the critical connectivity that puts the system over the threshold and allows a slowly simmering enzootic infection to turn into a full-blown epizootic.
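To make the percolation picture concrete, here is a minimal toy model in Python (my own back-of-the-envelope sketch, not the simulation from the paper; the grid size, occupancy, and number of shortcuts are all invented for illustration). Coteries sit on a square lattice, infection passes only between occupied nearest neighbors, and a few random long-range links stand in for flea transport by grasshopper mice:

```python
import random

def epizootic_size(occupancy, n=40, long_range_links=0, seed=1):
    """Toy percolation model: coteries occupy sites of an n x n grid with
    probability `occupancy`; infection spreads between occupied 4-neighbors.
    `long_range_links` adds random shortcuts between occupied sites, standing
    in for grasshopper mice carrying fleas across territories.  Returns the
    fraction of occupied sites reached from a single randomly seeded case."""
    rng = random.Random(seed)
    occupied = {(i, j) for i in range(n) for j in range(n)
                if rng.random() < occupancy}
    if not occupied:
        return 0.0
    occ_list = sorted(occupied)  # deterministic order for reproducibility
    # Random long-range links between otherwise distant coteries
    shortcuts = {}
    for _ in range(long_range_links):
        a, b = rng.choice(occ_list), rng.choice(occ_list)
        shortcuts.setdefault(a, []).append(b)
        shortcuts.setdefault(b, []).append(a)
    # Depth-first spread from a single seeded infection
    start = rng.choice(occ_list)
    seen, stack = {start}, [start]
    while stack:
        i, j = stack.pop()
        neighbors = [(i + 1, j), (i - 1, j), (i, j + 1), (i, j - 1)]
        for nb in neighbors + shortcuts.get((i, j), []):
            if nb in occupied and nb not in seen:
                seen.add(nb)
                stack.append(nb)
    return len(seen) / len(occupied)
```

On a square lattice the site-percolation threshold is about 0.593, so at an occupancy of 0.55 an outbreak seeded in one coterie typically stays confined to a small cluster; adding even a modest number of mouse-like shortcuts usually lets it reach much of the town.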

It is in thinking about percolation thresholds that we see how important the behavior of affected species is for understanding disease dynamics.  Plague in Asian great gerbils, while effectively modeled using the same mathematical formalism, only requires one species in order to achieve the percolation threshold. Because great gerbils roam more widely and mix more, what matters for plague epizootics in this species is simply overall gerbil density.

It seems quite likely that this pattern of diseases smoldering at low levels below the detection threshold before some dramatic occurrence brings them to general attention is common, particularly with emerging infections. For example, there is evidence for extensive transmission of H1N1 ‘swine flu’ in Mexico before a large number of deaths appeared seemingly quite suddenly in April of 2009.  A number of other diseases – both of people and wildlife – show this pattern of being seemingly completely lethal, burning through host communities, and disappearing only to reappear some years later.  Important examples include Ebola in both humans and gorillas, hantavirus in people, anthrax in zebras, and chytrid fungus in frogs.

What are the key take-home messages of this paper? There are five, as I see it: (1) plague is enzootic in prairie dogs and there is no need to posit an alternate reservoir, (2) this said, the transition from enzootic to epizootic infection in prairie dogs is mediated by grasshopper mice, (3) understanding disease ecology – including species interactions – is a key to understanding (and predicting) dynamics, (4) behavior matters for disease dynamics, and (5) epidemiological surveillance is essential for controlling infectious disease – just because you don’t see a disease doesn’t mean it’s not there!

I’m sure I’ll have more to say about this.  I did want to note that the publication of this paper coincides with a personnel transition here in our group at Stanford. Marcel has moved on to a faculty position, joining the spectacular Center for Infectious Disease Dynamics at Penn State.  Peter Hudson and his crew have assembled an amazing and eclectic group of scientists in Happy Valley and kudos to them for landing Marcel.  I frequently think that only a total fool would pass up an offer to join this exciting and productive group, but that’s another story. I expect Marcel to do great things there and look forward to continued collaborations.

Follow-Up to the Sea Lion Die-Off

Information sent today from Promed-Mail indicates that domoic acid is indeed implicated in the sea-lion die-off in California.  Domoic acid is a neurotoxin produced by certain algal blooms that can bioaccumulate as it moves up trophic chains. Fisheries and Oceans Canada has a terrific resource page with extensive references on domoic acid. The report on the die-off comes from Southern California via the Ventura County Star, but it seems likely that it is related to the Central California die-off I wrote about last week. Also implicated — as I suggested last week — is a shortage in fish resulting from the El Niño-warmed Pacific waters.

Unexplained Sea Lion Die-Off

I just read this story about the alarming number of dead sea lions showing up around Monterey Bay in Central California.  We were just down at Moss Landing State Beach and saw evidence of this die-off for ourselves: a dead sea lion with a couple of bites taken out of it, though it’s unclear whether they were pre- or post-mortem.

A likely candidate for the cause of this excess mortality is food shortages due to unusually warm waters from the recent El Niño conditions, though no one has yet (to my knowledge) ruled out infectious disease, domoic acid, red tide, or any of the other possible causes of such die-offs.

New Publication: Chimpanzee "AIDS"

A long-anticipated paper (by me anyway!) has finally been published in this week’s issue of Nature.  In this paper, we show that wild chimpanzees living in the Gombe National Park in western Tanzania on the shores of Lake Tanganyika appear to die from AIDS-like illness when infected with the Simian Immunodeficiency Virus (SIV).  Many African primates harbor their own species-specific strain of SIV and chimpanzees are no exception.  The host species for a particular SIV strain is indicated by a three-letter abbreviation (all in lower-case) following the all-caps SIV. So, for chimpanzees, the strain is called SIVcpz. It turns out that there are two distinct HIVs, known as HIV-1 and HIV-2. HIV-1 is the virus that causes the majority of the world’s deaths.  It is what we call the “pandemic strain.” HIV-2 is less pathogenic and has a distinct geographic focus in West Africa.  The HIVs and the various SIVs belong to a larger group of viruses that infect a wide range of mammals known as the lentiviruses (lenti– meaning slow, referring to the slow time course of the pathology typically caused by these viruses). Collectively, we call the SIVs and HIVs “primate lentiviruses.”  Both HIV-1 and HIV-2 have well-documented origins in nonhuman primate reservoirs.  HIV-2 is most closely related to SIVsmm, a virus that infects sooty mangabeys (a type of West African monkey).  HIV-1, on the other hand, is most closely related to SIVcpz, the virus that infects central and east African chimpanzees.  We believe that both HIV-1 and HIV-2 entered human hosts when hunters were contaminated with the blood of infected monkeys (HIV-2) or chimpanzees (HIV-1). Note that this means that our terminology for the primate lentiviruses is polyphyletic.  SIVsmm and HIV-2 are sister taxa, while SIVcpz and HIV-1 are sister taxa.  Yet we call all the viruses that infect nonhuman primates simian and all the viruses that infect humans human immunodeficiency viruses.
It seems to me the best way to fix this would be to call the viruses that infect humans SIVhum1 and SIVhum2.  Of course, that will never happen, but I do think that it’s important to clarify the evolutionary history of these viruses.

The conventional wisdom regarding primate lentiviruses is that, with the exception of HIV, they are not pathogenic in their natural hosts.  The standard reasoning for why HIV causes the devastating pathology that characterizes AIDS is that HIV-1 is a relatively new infection of humans, having spilled over into the human population only recently.  Pathogens that have recently crossed species boundaries are frequently highly pathogenic because neither the new host nor the pathogen has a history of coevolution with its new partner.  While it is a pernicious myth (that just won’t seem to die) that pathogens necessarily evolve toward a benign state, it is true that they frequently evolve toward a more intermediate level of virulence than their initial spillover virulence.  There are a number of problems with the idea that HIV causes AIDS because it is poorly adapted to human physiology.

The first of these is that HIV-1 is not that recent an infection of humans.  Sure, we didn’t notice it until 1983, but careful molecular evolutionary analysis by Bette Korber of Los Alamos National Laboratory and my collaborator Beatrice Hahn and her group at the University of Alabama at Birmingham puts the most likely date for the emergence of HIV-1 in humans at around 1931.  That means that HIV-1 was being transmitted from human to human for over fifty years before it was ever noticed by western science. Fifty years, while certainly brief in evolutionary terms, is still long enough to lead to some reduction in virulence or host evolution.

The real nail in the coffin, however, is our new result.  Specifically, we show that SIVcpz causes AIDS-like pathology in the Gombe chimpanzees. This result is surprising because (1) given its pathogenicity, one would expect someone to have noticed it before, and (2) chimpanzees infected in captivity do not show obvious AIDS-like illness. I have been collaborating with Anne Pusey, Mike Wilson and their colleagues at the University of Minnesota’s Jane Goodall Institute Center for Primate Studies on the analysis of the demography of the Gombe chimps for a number of years now. Anne and Mike have, in turn, been collaborating with Beatrice Hahn on her project monitoring natural SIV infection in wild chimpanzees across Africa. Given my background in HIV epidemiology and statistics, it was only natural that we all join forces to look at the demographic implications of SIV infection among the Gombe chimps.  Jane Goodall famously started chimpanzee research at Gombe in 1960 and since 1964, researchers at Gombe have collected detailed demographic information, documenting all births, deaths, and migration events in the central community and eventually expanding to the peripheral ones in later years. As a result, we have an unmatched level of demographic detail (not to mention behavioral and ecological information) against which to assess the impact of SIV infection.  Using statistical methods known collectively as event-history analysis, we were able to show that the hazard ratio between SIV-infected and SIV-negative chimps is on the order of 10–16.  This essentially means that SIV+ chimps have mortality rates that are 10–16 times higher than uninfected chimps.  The analysis controls for the potentially confounding effects of age and sex on overall mortality. The reason why no one ever noticed this heightened mortality rate is really that no one ever looked for it.
Even when a mortality rate is 10 times higher for some segment of a population, if that segment is small and mortality rates are quite low (chimps who survive infancy can live in excess of 40 years), even a seemingly large difference can be hard to detect.  This is why we do science: because things that seem obvious once we know they are there can be remarkably subtle when we don’t know they’re there.  Science gives us the framework and the tools for studying nature’s subtleties.
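A quick simulation makes the detection problem vivid (again, this is my own illustrative sketch with invented rates, not the event-history analysis from the paper). With exponentially distributed lifetimes, even a ten-fold hazard in a small infected subgroup produces only a handful of extra deaths over a few years of observation:

```python
import math
import random

def simulate_deaths(n_total=100, n_infected=10, base_rate=0.03,
                    hazard_ratio=10.0, years=5, seed=42):
    """Toy cohort: uninfected animals die at `base_rate` deaths/animal/year,
    infected animals at `hazard_ratio` times that, with exponentially
    distributed times to death.  Returns the number of observed deaths in
    each group over `years` of follow-up.  All rates are invented for
    illustration, not estimates from the Gombe data."""
    rng = random.Random(seed)

    def dies_during_followup(rate):
        # Draw an exponential time-to-death; the death is observed only if
        # it falls inside the follow-up window.
        t = -math.log(1.0 - rng.random()) / rate
        return t < years

    deaths_uninfected = sum(dies_during_followup(base_rate)
                            for _ in range(n_total - n_infected))
    deaths_infected = sum(dies_during_followup(base_rate * hazard_ratio)
                          for _ in range(n_infected))
    return deaths_uninfected, deaths_infected
```

With, say, 10 infected animals out of 100 and a few percent baseline annual mortality, the infected excess over five years amounts to a few bodies in a forest – easy to miss unless you know each individual’s serostatus, which is exactly what the long-term Gombe records provide.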

This project was absurdly interdisciplinary.  The paper has 22 co-authors, each contributing his or her own particular analytical expertise or providing access to crucial data necessary for the larger narrative.  There are papers in the literature in which people are made co-authors for pretty thin contributions.  This paper has none of that.  It was an extremely complicated story to tell and it really required the collaboration of this large team. Such work is not easy to manage and it’s not at all easy to do well.  I think that Beatrice should be commended for orchestrating all the various major contributions, keeping us in line and on schedule (more or less). It’s really gratifying to see the excellent blog piece by Carl Zimmer in which he notes the virtues — and the difficulty — of combining various scientific styles in pursuit of an important question. The title of Carl’s piece is “AIDS and the Virtues of Slow-Cooked Science.” In addition, there is a nice companion piece in this week’s Nature written by Robin Weiss and Jonathan Heeney.  They too note the strength of the interdisciplinary approach to this problem.

The paper isn’t even officially published until tomorrow and it has already been covered on Carl Zimmer’s blog for Discover Magazine, The New Scientist, The Guardian, The Scientist, The New York Times, and MSNBC. Wow.  Weiss & Heeney note a number of questions that are raised by our analysis.  Specifically, they ask “why was the progression to AIDS-like illness not more apparent in chimpanzees in captivity?” My co-author Paul Sharp notes “We need to know much more about whether there are any genetic differences among the chimpanzees, or differences in co-infections with other viruses, bacteria or parasites, which influence whether or not SIV infection leads to illness or death. This presents a unique opportunity to compare and contrast the disease-causing mechanisms of two closely related viruses in two closely related hosts.”  Then, of course, there are the conservation questions that this paper raises.  Chimpanzees in the wild have birth rates that are very nearly balanced by their death rates.  The difference between them, called the intrinsic rate of increase, largely determines the probability of extinction of a small population.  When the rate of increase of a population is negative, it is certain to go extinct (assuming the rate remains negative).  However, even if the intrinsic rate of increase is greater than zero, the randomness that besets small populations still means that a population can go extinct.  So, because their average birth and death rates are so close, individual chimp populations are always in potential jeopardy of going extinct, and Gombe is no exception to this rule. Now we add to a population something that increases mortality rates 10–16 times.  This is bound to have negative consequences for the persistence of affected chimp populations.  This is a topic that we are exploring even as I write…
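As a rough illustration of that extinction logic, here is a small Monte Carlo of demographic stochasticity (my sketch; the vital rates are invented round numbers, not Gombe estimates). Each individual independently reproduces and dies with fixed annual probabilities, so the intrinsic rate of increase is roughly the birth rate minus the death rate:

```python
import random

def extinction_probability(n0=20, birth=0.10, death=0.095, years=200,
                           trials=200, seed=0):
    """Monte Carlo of demographic stochasticity: each year, every individual
    independently reproduces with probability `birth` and dies with
    probability `death`.  Returns the fraction of `trials` in which the
    population hits zero within `years`.  Rates are illustrative only."""
    rng = random.Random(seed)
    extinct = 0
    for _ in range(trials):
        n = n0
        for _ in range(years):
            births = sum(rng.random() < birth for _ in range(n))
            deaths = sum(rng.random() < death for _ in range(n))
            n += births - deaths
            if n <= 0:
                extinct += 1
                break
    return extinct / trials
```

With birth and death rates nearly balanced, a small population already faces a non-trivial chance of extinction by chance alone; raise the effective death rate several-fold, as elevated mortality in a heavily infected population would, and extinction becomes a near-certainty in these runs.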