Category Archives: Human Ecology

An Alternate Course Load for the Game of Life

In a recent editorial in the New York Times, Harvard economist and former chairman of the Council of Economic Advisers N. Gregory Mankiw provides some answers to the question "what kind of foundation is needed to understand and be prepared for the modern economy?"  Presumably, what he means by "modern economy" is life after college.  Professor Mankiw suggests that students of all ages learn something about the following subjects: economics, statistics, finance, and psychology.  I read this with interest, and doing so made me think of my own list, which is rather different from the one offered by Mankiw. I will take up the challenge, making a list of subjects that I think will be useful in an instrumental sense -- i.e., in helping graduates become successful in the world of the twenty-first century. In no way do I mean to suggest that students cannot be successful if they don't follow this plan for, like Mankiw, I agree that students should ignore advice as they see fit. Education is about discovery as much as anything and there is much to one's education that transcends instrumentality -- going to college is not simply about preparing people to enter "the modern economy," even if it is a necessary predicate for success in it.

People should probably know something about economics.  However, I'm not convinced that what most undergraduate students are taught in their introductory economics classes is the most useful thing to learn. Contemporary economics is taught as an axiomatic discipline.  That is, a few foundational axioms (i.e., a set of primitive assumptions that are not proved but considered self-evident and necessary) are presented, and from these, theorems can be derived.  Theorems can then be logically proven by recourse to the axioms or to other already-proven theorems. Note that this is not about explaining the world around us.  It is really an exercise in rigorously defining normative rules for how people should behave and what the consequences of such behavior would be, even if actual people don't follow such prescriptions. Professor Mankiw has written a widely used introductory economics textbook. In the first chapter of this book, we see this axiomatic approach on full display.  We are told not unreasonable things like "People Face Trade-Offs" or "The Cost of Something is What You Give Up to Get It" or "Rational People Think at the Margin." I couldn't agree more with the idea that people face trade-offs, but I nonetheless think there are an awful lot of problematic aspects to these axioms.  Consider the following paragraph (p. 5):

Another trade-off society faces is between efficiency and equality. Efficiency means that society is getting the maximum benefits from its scarce resources. Equality means that those benefits are distributed uniformly among society’s members. In other words, efficiency refers to the size of the economic pie, and equality refers to how the pie is divided into individual slices.

Terms like "efficiency" and "maximum benefits" are presented as unproblematic, as is the idea that there is a necessary trade-off between efficiency and equality.  Because it is an axiom, apparently contemporary economic theory allows no possibility for equality in efficient systems. Inequality is naturalized and thereby legitimized. It seems to me that this should be an empirical question, not an axiom. In his recent book, The Bounds of Reason: Game Theory and the Unification of the Behavioral Sciences, Herb Gintis provides a very interesting discussion of the differences between two highly formalized (i.e., mathematical) disciplines, physics and economics.  Gintis notes, "By contrast [to the graduate text in quantum mechanics], the microeconomics text, despite its beauty, did not contain a single fact in the whole thousand page volume. Rather, the authors build economic theory in axiomatic fashion, making assumptions on the basis of their intuitive plausibility, their incorporation of the 'stylized facts' of everyday life, or their appeal to the principles of rational thought."

If one is going to learn economics, "the study of how society manages its scarce resources" -- and I do believe people should -- I think one should (1) learn about how resources are actually managed by real people and real institutions and (2) learn some theory that focuses on strategic interaction.  A strategic interaction occurs when the best choice a person can make depends upon what others are doing (and vice versa). The formal analysis of strategic interactions is done with game theory, a field typically taught in economics classes but also found in political science, biology, and, yes, even anthropology. Alas, this is generally considered an advanced topic, so you'll have to go through all the axiomatic nonsense to get to the really interesting stuff.
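To give a flavor of what the interesting stuff looks like, here is a minimal sketch (my own toy example, nothing from Mankiw's text): a two-player game in which each player either cooperates or defects, with the code simply enumerating best responses to find the pure-strategy Nash equilibria.

```python
import numpy as np

# Hypothetical 2x2 game (a prisoner's dilemma): row_pay[i, j] is the row
# player's payoff when row plays i and column plays j,
# where 0 = cooperate and 1 = defect.
row_pay = np.array([[3, 0],
                    [5, 1]])
col_pay = row_pay.T  # symmetric game

# A strategy pair (i, j) is a pure-strategy Nash equilibrium when each
# player's choice is a best response to the other's: neither player can
# gain by unilaterally deviating.
equilibria = [
    (i, j)
    for i in range(2)
    for j in range(2)
    if row_pay[i, j] == row_pay[:, j].max()   # row can't do better given j
    and col_pay[i, j] == col_pay[i, :].max()  # column can't do better given i
]
print(equilibria)  # [(1, 1)]: mutual defection, though (0, 0) pays both more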

OK, that was a bit longer than I anticipated. Whew.  On to the other things to learn...

Learn something about sociology. Everyone could benefit from understanding how social structures, power relations, and human stocks and flows shape the socially possible. Understanding that social structure and power asymmetries constrain (or enable) what we can do, and even what we think, is powerful; it lets us ask important questions not only about our own society but also about the societies of the people with whom we sign international treaties, or engage in trade, or wage war. Some of the critical questions that sociology helps us ask include: who benefits by making inequality axiomatic? Does the best-qualified person always get the job? Is teen pregnancy necessarily irrational? Do your economic prospects depend on how many people were born the same year as you were? How does taste reflect one's position in society?

People should definitely learn some statistics. Here, Professor Mankiw and I are in complete agreement.

Learn about people other than those just like you. The fact that we live in an increasingly global world is rapidly becoming the trite fodder of welcome-to-college speeches by presidents, deans, and other dignitaries. Of course, just because it's trite doesn't make it any less true, and despite the best efforts of homogenizing American popular and consumer culture, not everyone thinks or speaks like us or has the same customs or same religion or system of laws or healing or politics. I know; it's strange. One might learn about other people in an anthropology class, say, but there are certainly other options. If anthropology is the chosen route, I would recommend that one choose carefully, making certain that the readings for any candidate anthropology class be made up of ethnographies and not books on continental philosophy. Come to grips with some of the spectacular diversity that characterizes our species. You will be better prepared to live in the world of the twenty-first century.

Take a biology class. If the twentieth century was the century of physics, the twenty-first century is going to be the century of biology.  We have already witnessed a revolution in molecular biology that began around the middle of the twentieth century and continued to accelerate throughout its last decades and into the twenty-first. Genetics is creeping into lots of things our parents would not have even imagined: criminology, law, ethics. Our decisions about our own health and that of our loved ones will increasingly be informed by molecular genetic information. People should probably know a thing or two about DNA. I shudder at popular representations of forensic science and worry about a society that believes what it sees on CSI somehow represents reality. I happen to think that when one takes biology, one should also learn something about organisms, but this isn't always an option if one is going to also learn about DNA.

Finally, learn to write.  Talk about comparative advantage! I am continually blown away by the poor preparation in written English that even elite students receive. If you can express ideas in writing clearly and engagingly, you have a skill that will carry you far. Write as much as you possibly can.  Learn to edit. I think editing is half the problem with elite students -- they write things at the last minute and expect them to be brilliant.  It doesn't work that way. Writing is hard work, and well-written texts are always well-edited.

Mutant Fungus Threatening World Wheat Supplies

A mutant strain of the wheat stem rust fungus, Puccinia graminis f. sp. tritici, has emerged that threatens as much as 60 million tons of world wheat production.  The story of this emergence can be found here.  There is a clearinghouse of information on the Borlaug Global Rust Initiative website. The emergence of such a potentially devastating crop pathogen highlights once again the practical importance of evolutionary biology for understanding major world problems.

Nice Piece on Burning in the Stanford Report

As part of a series of articles on interdisciplinary environmental research at Stanford, the Stanford Report has just published a nice piece on the research on Aboriginal burning in Western Australia led by Rebecca and Doug Bird. This work is supported by a grant from the Woods Institute Environmental Venture Project fund as well as a major grant from the National Science Foundation.  We have a fairly recent paper in PNAS that describes some of the major findings, which I have written about previously here.

We've got some exciting things in the works as a follow-up to this paper thanks to the EVP funding. These include agent-based models of foraging and its effects on landscape development and new statistical methods for characterizing the scale and pattern of burning-induced landscape mosaics.  We're also hoping to move into some comparative work across foraging populations and to expand upon the ecological interactions between human foragers and plant species upon which they depend.

Fold Catastrophe Model

My last post, which I had to cut short, discussed the recent paper by Scheffer et al. (2009) on the early warning signs of impending catastrophe. This paper encapsulates a number of things that I think are very important and that relate to some current research (and teaching) interests. Scheffer and colleagues show what happens to time series of state observations when a dynamical system characterized by a fold bifurcation is forced across its attractor, parts of which are stable and parts of which are unstable.  In my last post, I described the fold catastrophe model as an attractor that looks like a "sideways N." I just wanted to briefly unpack that statement.  First, an attractor is kind of like an equilibrium.  It's a set of points toward which a dynamical system evolves.  When the system is perturbed, it tends to return to an attractor.  Attractors can be fixed points or cycles or extremely complex shapes, depending upon the particulars of the system.

The fold catastrophe model posits an attractor that looks like this figure, which I have more or less re-created from Scheffer et al. (2009), Box 1.

The solid parts of the curve are stable -- when the system state is perturbed in the vicinity of this part of the attractor, it tends to return, as indicated by the grey arrows pointing back to the attractor.  The dashed part of the attractor is unstable -- perturbations in this neighborhood tend to move the system away from the attractor.  This graphical representation makes it pretty easy to see how a small perturbation could dramatically change the system if the current combination of conditions and system state place the system on the attractor near the neighborhood where the attractor changes from stable to unstable.  The figure illustrates one such scenario.  The conditions/system state start at point F1. A small forcing perturbs the system off this point, across the bifurcation.  Further forcing now moves the system far off the current state to some new, distant, stable state.  We go from a very high value of the system state to a very low value with only a very small change in conditions.  Indeed, in this figure, the conditions remain constant from point F1 to the new value indicated by the white point -- just a brief perturbation was sufficient to cause the drastic change.  I guess this is part of the definition of a catastrophe.
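For the mathematically curious, here is a minimal sketch of these dynamics using the canonical normal form of a fold bifurcation, dx/dt = c + x - x^3 (my illustrative toy model, not the specific systems analyzed by Scheffer et al.). Slowly ramping the conditions c past the fold produces exactly the drastic jump described above.

```python
import numpy as np

# Toy fold (saddle-node) bifurcation: dx/dt = c + x - x**3.
# The equilibria c = x**3 - x trace out the "sideways N": upper and lower
# branches are stable, the middle branch is unstable.
def simulate(c_values, x0, dt=0.01, steps_per_c=500):
    x = x0
    trace = []
    for c in c_values:
        for _ in range(steps_per_c):  # let the state track the attractor
            x += dt * (c + x - x**3)
        trace.append(x)
    return np.array(trace)

# Slowly ramp the conditions c downward past the fold at c = -2/(3*sqrt(3))
c_values = np.linspace(0.4, -0.6, 200)
states = simulate(c_values, x0=1.2)

# The state tracks the upper branch until the fold, then drops to the lower
# branch -- a drastic change in state from a tiny change in conditions.
fold_c = -2 / (3 * np.sqrt(3))
print(f"fold at c ~ {fold_c:.3f}; state just before: "
      f"{states[c_values > fold_c][-1]:.2f}, state at end: {states[-1]:.2f}")
```

Conveniently, this little model also exhibits the critical slowing down near the fold that Scheffer et al. propose as an early warning signal.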

The real question in my mind, and one that others have asked, is how relevant is the fold catastrophe model for real systems?  This is something I'm going to have to think about. One thing that is certain is that this is a pedagogically very useful approach as it makes you think... and worry.

Stanford Workshop in Biodemography

On 29-31 October, we will be holding our next installment of the Stanford Workshops in Formal Demography and Biodemography, the result of an ongoing grant from NICHD to Shripad Tuljapurkar and me.  This time around, we will venture onto the bleeding edge of biodemography.  Specific topics that we will cover include:

  • The use of genomic information on population samples
  • How demographers and biologists use longitudinal data
  • The use of quantitative genetic approaches to study demographic questions
  • How demographers and biologists model life histories

Information on the workshop, including information on how to apply for the workshop and a tentative schedule, can be found on the IRiSS website. We've got an incredible line-up of international scholars in demography, ecology, evolutionary biology, and genetics coming to give research presentations.

The workshop is intended for advanced graduate students (particularly students associated with NICHD-supported Population Centers), post-docs, and junior faculty who want to learn about the synergies between ecology, evolutionary biology, and demography. Get your applications in soon -- these things fill up fast!

Uncertainty and Fat Tails

A major challenge in science writing is how to effectively communicate real, scientific uncertainty.  Sometimes we just don't have enough information to make accurate predictions.  This is particularly problematic in the case of rare events in which the potential range of outcomes is highly variable. Two topics that are close to my heart come to mind immediately as examples of this problem: (1) understanding the consequences of global warming and (2) predicting the outcome of the emerging A(H1N1) "swine flu" influenza A virus.

Harvard economist Martin Weitzman has written about the economics of catastrophic climate change (something I have discussed before).  When you want to calculate the expected cost or benefit of some fundamentally uncertain event, you basically take the probabilities of the different outcomes, multiply them by the utilities (or disutilities), and then sum them.  This gives you the expected value across your range of uncertainty.  Weitzman has noted that we have a profound amount of structural uncertainty (i.e., there is little we can do to become more certain on some of the central issues) regarding climate change.  He argues that this creates "fat-tailed" distributions of the climatic outcomes (i.e., the disutilities in question).  That is, the probability of extreme outcomes (read: the end of the world as we know it), while low, isn't as low as would make us comfortable.
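In symbols (my notation, not Weitzman's): if outcome i occurs with probability p_i and carries utility u_i, the expected utility is

 E[U] = \sum_i p_i u_i.

Weitzman's point is that for climate change, the p_i attached to the catastrophic outcomes are themselves so uncertain that this sum can end up dominated by its tail.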

A very similar set of circumstances besets predicting the severity of the current outbreak of swine flu.  There is a distribution of possible outcomes.  Some have high probability; some have low.  Some are really bad; some less so.  When we plan public health and other logistical responses, we need to be prepared for the extreme events that, while unlikely, are far from impossible.

So we have some range of outcomes (e.g., the number of degrees C that the planet warms in the next 100 years or the number of people who become infected with swine flu in the next year) and we have a measure of probability associated with each possible value in this range. Some outcomes are more likely and some are less.  Rare events are, by definition, unlikely but they are not impossible.  In fact, given enough time, most rare events are inevitable.  From a predictive standpoint, the problem with rare events is that they're, well, rare.  Since you don't see rare events very often, it's hard to say with any certainty how likely they actually are.  It is this uncertainty that fattens up the tails of our probability distributions.  Say there are two rare events.  One has a probability of 10^{-6} and the other has a probability of 10^{-9}. The latter is certainly much more rare than the former. You are nonetheless very, very unlikely to ever witness either event, so how can you make any judgment that the one is 1000 times more likely than the other?
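To make this concrete, consider a small numerical sketch (hypothetical probabilities, computed with scipy): even in a million independent trials, an event with probability 10^{-6} will, about a third of the time, never be observed at all, and an event with probability 10^{-9} will essentially never be observed, so the data give us almost no purchase on their relative likelihoods.

```python
from scipy import stats

# Probability of observing zero occurrences in a million independent trials
n = 1_000_000
for p in (1e-6, 1e-9):
    p_zero = stats.binom.pmf(0, n, p)
    print(f"p = {p:.0e}: P(no events in {n} trials) = {p_zero:.3f}")
# p = 1e-06: P(no events in 1000000 trials) = 0.368
# p = 1e-09: P(no events in 1000000 trials) = 0.999
```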

Say we have a variable that is normally distributed.  This is the canonical and ubiquitous bell-shaped distribution that arises when many independent factors contribute to the outcome. It's not necessarily the best distribution to model the type of outcomes we are interested in, but it has the tremendous advantage of familiarity. The normal distribution has two parameters: the mean (\mu) and the standard deviation (\sigma).  If we know \mu and \sigma exactly, then we know lots of things about the value of the next observation.  For instance, we know that the most likely value is actually \mu and we can be 95% certain that the value will fall between about \mu - 1.96\sigma and \mu + 1.96\sigma.

Of course, in real scientific applications we almost never know the parameters of a distribution with certainty.  What happens to our prediction when we are uncertain about the parameters? Given some set of data that we have collected (call it y) and from which we can estimate our two normal parameters \mu and \sigma, we want to predict the value of some as-yet unobserved data (which we call \tilde{y}).  We can predict the value of \tilde{y} using a device known as the posterior predictive distribution.  Essentially, we average our best estimates across all the uncertainty that we have in our data. We can write this as

 p(\tilde{y}|y) = \int \int p(\tilde{y}|\mu,\sigma) \, p(\mu,\sigma|y) \, d\mu \, d\sigma.


OK, what does that mean? p(\tilde{y}|\mu,\sigma) is the probability of the new observation, given the values of the two parameters; it has the same form as the likelihood of the data. p(\mu,\sigma|y) is the posterior probability of the two parameters given the observed data.  The two integrals mean that we are averaging the product p(\tilde{y}|\mu,\sigma)p(\mu,\sigma|y) across the range of uncertainty in our two parameters (in statistical parlance, the "integrating" here amounts to averaging).
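For readers who want to see the machinery turn over, here is a minimal Monte Carlo sketch of this averaging (my illustrative setup: simulated data and the standard noninformative prior p(\mu,\sigma^2) \propto 1/\sigma^2, for which the posterior has a convenient closed form): draw the parameters from their posterior, then draw \tilde{y} given those draws, and the parameter uncertainty gets averaged over automatically.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Hypothetical observed data: n = 20 draws from a standard normal
y = rng.normal(size=20)
n, ybar, s2 = len(y), y.mean(), y.var(ddof=1)

# Monte Carlo draws from the posterior predictive under the noninformative
# prior: sigma^2 | y is scaled inverse-chi^2, mu | sigma^2, y is normal,
# and y_tilde | mu, sigma is normal.
sims = 100_000
sigma2 = (n - 1) * s2 / rng.chisquare(n - 1, size=sims)
mu = rng.normal(ybar, np.sqrt(sigma2 / n))
y_tilde = rng.normal(mu, np.sqrt(sigma2))

# The closed-form answer is a t distribution with n-1 degrees of freedom,
# centered at ybar, with scale s * sqrt(1 + 1/n)
scale = np.sqrt(s2 * (1 + 1 / n))
print(np.quantile(y_tilde, [0.025, 0.975]))
print(stats.t.ppf([0.025, 0.975], df=n - 1, loc=ybar, scale=scale))
```

The empirical quantiles of the simulated \tilde{y} match the closed-form t-distribution result discussed next.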

If you've hummed your way through these last couple paragraphs, no worries.  What really matters are the consequences of this averaging.

When we do this for a normal distribution with unknown standard deviation, it turns out that we get a t-distribution.  t-distributions are characterized by "fat tails." This doesn't mean they look like this. What it means is that unlikely events aren't as unlikely as we might be comfortable with.  The probability in the tail(s) of the distribution approaches zero more slowly than an exponential decay.  This means that there is appreciable probability on very extreme events. Here I plot a standard normal distribution in the solid line and a t-distribution with 2 (dashed) and 20 (dotted) degrees of freedom.

Standard normal (solid) and t distributions with 2 (dashed) and 20 (dotted) df.

We can see that the dashed and dotted curves have much higher probabilities at the extreme values.  Remember that 95% of standard normal observations will fall between -1.96 and 1.96, whereas the dashed line is still pretty high for outcome values beyond 4.  In fact, for the dashed curve, 95% of the values fall between -4.3 and 4.3. In all fairness, this is a pretty uncertain distribution, but you can see the same thing with the dotted line (where the central 95% interval is plus/minus 2.09).  Unfortunately, when we are faced with the types of structural uncertainty we have in events of interest like the outcome of global climate change or an emerging epidemic, our predictive distributions are going to be more like the very fat-tailed distribution represented by the dashed line.
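Those interval endpoints are easy to check (a quick scipy verification of the numbers quoted above):

```python
from scipy import stats

# Central 95% interval endpoints for the distributions in the figure
print(stats.norm.ppf(0.975))     # ~1.96  (standard normal)
print(stats.t.ppf(0.975, df=2))  # ~4.30  (very fat-tailed)
print(stats.t.ppf(0.975, df=20)) # ~2.09  (much closer to the normal)
```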

As scientists with an interest in policy, how do we communicate this type of uncertainty? It is a very difficult question.  The good news about the current outbreak of swine flu is that it seems to be fizzling in the northern hemisphere. Despite the rapid spread of the novel flu strain, sustained person-to-person transmission is not occurring in most parts of the northern hemisphere. This is not surprising since we are already past flu season.  However, as I wrote yesterday, it seems well within the realm of possibility that the southern hemisphere will be slammed by this flu during the austral winter and that it will come right back here in the north with the start of our own flu season next winter.  What I worry about is that all the hype followed by a modest outbreak in the short-term will cause people to become inured to public health warnings and predictions of potentially dire outcomes. I don't suppose that it will occur to people that the public health measures undertaken to control this current outbreak actually worked (fingers crossed).  I think this might be a slightly different issue in the communication of science but it is clearly tied up in this fundamental problem of how to communicate uncertainty.  Lots to think about, but maybe I should get back to actually analyzing the volumes of data we have gathered from our survey!

A Sign of the Times

Every time I go by the Stanford Shopping Center -- which is a truly absurd place, I should add -- I am reminded of an event that seems like an apt metaphor for the economic melt-down, the consequences of which we are only beginning to understand. I am, of course, talking about the replacement of Long Life Noodle House with Sprinkles Cupcakes.  Now, don't get me wrong.  Long Life Noodle House was a completely mediocre restaurant.  But it served real food.  You could go there, say, for dinner.  We did this on a regular basis, not because of its outstanding food, but because it was close, convenient, relatively inexpensive, and (given judicious choices) offered nutritious fare.  It also seems relevant to note that it was typically quite busy; it hardly seems like they were lacking for business. One night we went there after the kids' swim practice only to find it abruptly closed.  Within a month or so, the restaurant was replaced by this more than slightly ridiculous confectioner. 

Stanford Shopping Center replaced a restaurant that served real food with one that serves frivolous little confections, just as the United States substituted financial gimmickry for innovation and production.

On Freeman Dyson's Climate-Change Skepticism

A nice piece by Nicholas Dawidoff in the New York Times Magazine this week details the eminent physicist Freeman Dyson's skepticism about the dangers of global warming. It seems that Mr. Dyson is concerned about the quality of the science that underlies the current scientific consensus about its perils.

One gathers from reading the Dawidoff piece that the major criticism Dyson levies against climate science concerns the computer models of Earth's climate that provide much of the information we have about how Earth will respond to increased atmospheric concentrations of carbon dioxide and other greenhouse gases (these increases are a fact that is not in dispute). The rub of planetary science is that planet-scale experiments are (for now) impossible (I think we have a way to go before the musings of Kim Stanley Robinson or Kevin J. Anderson come to pass). Our power to understand planetary processes is constrained by our N=1. There is only one Earth. It could be argued that our N is actually closer to three when you throw Venus and Mars into the mix, but the fact remains, our sample size of known planets is pretty small. My U Penn colleague David Gibson made an observation at the NAS/CNRS Frontiers of Science conference last November that studying planets is kind of like studying revolutions. I think some of the physical scientists in the room were scandalized by this vulgar analogy, but I (and the other token social scientist in the room) think he made a terrific observation. In both cases, we have a very small number of relatively well-understood and, for all we know, completely eccentric cases, and the poverty of this sample makes generalization highly problematic.

So what are our options for studying Earth's climate other than computer models? I wholeheartedly agree that science is jeopardized whenever the scientist falls in love with his or her model. But there are, in fact, lots of models and these models are run by lots of independent groups emphasizing different aspects of the global circulation system in their particular specifications. It's almost like science, actually. When one model makes an outlandish prediction, I don't pay much attention. When all the models make that same outlandish prediction, I pay attention to it, no matter how crazy it might be. Note that this does not mean it's correct. It does mean that the result merits attention.

Mr. Dyson, it seems, thinks that global warming is a good thing. Increased atmospheric concentration of CO2 will increase plant productivity. At the very least, all we would need to do to ameliorate the putative negative effects of increased CO2 is plant lots (and lots) of super carbon-scrubbing trees (which apparently are just waiting to be genetically engineered). There are quite a few problems with this proposal. First, it is actually not completely clear that increases in CO2 will globally increase plant productivity. Ask a plant ecologist and she will tell you that there are things other than CO2 that limit plant growth (e.g., water, nitrogen, phosphorus). Then there is the fact that ecological enrichment experiments very frequently lead to decreases in biodiversity. Plants that are very good competitors for a particular resource thrive at the expense of plants that are not good competitors for that resource (but might be superior along other dimensions). There are lots of other issues that complicate the seemingly simple relationship between CO2 concentration and productivity, such as increases in ground-level ozone, ocean acidification, and the fact that increased temperatures can reduce production independent of CO2 concentration. For someone who is so critical of sloppy science, it seems that Mr. Dyson needs to bone up a bit on his physiological ecology.

Dawidoff quotes Dyson as saying that 'Most of the evolution of life occurred on a planet substantially warmer than it is now.' Of course, the rub is that humans evolved on a cool planet. Many of the major events that have characterized the evolution of our species are thought to have involved cooling and drying (e.g., see the work of Steven Stanley or Elizabeth Vrba). What brought the first hominins out of the forest to walk bipedally across the entire planet? Probably climatic cooling and drying, which broke the tropical forests of Africa up into savanna mosaics. There is a very real sense in which humans are the cold-adapted ape.  I have little doubt that life of some sort will continue even in the most nightmarish of climate-change scenarios. The more parochial question that I think most people care about is: what about human life? An important addendum to this question is: what about the life that we care about?

I applaud Dyson's contempt for orthodoxy, and I admit a dis-ease that I feel among global-warming zealots. The problem with this particular windmill that he has chosen to tilt at is that there are powerful economic and political interests that seek to subvert whatever good science is done in global change research for their own ends. Dyson ends up abetting the disinformationists and thereby supporting a much deeper orthodoxy than that of the marginalized community of scientists. This deeper orthodoxy is, of course, the neoliberal ideology that market forces are always preferred to scientifically-informed regulation, pecuniary reward always trumps gains in any other value system, growth-above-all, lie back and think of mother England, etc.

I think that zealotry is spawned by the difficulty of being taken seriously, especially when truth is, well, inconvenient. The loud and persistent mouths of activists are what keep ideas in the public consciousness. Global warming and its consequences are of the sort of scale that they are all too easily ignored. But I fear (and many other scientists share this fear) that we ignore the problem at our peril.

Speaking as someone who typically has an infantile response to group-think, my guess is that Dyson hangs around with a select crowd. In places like Princeton, NJ or Cambridge, MA or Palo Alto, CA, it's easy to get the impression that everyone is a raving environmentalist (or at least wants others to think they are -- a subject for a later post). I am reminded of the probably apocryphal (but so canny) story of the befuddled Democrat (Hollywood screen-writer, Manhattan socialite, Cambridge intellectual -- I've heard versions using each), incredulous that Nixon could have won the 1972 election in a landslide, who uttered the immortal line, "but everyone I know voted for McGovern!" It's all too easy in university towns like these to lose track of the fact that most people don't really give a damn about global warming (or, while we're at it, poverty, nuclear proliferation or science) and won't until it has an undeniable impact on their lives. To see that acute concern over the impacts of global warming is not really part of some grand orthodoxy, perhaps Mr. Dyson should spend some time at the Heritage Foundation, Cato Institute, or, for that matter, just about any town in the United States besides Princeton!

A big part of Dyson's critique, it seems, is that we don't have enough information. Given the intractability of global experiments and a general discontent with general circulation models, we are going to need to live with a considerable amount of uncertainty. Harvard economist Martin Weitzman has written a very thought-provoking (and technically demanding) paper on the subject of cost-benefit analysis in the context of global climate change. He refers to the climate change situation as one characterized by "deep structural uncertainty."  In a follow-up paper (in which he responds to criticisms from Yale economist William Nordhaus), Weitzman makes the astute observation that inductive science is of limited utility when the object of study is an extremely rare event.  The world has not seen atmospheric concentrations of CO2 like those we will soon see for a very long time (at least 800,000 years), and we really know very little about such a world. It is very, very difficult to scientifically study extremely rare events.  This is the basis of our deep structural uncertainty and the reason that Mr. Dyson's plea for gathering more data is unlikely to help all that much with decision-making.

Weitzman further notes that the most severely negative outcomes of global warming are unlikely.  Unfortunately, our structural uncertainty over the likely course of atmospheric greenhouse gas accumulation, the functional response of the global climate to this accumulation, and the parameters of the different models of climate change means that these unlikely events are less unlikely than they would be if we knew more.  Uncertainty compounds.  (This probably merits its own blog posting but Spring Break is nearly over...) The probability distribution of future outcomes is "fat-tailed." This means that the probability of truly catastrophic outcomes is not trivial.  A thin-tailed distribution means that extreme events are possible but only vanishingly probable; a fat-tailed distribution means that unlikely events are more likely than we might be comfortable with. Weitzman concludes that, given the fat tail of the outcomes-of-global-warming distribution, a sensible cost-benefit analysis favors strong action to mitigate the future effects of this looming problem.
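To put some illustrative numbers on the thin-tail/fat-tail contrast (a quick sketch using a t-distribution with 2 degrees of freedom as a stand-in for a fat-tailed predictive distribution):

```python
from scipy import stats

# Probability of exceeding increasingly extreme thresholds: the normal
# tail collapses super-exponentially; the t (2 df) tail only polynomially.
for x in (2, 4, 8, 16):
    print(f"x={x:2d}: normal {stats.norm.sf(x):.2e}, "
          f"t(2 df) {stats.t.sf(x, df=2):.2e}")
```

The normal tail goes to zero almost immediately; the fat tail just will not go away -- and it is the tail that carries the catastrophic outcomes in Weitzman's analysis.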

It's such a shame that a man of science of the stature of Freeman Dyson is spending his time (apparently) unwittingly abetting the cause of anti-science and the neoliberal status quo. In contrast, I find Weitzman's perspective very sensible indeed.  When we put our egos aside, we have to acknowledge that there is a huge amount of -- probably intractable -- uncertainty surrounding the future of global warming.  When there is a small (but non-trivial) probability of a catastrophic event, does it not seem prudent to take steps to avoid catastrophe?