Tag Archives: science

That's How Science Works

The RealClimate blog has a very astute entry on the controversy surrounding the recent report in the prestigious journal Science that bacteria living in the arsenic-rich waters of Mono Lake in California can substitute arsenic for phosphorus in their DNA.  If true, this would be a major finding because it expands the range of environments in which we could conceivably find extraterrestrial life.  In effect, this result would suggest a wider range of building blocks for life.  Pretty heavy stuff. Now, I am way out of my depth on this topic, but it sounds like the paper published in Science suffers from some fairly serious problems. Some of the problems noted by experts in the field have been assembled by Carl Zimmer on his blog.  Carl also provides a pithy treatment of the controversy in an article at Slate.com. John Roach has a similarly excellent review of the controversy, including what we learn about science from it, on his Cosmic Log blog.

Regardless of the scientific merits of this work, this episode is actually an instructive example of the way that science works. As the RealClimate folks write,

The arseno-DNA episode has displayed this process in full public view. If anything, this incident has demonstrated the credibility of scientists, and should promote public confidence in the scientific establishment.

The post then goes on to list three important lessons we can draw from this whole incident:

  1. “Major funding agencies willingly back studies challenging scientific consensus.” It helps if the challenge to scientific consensus is motivated by carefully reasoned theoretical challenges or, even better, data that challenge the consensus.  Some yahoo saying that evolution is “just a theory” or that climate change isn’t real because it was really cold last winter isn’t enough. In the case of arseno-DNA, as Carl Zimmer notes, the National Academy of Sciences published a report in 2007 that suggested the theoretical possibility of arsenic-based biology.  Carl also notes that some of the authors of this report are highly critical of the Science paper. The report challenged the orthodoxy that phosphate was a necessary building block of DNA, and the report’s authors later called out NASA (the major funding source for this kind of extreme biology) for publishing sloppy science.  Lots of orthodoxy being challenged here…
  2. “Most everyone would be thrilled to overturn the consensus. Doing so successfully can be a career-making result. Journals such as Science and Nature are more than willing to publish results that overturn scientific consensus, even if data are preliminary – and funding agencies are willing to promote these results.” Scientists have enormous individual and institutional incentives to overturn orthodoxies if it is within their power. You become a star when you pull this feat off. And you had better believe that every funding agency out there would like to take credit for funding the critical research that helped overturn a fundamental scientific paradigm.
  3. “Scientists offer opinions based on their scientific knowledge and a critical interpretation of data. Scientists willingly critique what they think might be flawed or unsubstantiated science, because their credibility – not their funding – is on the line.” As a scientist, you have to do this if you are going to be taken seriously by your peers — you know, the ones who do all that peer review that climate deniers, for example, seem to get their collective panties in a wad about?

The RealClimate piece summarizes by noting:

This is the key lesson to take from this incident, and it applies to all scientific disciplines: peer-review continues after publication. Challenges to consensus are seriously entertained – and are accepted when supported by rigorous data. Poorly substantiated studies may inspire further study, but will be scientifically criticized without concern for funding opportunities. Scientists are not “afraid to lose their grant money”.

Read the RealClimate post to get the full story. Obviously, these authors (who do excellent science and amazing public education work, a rare combination) are interested in what this controversy has to say about accusations of bias in climate science — check out the RealClimate archives for some back-story on this. However, the post is so much more broadly applicable, as they note in the quote above. Science is not a monolithic body of information; it is a process, a system designed to produce positive (as opposed to normative) statements about the world around us. When it works correctly, science is indifferent to politics or the personal motivations of individual scientists because results get replicated.  Everything about a scientific paper is designed to allow other researchers to replicate the results that are presented in that paper.  If other researchers can’t replicate some group’s findings, those findings become suspect (and get increasingly so as more attempts to replicate fail).

So what does this mean for Anthropology as a science? You may remember that there has been some at times shrill “discussion” (as well as some genuine intellectual discussion) about the place for science in Anthropology and the American Anthropological Association in particular. For me, replicability is a sine qua non of science. The nature of much anthropological research, particularly research in cultural anthropology, makes the question of replication challenging. When you observe some group of people behaving in a particular way in a particular place at a particular time, who is to say otherwise? I don’t claim to have easy answers here, but there are a few things we can do to ensure the quality of our science.

First, we need to have scientific theories that are sufficiently robust that they can generate testable predictions that transcend the particularities of time and place. Results generated in one population/place/time can then be challenged by testing in other populations/places/times. Of course, it is of the utmost importance that we try to understand how the differences in population and place and time will change the results, but this is what our research is really about, right?  When we control for these differences, do we still see the expected results?

Second, we need to be scrupulous in our documentation of our results and the methods we employ to generate these results.  You know, like science? It’s never easy to read someone else’s lab notebook, but we need to be able to do this in anthropology, at least in principle.  Going back to the raw data as they are recorded in a lab notebook or its equivalent is probably the primary means through which scientific fraud is discovered. Of course, there are other benefits to scrupulously kept field notes as well.  They serve as a rich foundation for future research by the investigator, for instance.

Third, we need to be willing to share our data. This is expected in the natural sciences (in fact, it is a condition for publication in journals like Science and Nature) and it should be in Anthropology as well.

I think that the points of the RealClimate post all apply to anthropology as well. Surrounding the latest brouhaha on science in anthropology, one hears a lot of grousing about various cartels (e.g., the AAA Executive Board, the editorial boards of various journals, etc.) that keep anthropologists of different stripes (yes, it happens on both sides) from receiving grants or getting published or being invited to serve on various boards, etc. Speaking from my experience as both panelist and applicant, I can confidently say that the National Science Foundation’s Cultural Anthropology Program funds good cultural anthropology of a variety of different approaches (there are also other BCS programs that entertain, and sometimes fund, applications from anthropologists) and the panel will happily fund orthodoxy-busting proposals if they are sufficiently meritorious.  The editorial position of American Ethnologist not in line with your type of research?  If you’ve done good science, there are lots of general science journals that will gladly take interesting and important anthropology papers (and, might I add, have much higher impact factors). I co-authored a paper with Rebecca and Doug Bird that appeared in PNAS not too long ago. Steve Lansing has also had a couple of nice papers in PNAS, as have Richard McElreath, Herman Pontzer, and a bunch of other anthropologists!  Mike Gurven at UCSB has had some luck getting papers into Proceedings of the Royal Society B.  Mhairi Gibson and Ruth Mace have papers in Biology Letters and PLoS Medicine.  Rebecca Sear has various papers in Proceedings of the Royal Society B. Monique Borgerhoff Mulder and a boat-load of other anthropologists (and at least one economist) have a paper in Science. Ruth Mace has papers in most of these journals as well as at least one in Science. Rob Boyd, Richard McElreath, Joe Henrich, and I even have papers about human social behavior, culture, etc. in theoretical biology journals such as Theoretical Population Biology and the Journal of Theoretical Biology. There’s lots more.  As with my previous post, this is a total convenience sample of work with which I am already familiar. The point is that there are outlets for good scientific anthropology out there even if people like me are unlikely to publish in journals like PoLAR.

So, I’m sanguine about the process of science and the continuing ability for anthropologists to pursue science. My winter break is drawing to a close and I’m going to try to continue some of this myself!

Risk-Aversion and Finishing One's Dissertation

It’s that time of the year again, it seems, when I have lots of students writing proposals to submit to NSF to fund their graduate education or dissertation research.  This always sets me to thinking about the practice of science and how one goes about being a successful scientist. I’ve written about “productive stupidity” before, and I still think that it is very important. Before I had a blog, I composed a series of notes on how to write a successful NSF Doctoral Dissertation Improvement Grant, after seeing the same mistakes over and over again while sitting on the Cultural Anthropology panel.

This year, I’ve found myself thinking a lot about what Craig Loehle dubbed “the Medawar Zone.” This is a nod to the great British scientist Sir Peter Medawar, whose book, The Art of the Soluble: Creativity and Originality in Science, argued that the best kinds of scientific problems are those that can be solved.  In his classic (1990) paper, Loehle argues that “there is a general parabolic relationship between the difficulty of a problem and its likely payoff.” Re-reading this paper got me thinking.

In Loehle’s figure 1, he defines the Medawar Zone.  I have reproduced a sketch of the Medawar Zone here.

Now, what occurred to me on this most recent reading of the paper is that for the net payoff curve to look like this, the benefits of increased problem difficulty are almost certainly concave.  That is, they show diminishing marginal returns to increased difficulty.  It is hard to say what the cost curve with difficulty would be – linear? convex? Either way, there is an intermediate maximum (akin to Gadgil and Bossert’s analysis of intermediate levels of reproductive effort) and the best plan is to pick a problem of intermediate difficulty because that is where the scientific benefits, net of the costs, are maximized.
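This logic is easy to sketch numerically. The functional forms below are purely illustrative assumptions (a square-root benefit curve and a linear cost curve), not Loehle’s actual curves; the point is only that any concave benefit minus an increasing cost produces an interior, “Medawar Zone” maximum:

```python
import numpy as np

# Illustrative sketch: concave benefit (diminishing marginal returns to
# problem difficulty) minus a linear cost yields an interior maximum.
# Functional forms and coefficients are invented for demonstration.
difficulty = np.linspace(0.01, 10, 1000)
benefit = np.sqrt(difficulty)   # concave: diminishing returns to difficulty
cost = 0.25 * difficulty        # assumed linear cost of tackling harder problems
net_payoff = benefit - cost

# Analytically, sqrt(d) - 0.25*d peaks where 1/(2*sqrt(d)) = 0.25, i.e. d = 4.
best = difficulty[np.argmax(net_payoff)]
print(f"Net payoff is maximized at an intermediate difficulty d = {best:.2f}")
```

With these assumed curves the optimum falls at d = 4, comfortably between the trivial and the impossible, which is exactly the parabolic difficulty-payoff relationship Loehle describes.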

Suppose that a dissertation is a risky endeavor.  This is not hard for me to suppose, since I know many people from my grad school days who had at least one failed dissertation project.  Sometimes this led to choosing another, typically less ambitious, project.  Sometimes it led to an exit from grad school, sans Ph.D.  Stanford (like Harvard now, but not when I was a student) funds its Ph.D. students for effectively the entirety of their Ph.D.  This is a great thing for students because nothing interferes with your ability to think and be intellectually productive more than worrying about how you’re going to pay rent.  The downside of this generous funding is that students do not have much time to come up with an interesting dissertation project, write grants, go to the field, collect data, and write up before their funding runs out. So, writing a dissertation is risky.  There is always a chance that if you pick too hard a problem, you might not finish in time and your funding will run out. Well, it just so happens that the combination of a concave utility function and a risk of failure is pretty much the definition of a risk-averse decision-maker.

Say there is an average degree of difficulty in a field.  A student can choose to work on a topic that is more challenging than the average but there is the very real chance that such a project will fail and in order for the student to finish the Ph.D., she will have to quickly complete work on a problem that is easier than the average.  Because the payoff curve with difficulty is concave, it means that the amount you lose relative to the mean if you fail is much greater than the amount you gain relative to the mean if you succeed.  That is, your downside cost is much greater than your upside benefit.

In the figure, note that d1 >> d2.  Here, I have labeled the ordinate as w, which is the population-genetics convention for fitness (i.e., the payoff).  The bar-x is the mean difficulty, while x2 and x1 are the high- and low-difficulty projects, respectively.

The way that economists typically think about risk-aversion is that a risk-averse agent is one who is willing to pay a premium for certainty.  This certainty premium is depicted by the dotted line stretching back horizontally from the vertical dashed line at x=xbar to the utility curve.  The certain payoff the agent is willing to accept vs. the uncertain mean is where this dotted line hits the utility curve. Being at this point on the utility curve (where you have paid the certainty premium) probably puts you at the lower end of the Medawar Zone envelope, but hopefully, you’re still in it.
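A minimal numerical sketch of the certainty premium, using an assumed concave payoff w(x) = sqrt(x) and made-up numbers (none of these values come from the figure): a gamble between an ambitious project that pays well and an easy fallback has a certainty equivalent below its expected payoff, and the gap is the premium a risk-averse student will pay for a sure thing.

```python
import math

def w(x):
    """Assumed concave payoff (utility) function; sqrt is illustrative only."""
    return math.sqrt(x)

x_low, x_high = 1.0, 9.0   # fallback-project payoff vs. ambitious-project payoff
p = 0.5                    # assumed chance the ambitious project succeeds

expected_payoff = p * x_high + (1 - p) * x_low          # E[x] = 5.0
expected_utility = p * w(x_high) + (1 - p) * w(x_low)   # E[w(x)] = 2.0

# Certainty equivalent: the sure payoff yielding the same utility,
# found by inverting w (here w^-1(u) = u**2).
certainty_equivalent = expected_utility ** 2            # = 4.0
risk_premium = expected_payoff - certainty_equivalent   # = 1.0

print(f"E[x] = {expected_payoff}, certainty equivalent = {certainty_equivalent}, "
      f"premium = {risk_premium}")
```

This is just Jensen’s inequality at work: for any concave w, E[w(x)] < w(E[x]), so the certainty equivalent always sits below the mean payoff, and the student rationally accepts a less ambitious but surer project.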

I think that this very standard analysis actually provides the graduate student with pretty good advice. Pick a project you can do and maybe be a bit conservative.  The Ph.D. isn’t a career – it’s a launching point for a career. The best dissertation, after all, is a done dissertation.  While I think this is sensible advice for just about anyone working on a Ph.D., the thought of science progressing in such a conservative manner frankly gives me chills.  Talk about a recipe for normal science!  It seems what we need, institutionally, is a period in which conservatism is not the best option. This may just be the post-doc period.  For me, my time at the University of Washington (CSSS and CSDE) was a period when I had unmitigated freedom to explore methods relevant to what I was hired to do.  I learned more in two years than in – I’d rather not say how many – years of graduate school. The very prestigious post-doctoral programs such as the Miller Fellowships at Berkeley or the Society of Fellows at Harvard or Michigan seem like they are specifically designed to provide the environment where the concavity of the difficulty-payoff curve is reversed (favoring gambles on more difficult projects).

There is, unfortunately, a folklore that has diffused to me through graduate student networks that says that anthropologists need to get a faculty position straight out of their Ph.D. or they will never succeed professionally.  This is just the sort of received wisdom that makes my skin crawl and, I fear, is far too common in our field.  If our hurried-through Ph.D.s can’t take the time to take risks, when can we ever expect them to do great work and solve truly difficult problems?

More on Science in the Obama Times

As a follow-up to my post on science and the Obama Inaugural, I wanted to note a terrific essay by Dennis Overbye on the civic virtues of science in the New York Times. He argues that virtue emerges from the process of science: “Science is not a monument of received Truth but something that people do to look for truth.”  Continuing, he writes,

That endeavor, which has transformed the world in the last few centuries, does indeed teach values. Those values, among others, are honesty, doubt, respect for evidence, openness, accountability and tolerance and indeed hunger for opposing points of view. These are the unabashedly pragmatic working principles that guide the buzzing, testing, poking, probing, argumentative, gossiping, gadgety, joking, dreaming and tendentious cloud of activity — the writer and biologist Lewis Thomas once likened it to an anthill — that is slowly and thoroughly penetrating every nook and cranny of the world.

There is a certain egalitarian, round-table ethos to science done well.  It doesn’t matter what degrees you have or where from.  What matters is whether you ask and answer interesting questions. Of course, institutions that support science frequently care about degrees and where they’re from, but in my experience, good scientists don’t. While there are certainly barriers to entry (e.g., the cost of higher education, the difficulty of mastering a subject), there is no fundamentally esoteric knowledge in science.  When it’s working right, everything is transparent.  It has to be because no one will believe you unless it can be repeated.

I certainly hope the rhetoric of respect for science and the idea that empirical research will inform policy continues and gets translated into tangible support for research in the coming years.

Data, Statistics, Science, Imagination and Common Purpose

In President Obama’s Inaugural Address, “data” and “statistics” were the 247th and 249th words spoken. Science was very much foregrounded in the President’s address:

We will restore science to its rightful place and wield technology’s wonders to raise health care’s quality and lower its costs.

We will harness the sun and the winds and the soil to fuel our cars and run our factories. And we will transform our schools and colleges and universities to meet the demands of a new age.

All this we can do. All this we will do.

There is tremendous congruence between this stated respect for science and the somber chastisement over our collective “failure to make hard choices and prepare the nation for a new age.”  The Bush administration sought to suppress science because facts about the world can be politically inconvenient.  The implications of scientific research don’t always jibe so well with our desire for short-term gratification.  I hope that President Obama can truly help to focus our political debates onto the serious decisions that we need to make as individuals and as a society.  

I am thrilled by the prospect that the age of know-nothingness in Washington DC might be over, but am also realistic that these things take time.  Let’s hope we can make this change while we still actually have time!

Regaining a Science and Technology Edge

Here’s a crazy idea from venture capitalist John Doerr: Don’t kick foreign students whom we have trained in science and engineering at our elite universities out of the country after they graduate.  Let them work in the United States where their education has almost certainly been subsidized in some way by the government and, ultimately, American taxpayers — “staple a green card to the diploma” as it were. This guy is nuts.  That is way too sensible…

On Productive Stupidity

This essay by UVA cell biologist Martin Schwartz pretty much encapsulates the way I feel about the practice of science.  If I perfectly understand everything I’m doing at any given moment, something is wrong.  I want to be uncomfortable in my understanding of any given question I am asking or method I am employing.  Otherwise, I don’t think I would be growing as either a scientist or a humanist.

Scientific perspectives in Anthropology are increasingly rare. This past year, I sat on our department’s graduate admissions committee, and I was struck by a theme that emerged in the personal statements prospective students made.  They really had it all figured out.  A typical essay would have the form “At Stanford I will expand on topic X and show Y.”  Sure, they’d probably learn some rhetorical tricks and gather some social capital along the way, but what more did they really need to know about the world around them? My reaction was: how can you know what you will show if you haven’t even designed your study or collected data?  It would be so refreshing to read a personal statement that took the form “Isn’t it funny the way X does Y?  I wonder why that is.” The Jerry Seinfeld approach to science, I suppose. Quoting Schwartz’s essay,

Productive stupidity means being ignorant by choice. Focusing on important questions puts us in the awkward position of being ignorant. One of the beautiful things about science is that it allows us to bumble along, getting it wrong time after time, and feel perfectly fine as long as we learn something each time. No doubt, this can be difficult for students who are accustomed to getting the answers right. No doubt, reasonable levels of confidence and emotional resilience help, but I think scientific education might do more to ease what is a very big transition: from learning what other people once discovered to making your own discoveries. The more comfortable we become with being stupid, the deeper we will wade into the unknown and the more likely we are to make big discoveries.

Perhaps we can foster a future generation of productively stupid anthropologists here in the Ecology and Environment program within the Anthropology department.  Fostering stupidity in a world too full of arrogant certitude may be one of the greatest challenges facing the academy of the twenty-first century.  Here’s to bumbling…