An article by New York Times writer Gina Kolata, from June 27th, is getting a fair bit of buzz, both around the blogosphere and among my friends on Facebook (many of whom are scientists as well).
The gist of it is as follows. The current trend among large scientific research grant agencies is to fund projects that "play it safe": projects with little risk of failing to yield significant "productive" output in terms of research articles out the other side (i.e. funds = research articles). Yet the proposals that may be the most groundbreaking (both in terms of basic research and in their potential for significant clinical advances) are often also the riskiest. The article does a good job of getting at the heart of both the political and cultural components of this issue.
However, it got me thinking about the culture of science in general, and about our mentorship process. In particular, does a major part of the training of scientists, the emphasis on critical reasoning, perhaps also lead to excessive skepticism? Is this possible? Now, I tend to be an overly skeptical person, and like most scientists, I look for the flaws in all of the experiments I perform. But is it possible that, as a scientific community, we take it too far?
From my own training experience, I know that some of the most valuable time I spent was in "journal club", where a group of students, post-docs and faculty would get together each week over coffee and argue about a couple of recent papers. In most situations, though, this would turn into a session to find every possible flaw in the study. While there is certainly value in this (knowing a good experiment from a bad one, for instance), I am now wondering whether this leads to a culture of scientists who are unable to take risks, or to appreciate proposals for "risky" science.
This is something I will have to mull over...