#Overly Honest Methods or PhD Madness?


This week, scientists took to the social networking site Twitter to swap candid examples of what really goes on behind the lab’s closed doors. Their confessions, shared under the Twitter hashtag #OverlyHonestMethods, have become a rapidly growing collection of humorous anecdotes reflecting the realities of scientific decision-making, particularly within the life and physical sciences. Overall, it reads like a catalogue of methodological misdemeanours.

Although it has since spread to Facebook, blogs and other sites, as with all social media trends or “memes”, the popularity of #OverlyHonestMethods is likely to die down soon. However, I cannot help but wonder whether, after the dust (and amusement) settles, the status of, and public trust in, the scientific method will have been challenged. On the other hand, this surprising candour might actually be good for science. Either way, the scientific community also needs to have a more serious discussion about this topic.

First, let’s have a look at some examples. One post – or “tweet” (in Twitter terminology) – sees a scientist confessing that she “selected a particular qPCR [quantitative real-time polymerase chain reaction] mastermix because the rep was giving away legomen USB sticks”! Admittedly, I myself am a sucker for freebies. Goodness knows I have signed up for far too many things I might otherwise have avoided, suckered in by a rep offering a free mug or, indeed, a novelty USB stick. As a social scientist, I don’t have to buy reagents or chemicals. If I did, however, I would at least like to think that my choice was based on a well-informed assessment of the suitability of all the relevant reagents available to me – not (necessarily) the one that comes with the coolest USB stick.

Whilst posts like this one hint at some of the more arbitrary choices that lab scientists sometimes make, other posts are more alarming to me, because they hint at some of the more unprofessional and even “unscientific” things that go on in the lab. For example, one tweet alludes to the fact that “data are available on request because then we can tidy the spreadsheet only if absolutely necessary”. Another claimed that his “experiment was repeated until we had three statistically significant similar results and could discard the outliers”.

I guess the worrying thing about #OverlyHonestMethods is that the posts may be just that – overly honest! There aren’t many that seem so implausible as to be entirely fabricated. Of course, it’s their very plausibility that makes them funny … to other scientists, that is.

However, if even a proportion of these “trade secrets” are indeed true, then we need to stop for a second and ask ourselves what implications they might have for the standards of the scientific method, which are currently so crucial to the epistemic authority of science. Science is, and always has been, built on its foundational principles of objectivity, reliability, replicability and validity.

I wonder how many journal editors would see fit to consider a retraction if the information posted to Twitter were submitted directly to them, and it related to a paper they had previously published.

This consideration leads me to a related question: how will these confessions be received by the public, who, generally speaking, trust that scientific research returns objective results precisely because it adheres to its principles of reliability, replicability and validity? Of course, whilst most non-scientists are unlikely to get many of the in-jokes in these anecdotes, neither are they likely to find it funny that scientists are alluding to questionable use of their research funds – particularly since many of these scientists are likely supported by public and/or state funding. As one tweet claimed: “Functional magnetic resonance imaging was performed because we had to justify this large grant somehow”. Although one can assume (or hope) the tweeter was joking, in a post-recession climate where resources are strained, many may fail to see the funny side.

Maybe #OverlyHonestMethods is simply a group of scientists arguing that the scientific method, like anything in life, isn’t as perfect as it is sometimes made out to be. After all, scientists are human, and humans make decisions that are sometimes less than objective. This wouldn’t necessarily be a new argument. Sociologists of science, such as Harry Collins and Bruno Latour, have long argued that natural science, including its methodological decisions, is shaped by social, political and cultural factors and interests, pretty much like any other activity or practice. The historian of science Steven Shapin discusses what he calls “idealized methods stories”, whereby the contingencies, uncertainties, ambiguities and all-round messiness of real-world decision-making in the lab are hidden from the sanitized end product – the published article. The big difference with #OverlyHonestMethods is that it is now the natural and life scientists themselves who are making this point. And for all the world to see.

Costs of Candour?

Illustration: Mad Scientist (source: Wikipedia)

Who knows, this candour might ultimately be good for science. It might finally serve to make visible the hitherto “behind-closed-doors” practices which have always been a part of science. Maybe many of these examples are necessarily and realistically a part of what makes science science?

Maybe we have traditionally held science to unrealistically high standards that it cannot meet?  Maybe we need a reality check.

It might, however, also represent the very beginnings of a “revolution” in how scientists and the public alike view and talk about the scientific method. Stranger things have happened as a result of social media!

Here in the United States, some consider the website WikiLeaks to be a threat to national security. One could go so far as to argue that #OverlyHonestMethods is a threat to scientific security. In an age of evidence-based policy and practice, the legitimacy of many policies is itself tied to the legitimacy of the science upon which they are based.

For the time being, one thing I can be sure of is that, if I were a grad student, or a scientist seeking a job, I would think twice about posting any of my methodological misdemeanours. At a time when competition for jobs in science is tighter than ever, publicly admitting to “pushing buttons in our favourite stats software until all our results had stars next to them”, or that “this dye was used because the bottle was within reach”, might not be the wisest decision in the world.

Dr. Simon Williams is a Research Associate at the Feinberg School of Medicine, Northwestern University, in Chicago, Illinois.  As a member of the Scientific Careers Research and Development Group, he is interested in issues of scientific training and careers, as well as broader issues related to scientific knowledge and the scientific method.

#Overly Honest Methods or PhD Madness? by PLOS Blogs Network, unless otherwise expressly stated, is licensed under a Creative Commons Attribution 4.0 International License.
