Background
When Futurity.org, a new science news service, was launched last week, there was quite a lot of reaction online.
Some greeted it with approval, others with a “wait and see” attitude.
Some disliked the elitism, as the site is limited to the self-proclaimed “top” universities (although it is possible that research in such places, where people are likely to be well funded, may be the least creative).
But one person – notably, a journalist – exclaimed on Twitter: “propaganda!”, which led to a discussion that revealed the journalist’s notion that press releases are automatically suspect, that scientists are never to be trusted, and that their institutions are to be trusted even less. That struck some of us as a very anti-science sentiment coming from a professional science journalist.
This exchange reminded me of a number of prior debates between the traditional Old Media journalists and the modern New Media journalists about the very definition of ‘journalism’. The traditional journalists are fighting to redefine it in the narrowest possible way, one that keeps them in the position of gatekeepers (like the newly proposed shield law that defines a journalist as someone who gets paid by an Old Media organization, thus NOT protecting citizen journalists, accidental journalists, bloggers, etc.), while the new ones are observing the way the world is changing and trying to come up with new definitions that better reflect it (and often go too far in the other direction – defining everything broadcast by anyone, via any medium, to an audience of more than one person as journalism, including the crossword puzzle in a newspaper and the silliest YouTube video).
One of the frequently heard retorts in the “you’ll miss us when we’re gone” genre of old-guard defensiveness is the sleight-of-hand in which they suddenly, in mid-stream of the discussion, redefine journalism to mean only investigative journalism. This usually comes up in the form of the “who will report from the school board meetings?” question (to which the obvious answer is: “actually, bloggers are already doing a lot of that, since the old media quit doing it decades ago”).
Of course, investigative journalism is just one of many forms under the rubric of ‘journalism’. And if you actually go and buy a copy of your local newspaper today (it still exists in some places, on tree-derived paper, believe me), you are likely to find exactly zero examples of investigative journalism in it. Tomorrow – the same. Every now and then one appears in the paper, and it is often well done, but the occasions are rare and getting rarer, as investigative reporters have been cut from many a newsroom over the past few decades, and even more rapidly over the last several months.
So, what is ‘Investigative Science Journalism’?
So this train of thought brought me back to the question of what ‘investigative journalism’ means in science. I was not perfectly happy with what I wrote about this question before, so I had to think some more. But before doing all the thinking myself, I thought I’d see what others think. So I tweeted the question in several different ways and got a lot of interesting responses:
Me: What is, exactly, ‘investigative science reporting’?
@davemunger: @BoraZ To me, it means going beyond looking at a single study to really understand a scientific concept. Diff from traditional “inv. journo”
@szvan: @davemunger @BoraZ And looking at methodology, statistical analysis, etc. to determine whether claims made match what was studied.
@LeeBillings: @BoraZ Re: “investigative science reporting,” isn’t it like all other investigative reporting where you dig deep and challenge your sources?
@Melhi: @BoraZ I thnk it means, “we cut/pasted from Wiki, all by ourselves.” Seems to be what it means when “scientific” is removed from the term.
Me: @LeeBillings clarify: What’s the story about? dig deep into what? who are the sources? why are you assuming they need to be challenged?
@soychemist: @BoraZ Any instance in which a reporter tries to uncover scientific information that has been concealed or distorted, using rigorous methods
@john_s_wilkins: @BoraZ Reporting on investigative science, no doubt.
@LeeBillings: @BoraZ ?s you’re asking only make sense in context of a specific story, not in context of defining “sci investigative journalism” as a whole
@LeeBillings: @BoraZ 1/2 but typically, the goal is to find out what’s true, and communicate it. you dig into primary literature & interview tons of ppl
@LeeBillings: @BoraZ 2/2 you don’t assume they need to challenged. you *know* they need to be challenged based on your in-depth research into primary lit
Me: When futurity.org was released, a journo yelled “propaganda”! Does every press release need to be investigated? Challenged?
Me: Are scientists presumed to be liars unless proven otherwise? All of them?
@NerdyChristie: Usually. Unless you’re studying how herbal tea makes you a supergod. RT @BoraZ: Are scientists presumed to be liars unless proven otherwise?
@szvan: @BoraZ Not liars but not inherently less open to bias than anyone else. Some wrongs are lies. Some are errors.
Me: Are journalists capable of uncovering scientific misconduct at all? All of those were uncovered by other scientists, I recall…
@lippard: @BoraZ Didn’t journalist Brian Deer do the investigative work to expose Andrew Wakefield’s MMR-autism data manipulation?
@JATetro: @BoraZ To be honest, there are some very good journalists out there who can spot misconduct but without backing from a source, it’s liable.
Me: @JATetro yes, they need scientists to do the actual investigating, then report on what scientists discovered – fraud, plagiarism etc.
@JATetro: @BoraZ So it’s not the journalists fault, really. They do their job as well as possible but without our help, there’s little they can do.
@LabSpaces: @JATetro @BoraZ Actual scientists cost too much.They’re a luxury, and especially in these times, it’s hard for pubs. to justify having 1
@JATetro: @LabSpaces @BoraZ Apparently it’s hard for universities to have them as well…not a prof or anything but damn it’s ridiculous.
@LabSpaces: @JATetro @BoraZ I dunno, our PR dept. does a great job interacting with scientists and getting the right info out, but I guess that’s diff.
@JATetro: @LabSpaces @BoraZ Oh, the media people at the U are great. It’s the administrators that seem to forget who keep the students comin’.
Me: Isn’t investigating nature, via experimentation, and publishing the findings in a journal = scientific investigative reporting?
@LeeBillings: @BoraZ 1/2 I’d say that’s performing peer-reviewed scientific research, not doing investigative science journalism.
@LeeBillings: @BoraZ 2/2 No room to address your ?-torrent. What are you driving at, anyway? You think sci journos can’t/don’t do investigative stuff?
@LouiseJJohnson: RT @BoraZ Isn’t investigating nature, via experimentation, & publishing findings in a journal, scientific investigative reporting?
@mcmoots: @BoraZ “Journalism” usually means you report the results of your investigations to the public; scientists report to a technical community.
Me: @mcmoots does the size and expertise of audience determine what is journalism, what is not? Is it changing these days?
Me: Why is investigating words called ‘investigative journalism’, but investigating reality, with much more rigorous methods, is not?
@LeeBillings: @BoraZ 1 more thing: A press release isn’t a story–it should inspire journos to look deeper. Sometimes that deeper look reveals PR to be BS
Me: @LeeBillings Journal article is reporting findings of investigation. Press release is 2ndary. Journo article is 3tiary. Each diff audience.
@LeeBillings: @BoraZ Glad you raised ? of audience, since relevant to yr ? of “words” & “reality.” Words make reality for audiences, some more than others
Me: Journos investigate people, parse words. Scientists investigate nature. What is more worthy?
@lippard: @BoraZ I would say that there are instances of investigative journalism that have had more value than some instances of scientific research.
Me: @lippard possible, but that is investigating the rare instances of misconduct by people, not investigating the natural reality. Science?
@john_s_wilkins: @BoraZ You’re asking this of a profession that thinks it needs to “give the other side” when reporting on science, i.e., quacks
@LeeBillings: @BoraZ Twitter is useful tool, but probably not best way to interview for the story you seem to be after, as responses lack depth and nuance
@LeeBillings: @BoraZ Still looking forward to reading your resulting story, of course
Me: @LeeBillings you can add longer responses on FriendFeed: http://friendfeed.com/coturnix that’s what it’s for
@1seahorse1: @BoraZ Do you mean that I have to be nostalgic about my ape tribe and life in caves ? 🙂
@TyeArnett: @BoraZ parsing data can be as dangerous as parsing words sometimes
@ccziv: @BoraZ Do not underestimate or devalue the importance of words, ever.
This shows that different people have very different ideas of what ‘investigative reporting’ is, and have even more difficulty figuring out how it applies to science! Let’s go nice and slow now and explore this a little bit more.
First, I think that what Dave meant in his first tweet –
@davemunger: @BoraZ To me, it means going beyond looking at a single study to really understand a scientific concept. Diff from traditional “inv. journo”
– is not ‘investigative reporting’ but ‘news analysis’ (again, see my attempt at classification), something akin to the ‘explainers’ occasionally done by the mainstream media (think of This American Life and their ‘Giant Pool of Money‘ explainer for a great recent example). It is the equivalent of a Review Article in a scientific journal, but aimed at a broader audience and not assuming existing background knowledge and expertise.
The different worlds of journalists and scientists
This discussion, as well as many similar discussions we had in the past, uncovers some interesting differences between the way journalists and scientists think about ‘investigative’ in the context of reporting.
Journalists, when investigating, investigate people, almost exclusively. Scientists are much more open to including other things under this rubric, as they are interested in investigating the world.
Journalists focus almost entirely on words, i.e., what people say. In other words, they are interested mainly in the process and in what the words reveal about who is winning and who is losing in some imaginary (or sometimes real) game. Scientists are interested in the results of the process, obtained by any means, only one of which is people’s utterances – they are interested in investigating and uncovering the facts.
Journalists display an inordinate amount of skepticism – even deep cynicism – about anyone’s honesty. Everyone’s a liar unless proven not to be. Scientists – knowing themselves, knowing their colleagues, knowing the culture of science in which 100% honesty and trust are key, knowing that exposure of even the tiniest dishonesty is likely The End of a scientific career – tend to trust scientists a great deal more. On the other hand, scientists are deeply suspicious of people who do not abide by the high standards of the scientific community, and The List of those who, due to track record, should be mistrusted the most is topped by – journalists.
This explains why scientists generally see Futurity.org as an interesting method of providing scientific information to the public, assuming a priori, knowing the track record of these institutions and what kind of reputation is at stake, that most or all of it will be reliable, while a journalist exclaims “propaganda”.
The Question of Trust
In this light, it is very instructive to read this post by a young science journalist, and the subsequent FriendFeed discussion of it. It is difficult for people outside of science to understand who is “inside” and thus to be trusted and who is not.
Those on the “inside”, the scientists, are already swimming in these waters and know instantly who is to be trusted and who is not. Scientists know that Lynn Margulis was outside (untrusted) at first, inside (trusted) later, and outside (untrusted) again today. Scientists know that James Lovelock or Deepak Chopra or Rupert Sheldrake are outside, always were and always will be, and are not to be trusted. Journalists can figure this out by asking, but then they need to figure out whose answer to trust! Who is inside and trusted to say who else is inside and trusted? If your first point of entry is the wrong person, all the “sources” you interview will be wackos.
Unfortunately the mistrust by journalists is often ‘schematic’ – not based on experience or on investigating the actual facts. They have a schema in their minds as to who is likely to lie, who is likely to use weaselly language, who can generally be trusted, etc. They use this rule-of-thumb when interviewing criminals, corrupt cops (“liars”), politicians, lawyers, CEOs (“weaselly words”), other journalists (“trustworthy”) and yes, scientists (“suspicious pointy-heads with hard-to-uncover financial motives”).
The automatic use of such “rules” is why so many D.C. reporters (the so-called Village) did not understand (and some still do not understand) that the people who are supposed to be in the “use weaselly language” column – the politicians – should actually have been in the “lying whenever they open their mouths” column for the eight years of the Bush rule (or, to be fair, the last 30 years). It did not occur to them to fact-check what Republicans said, hastily move them to the appropriate “chronic liars” category, and report accordingly. They could not fathom that someone like The President would actually straight-out lie. Every sentence. Every day. Nobody likes being shown to be naive, but nobody likes being lied to either. For many of them, the need to appear savvy (the opposite of naive) over-rode the need to admit they had been lied to and had fallen for it (“What are you saying? Can’t be possible. They are such nice guys when they pat my back at a cocktail party over in The Old Boys Club Cafe – they wouldn’t lie to me!”). And many in their audience are in the same mindset – finding it impossible (as that takes courage and humility) to admit to themselves that they were naive enough to fall for such lies from such high places (both the ruling party and their loyal stenographers). And we all suffered because of it.
The heavy reliance on such rules or mental schemas by journalists is often due to their self-awareness about their lack of knowledge and expertise on the topic they are covering. They just don’t know whom to trust, because they are not capable of uncovering the underlying facts and thus figuring out for themselves who is telling the truth and who is lying (not to mention that this would require, gasp, work instead of hanging out at cocktail parties). To cover up the ignorance and make it difficult for the audience to reveal it, they strongly resist calls to provide links to more information and especially to their source documents.
Thus He Said She Said journalism is a great way for them to a) focus on words, people, process and ‘horse-race’ instead of facts, b) hide their ignorance of the underlying facts, and c) show their savvy by “making both sides angry” which, in some sick twist, they think means they are doing a good job (no, it means all your readers saw through you and are disgusted by your unprofessionalism). Nowhere does this show as clearly as when they cover science.
A more systematic investigation into ‘investigation’
Now that I’ve raised everyone’s ire, let me calm down again and try to use this blog post the way bloggers often do – as a way to clarify thoughts through writing. I am no expert on this topic, but I am interested, I read a lot about it, I blog about it a lot, and I want to hear responses in the comments. Let me try to systematize what I think ‘investigative reporting’ is in general and then apply that to three specific cases: 1) a scientist investigating nature and reporting about it in a journal, 2) a journalist investigating scientists and their work and reporting about it in a media outlet, and 3) a science blogger investigating the first two and reporting on how good or bad a job each of them did.
A few months ago, I defined ‘investigative journalism’ like this:
Investigative reporting is uncovering data and information that does not want to be uncovered.
Let’s see how that works in practice.
Steps in Investigative Reporting:
1) Someone gets a hunch, a whiff, a tip from someone, or an intuition (or orders from the boss to take a look) that some information exists that is hidden from the public.
2) That someone then uses a whole suite of methods to discover that secret information, often against the agents that resist the idea of that information becoming available to the public.
3) That someone then puts all of the gathered information in one place and looks for patterns, overarching themes, connections and figures out what it all means.
4) That someone then writes an article, with a specific audience in mind, showing to the public the previously secret information (often including all of it – the entire raw data sets or documents or transcripts) and explaining what it means.
5) That someone then sends the article to the proper venue where it undergoes an editorial process.
6) If accepted for publication, the article gets published.
7) The article gets a life of its own – people read (or listen/view) it, comment, give feedback, or follow up with investigation digging up more information that is still not public (so the cycle repeats).
Case I: Scientist
1) Someone gets a hunch, a whiff, a tip from someone, or an intuition (or orders from the boss to take a look) that some information exists that is hidden from the public.
The keeper of the secret information is Nature herself. The researcher can get a hunch about the existence of hidden information in several different ways:
– delving deep into the literature and noticing that there are holes – missing information that nobody has reported on yet, suggesting that nobody has uncovered it yet.
– doing research and getting unexpected results points one to the fact that there is missing information needed to explain those funky results.
– going out into nature and observing something that, upon digging through the literature, one finds has not been explained yet.
– getting a photocopy of descriptions of three experiments from your PI’s last grant proposal, with the message “Do this”. This is a great method for introducing high school and undergraduate students to research, and perhaps for getting a brand new Masters student started (of course, regular discussions of progress are needed). Unfortunately, some PIs keep doing this to their PhD students and even postdocs, instead of giving them creative freedom.
2) That someone then uses a whole suite of methods to discover that secret information, often against the agents that resist the idea of that information becoming available to the public.
The scientific method includes a variety of methods for wresting secret information out of Nature: observations, experiments, brute-force Big Science, natural experiments, statistics, mathematical modeling, etc. It is not easy to get this information from Nature as she resists. One has to be creative and innovative in designing tricks to get reliable data from her.
3) That someone then puts all of the gathered information in one place and looks for patterns, overarching themes, connections and figures out what it all means.
All the collected data from a series of observations/experiments are put together, statistically analyzed, visualized (which sometimes leads to additional statistical analyses as visualization may point out phenomena not readily gleaned from raw numbers) and a common theme emerges (if it doesn’t – more work needs to be done).
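As a purely illustrative aside – none of the code below comes from the post, and the data and group names are made up – here is a minimal sketch, in Python, of that pooling-analyzing-visualizing step: simulated replicate measurements from two hypothetical groups are pooled, compared with a simple statistical test, and plotted, since a pattern that hides in a table of raw numbers often jumps out of a figure.

```python
# Purely illustrative sketch: hypothetical data and group names.
import numpy as np
import matplotlib.pyplot as plt
from scipy import stats

rng = np.random.default_rng(42)

# Simulated replicate measurements from two hypothetical treatment groups.
control = rng.normal(loc=10.0, scale=2.0, size=30)
treated = rng.normal(loc=12.5, scale=2.0, size=30)

# Pool the data and run a simple statistical test.
t_stat, p_value = stats.ttest_ind(control, treated)
print(f"control mean = {control.mean():.2f}, treated mean = {treated.mean():.2f}")
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")

# Visualize: a difference that is easy to miss in a table of raw numbers
# often jumps out of a simple plot.
plt.boxplot([control, treated])
plt.xticks([1, 2], ["control", "treated"])
plt.ylabel("measured response (arbitrary units)")
plt.title("Pooled replicates from two hypothetical groups")
plt.show()
```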
4) That someone then writes an article, with a specific audience in mind, showing to the public the previously secret information (often including all of it – the entire raw data sets or documents or transcripts) and explaining what it means.
There are three potential audiences for the findings of the research: experts in one’s field, other scientists, and the lay audience (which may include policy-makers, political-action organizations, journalists, teachers, physicians, etc.).
The experts in one’s field are the most important audience for most of research. The proper venue to publish for this audience is a scientific journal of a narrow scope (usually a society journal) that is read by all the experts in the same field. The article can be dense, using the technical lingo, containing all the information needed for replication and further testing of the information and should, in principle, contain all the raw data.
The scientific community as a whole is a somewhat baffling target audience – on one hand, some of its members are also experts in the field; on the other hand, all the rest are essentially a lay audience. It is neither one nor the other. Why target the scientific community as an audience, then? Because the venues for this are GlamourMagz, and publishing in these is good for one’s career and fame. The format in which such papers are written is great for scientists in non-related disciplines – it tells a story – but it is extremely frustrating for same-field researchers, as there is not sufficient detail (or data) to replicate, re-test or follow up on the described research. Publishing this way makes you known to a lot more scientists, but tends to alienate your closest colleagues, who are frustrated by the lack of information in your report.
The lay audience is an important audience for some types of research – ones that impact people’s personal decisions about their health or about taking care of the environment, ones that can have impact on policy, ones that are useful to know by health care providers or science educators, or ones that are so cool (e.g., new fossils of dinosaurs or, erm…Ida) that they will excite the public about science.
Many scientists are excellent and exciting communicators and can speak directly to the audience (online on blogs/podcasts/videos, or offline in public lectures or science cafes), or will gladly agree to do interviews (TV, radio, newspapers, magazines) about their findings. Those researchers who know they are not exciting communicators, or do not like to be in public, or are too busy, or have been burned by previous interactions with the media, tend to leave communication with the lay audience to professionals – the press officers at their institutions.
While we have all screamed every now and then at some blatantly bad press release (especially the titles imposed by editors), there has generally been a steady, gradual improvement in their quality over the years. One possible explanation is that scientists who fall out of the pipeline – there are now so many PhDs and so few academic jobs – have started replacing English majors and j-school majors in these positions. More and more institutions now have science-trained press officers who actually understand what they are writing about. Thus there is less hype, yet more and better explanation of the results of scientific investigation. Of course, they tend to be excellent writers as well – a talent that comes with love and practice and does not require a degree in English or Communications.
5) That someone then sends the article to the proper venue where it undergoes an editorial process.
The first draft of the article is usually co-written and co-edited by a number of co-authors who “peer-review” each other during the process. That draft is then (the 2nd peer-review) usually given to other lab members, collaborators, friends and colleagues to elicit their opinion. Their feedback is incorporated into an improved draft, which is then sent to the appropriate scientific journal, where the editor sends the manuscript to one or several experts in the field, usually kept anonymous, for the 3rd (and “official”) peer-review. This may then go through two or three cycles before the reviewers are satisfied with the edits and changes and recommend to the editor that the paper be published (or not, in which case the whole process gets repeated at lesser and lesser and lesser journals…until the paper is either finally published or abandoned or self-published on a website).
6) If accepted for publication, the article gets published.
Champagne time!
Then, next morning, back to the lab – trying to uncover more information.
7) The article gets a life of its own – people read (or listen/view) it, comment, give feedback, or follow up with investigation digging up more information that is still not public (so the cycle repeats).
After Nature closely guarded her secrets for billions of years, and after intrepid investigators snatched the secret information from her over weeks, months, years or decades of hard and creative work, the information is finally made public. The publication date is the date of birth for that information, the moment when its life begins. Nobody can predict what kind of life it will have at that point. It takes years to watch it grow and develop and mature and spawn.
People download it and read it, think about it, talk about it, interact with it, blog about it and, most importantly, try to replicate, re-test and follow up on the information in order to uncover even more information.
If that is not ‘investigative reporting’ at its best, I don’t know what is.
Case II: Science Journalist
1) Someone gets a hunch, a whiff, a tip from someone, or an intuition (or orders from the boss to take a look) that some information exists that is hidden from the public.
The hidden information, in this case, is most likely to be man-made information – documents, human actions, human words. It is especially deemed worthy of investigation if some wrong-doing is suspected.
2) That someone then uses a whole suite of methods to discover that secret information, often against the agents that resist the idea of that information becoming available to the public.
As the journalist cannot “go direct” and investigate nature directly (not having the relevant training, expertise, infrastructure, funding, manpower, equipment, etc.), the only remaining method is to investigate indirectly. The usual indirect method for journalists is to ask people – a very, very, very unreliable way of getting information.
Since investigating the facts about nature is outside the scope of expertise of journalists, they usually investigate the behavior and conduct of scientists. This is “investigative meta-science reporting”. In a sense, there is not much difference between investigating potential misconduct of scientists and misconduct of any other group of people. The main difference is that the business of science is facts about the way the world works, thus knowing who got the facts right and who got the facts wrong is important and who misrepresents lies as facts is even more important.
Unfortunately, due to lack of scientific expertise, journalists find this kind of investigation very difficult – they have to rely on the statements of scientists as to the veracity of other scientists’ facts or claims – something they are not in position to verify directly. If they ask the wrong person – a quack, for example – they will follow all the wrong leads.
Thus, the usual fall-back is the He Said She Said model of journalism: reporting who said what, not committing to any side, not evaluating the truth-claims of any side, and hoping that the (also scientifically uneducated) audience will be able to figure it out for itself.
Since they cannot evaluate the truth-claims about Nature that scientists make, journalists have to use proxy mechanisms to uncover misconduct, e.g., discover other unseemly behaviors by the same actors, unrelated to the research itself. Thus discovering instances of lying, or financial ties, is the only way a journalist can start guessing as to who can be trusted, and then hope that the person who lies about his/her finances is also lying about facts about Nature – a correlation that is hard to prove and is actually quite unlikely except in rare instances of industry/lobby scientists-for-hire.
The actual research misconduct – fudging data, plagiarism, etc. – can be uncovered only by other scientists. And they do it whenever they suspect it, and they report the findings in various ways. The traditional method of sending a letter to the editor of the journal that published the suspect paper is so ridiculously difficult that many are now pursuing other venues, be it notifying a journalist, or going direct on a blog, or, if the journal is enlightened (COI – see my Profile), posting comments on the paper itself.
3) That someone then puts all of the gathered information in one place and looks for patterns, overarching themes, connections and figures out what it all means.
Once all the information is gathered in one place, any intelligent person can find patterns. Scientific expertise is not usually necessary for this step. Thus, once the journalist manages to gather all the information (the hard part), he/she is perfectly capable of figuring out the story (the easy part).
4) That someone then writes an article, with a specific audience in mind, showing to the public the previously secret information (often including all of it – the entire raw data sets or documents or transcripts) and explaining what it means.
The journalists’ advantage: they tend to be good with language and at writing a gripping story. If the underlying information is correct, the conclusions are clear, and the journalist is not afraid to state plainly who is telling the truth and who is lying, the article should be good.
5) That someone then sends the article to the proper venue where it undergoes an editorial process.
The editor who comes up with titles usually screws up this step. Otherwise, especially if nobody cuts out important parts due to length limits, the article should be fine. Hopefully, the venue targets the relevant audience – either experts (who can then police their own) or the general public (who can exert pressure on the powers-that-be).
6) If accepted for publication, the article gets published.
Deadline for the next story looms. Back to the grind.
7) The article gets a life of its own – people read (or listen/view) it, comment, give feedback, or follow up with investigation digging up more information that is still not public (so the cycle repeats).
Now that the information is public, people can spread it around (e-mailing to each other, linking to it on their blogs, social networks, etc.). They bring in their own knowledge and expertise and provide feedback in various venues and some are motivated to follow up and dig deeper, perhaps uncovering more information (so the cycle repeats).
Most of science journalism is, thus, not investigative journalism. Most of it is simple reporting of the findings, i.e., second-hand reporting of the investigative reporting done by scientists (Case I). Or, as science reporters are kept so busy by their editors, forced to write story after story in rapid succession about many different areas of science, most science reporting in the media is actually third-hand reporting: first-hand was by scientists in journals, second-hand was by the press officers of the institutions, and the journalist mainly regurgitates the press releases. As in every game of Broken Telephone/Chinese Whispers, the first reporter is more reliable than the second one in line, who is more dependable than the third one, and so on. Thus a scientist “going direct” is likely to give a much more reliable account of the findings than a journalist reporting on them.
There are exceptions, of course. Each discussion of science journalism brings out commenters who shout the names of well-known and highly respected science journalists. The thing is, those people are not science reporters. They are science journalists only in the sense that ‘Science Writers’ is a subset of the set ‘Science Journalists’. This is a subset in a very privileged position – they are given the freedom to write what, when, where and how they want. Thus, over many years, they develop their own expertise.
Carl Zimmer has, over the years, read so many papers, talked to so many experts, and written so many books, articles and blog posts that he probably knows more about evolution, parasites and E. coli than biology PhDs whose focus is on other areas of biology. Eric Roston probably knows more about carbon than many chemistry PhDs. These guys are experts. And they are writers, not reporters. They do not get assignments to write many stories per week on different areas of science. They are not who I am talking about in this post at all.
Do they do investigative reporting? Sometimes they do, but they choose other venues for it. When George Will lied about climate change data in a couple of op-eds, Carl Zimmer used his blog, not the NYTimes Science section, to dig up and expose the facts about the industry and political influences, about George Will’s history on the issue, about the cowardly response by the Washington Post to the uncovering of these unpleasant facts, etc.
Rebecca Skloot did investigative journalism as well, over many years, and decided to publish the findings in the form of a book, not in a newspaper or magazine. That is not the work of a beat reporter.
Case III: Science Blogger
1) Someone gets a hunch, a whiff, a tip from someone, or an intuition (or orders from the boss to take a look) that some information exists that is hidden from the public.
Bloggers are often looking for blogging materials from two distinctly different sources: the Tables of Content of scientific journals in the fields they have expertise in, and services that serve press releases (e.g., EurekAlert, ScienceDaily, etc.). They are also usually quite attuned to the mass media, i.e., they get their news online from many sources instead of reading just the local paper.
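As a purely illustrative aside, here is a minimal sketch of how a blogger might automate that kind of feed-watching in Python using the feedparser library; the feed URL and keywords below are hypothetical placeholders, not real EurekAlert or ScienceDaily endpoints.

```python
# Illustrative sketch only: hypothetical feed URL and keywords, not real endpoints.
import feedparser

FEED_URL = "https://example.org/science-press-releases.rss"  # placeholder feed
KEYWORDS = ("circadian", "melatonin", "sleep")               # topics this hypothetical blogger follows

feed = feedparser.parse(FEED_URL)
for entry in feed.entries:
    text = (entry.get("title", "") + " " + entry.get("summary", "")).lower()
    if any(keyword in text for keyword in KEYWORDS):
        # Flag candidate items worth reading alongside the original paper.
        print(entry.get("title", "(no title)"), "->", entry.get("link", ""))
```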
What many bloggers do and are especially good at doing is comparing the work of Case I and Case II investigative reporters. They can access and read and understand the scientific paper and directly compare it to the press releases and the media coverage (including the writings by other bloggers). Having the needed scientific expertise, they can evaluate all the sources and make a judgment on their quality.
Sometimes the research in the paper is shoddy but the media does not realize it and presents it as trustworthy. Sometimes the paper is good, but the media gets it wrong (usually in a sensationalist kind of way). Sometimes both the paper and the media get it right (which is not very exciting to blog about).
2) That someone then uses a whole suite of methods to discover that secret information, often against the agents that resist the idea of that information becoming available to the public.
Replicating experiments and putting that on the blog is rare (but has been done). But digging through the published data and comparing that to media reports is easy when one has the necessary expertise. Consulting with colleagues, on the rare occasions when needed, is usually done privately via e-mail or publicly on places like FriendFeed or Twitter, and there is no need to include quotes in the blog post itself.
Bloggers have done investigative digging in a journalistic sense as well – uncovering unseemly behavior of people. I have gathered a few examples of investigative reporting by science bloggers before:
Whose investigative reporting led to the resignation of George Deutsch, the Bush administration’s NASA censor? Nick Anthis, a (then) small blogger (who also later reported in great detail on the Animal Rights demonstrations and counter-demonstrations in Oxford).
Who blew up the case of plagiarism in dinosaur palaeontology, the so-called Aetogate? A bunch of bloggers.
Who blew up, skewered and castrated PRISM, the astroturf organization designed to lobby the Senate against the NIH Open Access bill? A bunch of bloggers. The bill passed.
Remember the Tripoli 6?
Who pounced on George Will and WaPo when he trotted out the long-debunked lie about global warming? And forced them to squirm, and respond, and publish two counter-editorials? A bunch of bloggers.
Who dug up all the information, including the most incriminating key evidence against Creationists that was used at the Dover trial? A bunch of bloggers.
And so on, and so on, this was just scratching the surface with the most famous stories.
3) That someone then puts all of the gathered information in one place and looks for patterns, overarching themes, connections and figures out what it all means.
This is often a collective effort of multiple bloggers.
4) That someone then writes an article, with a specific audience in mind, showing to the public the previously secret information (often including all of it – the entire raw data sets or documents or transcripts) and explaining what it means.
The target audience of most science blogs is the lay audience, but many of the readers are themselves scientists as well.
5) That someone then sends the article to the proper venue where it undergoes an editorial process.
Most blogs are self-edited. Sending a particularly ‘hot’ blog post to a couple of other bloggers asking their opinion before it is posted is something that a blogger may occasionally do.
6) If accepted for publication, the article gets published.
Click “Post”. That easy.
7) The article gets a life of its own – people read (or listen/view) it, comment, give feedback, or follow up with investigation digging up more information that is still not public (so the cycle repeats).
Feedback in comments usually comes in really fast! It is direct and straightforward, and does not follow the usual formal kabuki dance that ensures that control and hierarchy remain intact in more official venues.
Other bloggers may respond on their own blogs (especially if they disagree) or spread the link on social networks (especially if they agree).
If many bloggers raise hell about some misconduct and persist in it over a prolonged period of time, this sometimes forces the corporate media to pick up the hot-potato story despite an initial reluctance to do so. But this applies to all investigative reporting on blogs, not just science.
Also, bloggers are not bound by 20th-century journalistic rules – thus exposure by impersonation, what the conservative activists did to ACORN, is a perfectly legitimate way of uncovering dirt in informal venues, though not legit in corporate venues.
One more point that needs to be made here. Different areas of science are different!
Biomedical science is a special case. It is huge. It has huge funding compared to other areas, yet not enough to feed the armies of researchers involved in it. It attracts the self-aggrandizing type disproportionately. Much is at stake: patents, contracts with the pharmaceutical industry, money, fame, Nobel prizes… Thus it is extremely competitive. It also uses laboratory techniques that are universal and fast, so it is easy to scoop and get scooped, which fosters a culture of secrecy. It suffers from CNS disease (the necessity to publish in GlamourMagz like Cell, Nature and Science). It gets an inordinate proportion of media (and blog) attention due to its relevance to human health. All those pressures make the motivation to fudge data too strong for some of the people involved – very few, for sure, out of the tens of thousands involved.
On the other end of the spectrum is, for example, palaeontology. Very few people can be palaeontologists – there are not enough positions and not enough money. There is near-zero risk of getting scooped, as everyone knows who dug what out, where, and during which digging season (Aetogate, linked above, was a special case of a person using a position of power mainly to scoop powerless students). Your fossil is yours. The resources are extremely limited, and so much depends on luck. Discovering a cool fossil is not easy, and if you get your hands on one, you have to milk it for all it’s worth. You will publish not one but a series of papers. The first paper is a brief announcement of the finding with a superficial description, the second is a detailed description, the third is the phylogenetic analysis, the fourth focuses on one part of the fossil that can say something new about evolution, and so on. And you hope that all of this will become well known to the general public. The palaeo community is so small, they all already know. They will quibble with you forever over the methodology and conclusions (so many assumptions have to go into methods that analyze old, broken bones). It is the lay audience that needs to be reached, by any means necessary. Many palaeontologists don’t even work as university professors but are associated with museums or nature magazines, or are freelancing. The pressure to publish in GlamourMagz is there only as a means to get the attention of the media, not to impress colleagues or to advance one’s career.
Most of science and most scientists, on the other hand, do not belong to either of these two fields and do not work at high-pressure universities. They do science out of their own curiosity, feel no pressure to publish a lot or in GlamourMagz, do not fear scooping, are open and relaxed, and have no motivation to fudge data or plagiarize. They know that their reputation with their peers – the only reputation they can hope to get – depends entirely on immaculate work and behavior. Why keep them suspect because two media-prominent sub-sub-disciplines sometimes produce less-than-honest behavior? Why not trust that their papers are good, their press releases correct, their blogging honest, and their personal behavior impeccable? I’d say they are presumed innocent unless proven guilty, not the other way around.
I’d like to see an equivalent of Futurity.org for state universities and small colleges. What a delightful source of cool science that would be!
Update: blogging at its best. After a couple of hit-and-run curmudgeonly comments posted early on, this post started receiving some very thoughtful and useful comments (e.g., especially one by David Dobbs) that are edifying and are helping me learn – which is the point of blogging in the first place, isn’t it?