Category Archives: Science Practice

What is the real purpose of a graduate education in science? (video)

The new issue of Journal of Science Communication is now published

The new issue of Journal of Science Communication is now online (Open Access, so you can download all PDFs for free). Apart from the article on blogging that we already dissected at length, this issue has a number of interesting articles, reviews, perspectives and papers:
Users and peers. From citizen science to P2P science:

This introduction presents the essays belonging to the JCOM special issue on User-led and peer-to-peer science. It also draws a first map of the main problems we need to investigate when we face this new and emerging phenomenon. Web tools are enacting and facilitating new ways for lay people to interact with scientists or to cooperate with each other, but cultural and political changes are also at play. What happens to expertise, knowledge production and relations between scientific institutions and society when lay people or non-scientists go online and engage in scientific activities? From science blogging and social networks to garage biology and open tools for user-led research, P2P science challenges many assumptions about public participation in scientific knowledge production. And it calls for a radical and perhaps new kind of openness of scientific practices towards society.

Changing the meaning of peer-to-peer? Exploring online comment spaces as sites of negotiated expertise:

This study examines the nature of peer-to-peer interactions in public online comment spaces. From a theoretical perspective of boundary-work and expertise, the comments posted in response to three health sciences news articles from a national newspaper are explored to determine whether both scientific and personal expertise are recognized and taken up in discussion. Posts were analysed for both explicit claims to expertise and implicit claims embedded in discourse. The analysis suggests that while both scientific and personal expertise are proffered by commenters, it is scientific expertise that is privileged. Those expressing scientific expertise receive greater recognition of the value of their posts. Contributors seeking to share personal expertise are found to engage in scientisation to position themselves as worthwhile experts. Findings suggest that despite the possibilities afforded by online comments for a broader vision of what peer-to-peer interaction means, this possibility is not realized.

The public production and sharing of medical information. An Australian perspective:

There is a wealth of medical information now available to the public through various sources that are not necessarily controlled by medical or healthcare professionals. In Australia there has been a strong movement in the health consumer arena of consumer-led sharing and production of medical information and in healthcare decision-making. This has led to empowerment of the public as well as increased knowledge-sharing. There are some successful initiatives and strategies on consumer- and public-led sharing of medical information, including the formation of specialised consumer groups, independent medical information organisations, consumer peer tutoring, and email lists and consumer networking events. With well-organised public initiatives and networks, there tends to be fairly balanced information being shared. However, there needs to be caution about the use of publicly available scientific information to further the agenda of special-interest groups and lobbying groups to advance often biased and unproven opinions or for scaremongering. With the adoption of more accountability of medical research, and the increased public scrutiny of private and public research, the validity and quality of medical information reaching the public is achieving higher standards.

Social network science: pedagogy, dialogue, deliberation:

The online world constitutes an ever-expanding store and incubator for scientific information. It is also a social space where forms of creative interaction engender new ways of approaching science. Critically, the web is not only a repository of knowledge but a means with which to experience, interact and even supplement this bank. Social Network Sites are a key feature of such activity. This paper explores the potential for Social Network Sites (SNS) as an innovative pedagogical tool that precipitates the ‘incidental learner’. I suggest that these online spaces, characterised by informality, open-access, user input and widespread popularity, offer a potentially indispensable means of furthering the public understanding of science; and significantly one that is rooted in dialogue.

Open science: policy implications for the evolving phenomenon of user-led scientific innovation:

From contributions of astronomy data and DNA sequences to disease treatment research, scientific activity by non-scientists is a real and emergent phenomenon, one that raises policy questions. This involvement in science can be understood as an issue of access to publications, code, and data that facilitates public engagement in the research process, thus appropriate policy to support the associated welfare enhancing benefits is essential. Current legal barriers to citizen participation can be alleviated by scientists’ use of the “Reproducible Research Standard,” thus making the literature, data, and code associated with scientific results accessible. The enterprise of science is undergoing deep and fundamental changes, particularly in how scientists obtain results and share their work: the promise of open research dissemination held by the Internet is gradually being fulfilled by scientists. Contributions to science from beyond the ivory tower are forcing a rethinking of traditional models of knowledge generation, evaluation, and communication. The notion of a scientific “peer” is blurred with the advent of lay contributions to science, raising questions regarding the concepts of peer-review and recognition. New collaborative models are emerging around both open scientific software and the generation of scientific discoveries that bear a similarity to open innovation models in other settings. Public engagement in science can be understood as an issue of access to knowledge for public involvement in the research process, facilitated by appropriate policy to support the welfare enhancing benefits deriving from citizen-science.

Googling your genes: personal genomics and the discourse of citizen bioscience in the network age:

In this essay, I argue that the rise of personal genomics is technologically, economically, and most importantly, discursively tied to the rise of network subjectivity, an imperative of which is an understanding of self as always already a subject in the network. I illustrate how personal genomics takes full advantage of social media technology and network subjectivity to advertise a new way of doing research that emphasizes collaboration between researchers and its members. Sharing one’s genetic information is considered to be an act of citizenship, precisely because it is good for the network. Here members are encouraged to think of themselves as dividuals, or nodes, in the network and their actions acquire value based on that imperative. Therefore, citizen bioscience is intricately tied, both in discourse and practices, to the growth of the network in the age of new media.

Special issue on peer-to-peer and user-led science: invited comments:

In this commentary, we collected three essays from authors coming from different perspectives. They analyse the problem of power, participation and cooperation in projects of production of scientific knowledge held by users or peers: persons who do not belong to the institutionalised scientific community. These contributions are intended to give a more political and critical point of view on the themes developed and analysed in the research articles of this JCOM special issue on Peer-to-peer and user-led science.
Michel Bauwens, Christopher Kelty and Mathieu O’Neil write about different aspects of P2P science. Nevertheless, the three worlds they delve into share the “aggressively active” attitude of the citizens who inhabit them. Those citizens claim to be part of the scientific process, and they use practices as heterogeneous as online peer-production of scientific knowledge, garage biology practiced with a hacker twist, or the crowdsourced creation of an encyclopedia page. All these claims and practices point to a problem in the current distribution of power. The relations between experts and non-experts are challenged by the rise of peer-to-peer science. Furthermore, the horizontal communities which live inside and outside the Net are not frictionless. Within peer-production mechanisms, the balance of power is an important issue which has to be carefully taken into account.

Is there something like a peer to peer science?:

How will peer to peer infrastructures, and the underlying intersubjective and ethical relational model that is implied by it, affect scientific practice? Are peer-to-peer forms of cooperation, based on open and free input of voluntary contributors, participatory processes of governance, and universal availability of the output, more productive than centralized alternatives? In this short introduction, Michel Bauwens reviews a number of open and free, participatory and commons oriented practices that are emerging in scientific research and practice, but which ultimately point to a more profound epistemological revolution linked to increased participatory consciousness between the scientist and his human, organic and inorganic research material.

Outlaws, hackers, Victorian amateurs: diagnosing public participation in the life sciences today:

This essay reflects on three figures that can be used to make sense of the changing nature of public participation in the life sciences today: outlaws, hackers and Victorian gentlemen. Occasioned by a symposium held at UCLA (Outlaw Biology: Public Participation in the Age of Big Bio), the essay introduces several different modes of participation (DIY Bio, Bio Art, At home clinical genetics, patient advocacy and others) and makes three points: 1) that public participation is first a problem of legitimacy, not legality or safety; 2) that public participation is itself enabled by and thrives on the infrastructure of mainstream biology; and 3) that we need a new set of concepts (other than inside/outside) for describing the nature of public participation in biological research and innovation today.

Shirky and Sanger, or the costs of crowdsourcing:

Online knowledge production sites do not rely on isolated experts but on collaborative processes, on the wisdom of the group or “crowd”. Some authors have argued that it is possible to combine traditional or credentialled expertise with collective production; others believe that traditional expertise’s focus on correctness has been superseded by the affordances of digital networking, such as re-use and verifiability. This paper examines the costs of two kinds of “crowdsourced” encyclopedic projects: Citizendium, based on the work of credentialled and identified experts, faces a recruitment deficit; in contrast Wikipedia has proved wildly popular, but anti-credentialism and anonymity result in uncertainty, irresponsibility, the development of cliques and the growing importance of pseudo-legal competencies for conflict resolution. Finally the paper reflects on the wider social implications of focusing on what experts are rather than on what they are for.

The unsustainable Makers:

The Makers is the latest novel of the American science fiction writer, blogger and Silicon Valley intellectual Cory Doctorow. Set in the 2010s, the novel describes the possible impact of the present trend towards the migration of modes of production and organization that have emerged online into the sphere of material production. Called New Work, this movement is indebted to a new maker culture that attracts people into a kind of neo-artisan, high tech mode of production. The question is: can a corporate-funded New Work movement be sustainable? Doctorow seems to suggest that a capitalist economy of abundance is unsustainable because it tends to restrict the reach of its value flows to a privileged managerial elite.

Aves 3D

Aves 3D is a ‘three dimensional database of avian skeletal morphology’ and it is awesome!
This is an NSF-funded project led by Leon Claessens, Scott Edwards and Abby Drake. They are making surface scans of various bones of different bird species and placing the 3D scans on the website for everyone to see and use. With simple use of the mouse or arrow buttons, one can move, zoom and rotate each image any way one wants.
The collection is growing steadily and already contains some very interesting bones from a number of species, both extinct and extant. You can see examples of bones of the dodo or the Diatryma gigantea (aka Gaston’s Bird), as well as many skulls and sternums and various limb bones of currently existing species.
The database is searchable by cladogram, scientific name, common name, skeletal element, geological era, geographical location or specimen number.
Most of the actual scanning is done by undergraduate students, and the database is already being used for several scientific projects. You can get involved and help build the database, you can use the scans for teaching and research, or you can just go and have fun rotating the cool-looking bird bones.
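(For those who like to think of such a collection in terms of data structures: below is a minimal sketch, in Python, of what a single record and a simple search might look like. The field names are merely inferred from the search criteria listed above – the actual Aves 3D schema is not described in this post, so treat every name here as a hypothetical illustration.)

from dataclasses import dataclass
from typing import Optional

@dataclass
class SpecimenScan:
    # Hypothetical record layout; fields mirror the site's search criteria,
    # not any published Aves 3D schema.
    specimen_number: str          # museum catalogue ID (made-up format below)
    scientific_name: str          # e.g. "Raphus cucullatus"
    common_name: Optional[str]    # e.g. "Dodo"
    skeletal_element: str         # e.g. "skull", "sternum", "femur"
    geological_era: str           # e.g. "Holocene"
    geographical_location: str    # where the specimen was collected
    clade: str                    # position in the cladogram
    scan_url: str                 # link to the rotatable 3D surface scan

def matches(scan: SpecimenScan, **criteria: str) -> bool:
    # Case-insensitive comparison of a record against any subset of fields,
    # mimicking a search form that lets you combine criteria.
    return all(str(getattr(scan, field, "")).lower() == value.lower()
               for field, value in criteria.items())

# Usage sketch: a dodo skull record, then a search by two criteria.
dodo_skull = SpecimenScan("MNHN-0001", "Raphus cucullatus", "Dodo", "skull",
                          "Holocene", "Mauritius", "Columbiformes",
                          "https://example.org/scans/0001")
assert matches(dodo_skull, skeletal_element="skull", common_name="Dodo")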

Frontiers of Knowledge Award goes to Robert J. Lefkowitz for G-protein coupled receptors

I had the good fortune to hear Dr. Lefkowitz speak once. Great guy. From the press release:

The prestigious BBVA Foundation Frontiers of Knowledge Award in the Biomedicine category goes this year to Robert J. Lefkowitz, MD, James B. Duke Professor of Medicine and Biochemistry and a Howard Hughes Medical Institute (HHMI) investigator at Duke University Medical Center.
This is only the second year the award has been given.
Dr. Lefkowitz’s research has affected millions of cardiac and other patients worldwide. Lefkowitz proved the existence of, isolated, characterized and still studies G-protein-coupled receptors (GPCRs).
The receptors, which are located on the surface of the membranes that surround cells, are the targets of almost half of the drugs on the market today, including beta blockers for heart disease, antihistamines and ulcer medications.
Lefkowitz, a Duke faculty member since 1973, also investigates related enzymes, proteins, and signaling pathways and continues to learn all he can about these pivotal receptors.
“I am surprised, delighted and honored by the award, and am honored to be in the company of Joan Massagué, a fellow HHMI investigator who won last year,” said Lefkowitz, who is also a Duke professor of immunology and a basic research cardiologist in the Duke Heart Center.
“While it is a relatively new award, I know it is a very distinguished award, and I am delighted to be the recipient.”
The BBVA Frontiers of Knowledge Award in Biomedicine provides the winner a cash prize of 400,000 euros (about $563,400). The award, organized by the BBVA Foundation in partnership with Spain’s National Research Council, was announced at 11 a.m. Madrid time on Jan. 27.
Dr. Lefkowitz is being awarded the prize for the work he has done since the beginning of his career and includes his ongoing studies of GPCRs and other key receptors.
His research group first identified, purified, and cloned the genes for these receptors in the 1970s and 1980s, and revealed the structure of the receptors as well as their functions and regulation. This work facilitated and fundamentally altered the way in which numerous therapeutic agents have been developed.
Lefkowitz is also extremely proud of his mentoring work and of the students and fellows he has worked with over the years, many of whom have gone on to run successful laboratories and make their own discoveries about GPCRs and other receptors.
The Biomedicine Award honors contributions that significantly advance the stock of knowledge in the biomedicine field because of their importance and originality.
The BBVA Foundation Frontiers of Knowledge Awards seek to recognize and encourage world-class research at the international level, with an annual total of 3.2 million euros given to deserving winners. They are similar to the Nobel Prizes in the breadth of the scientific and artistic areas the winners have covered during their careers.

What is ‘Investigative Science Journalism’?

Background
When Futurity.org, a new science news service, was launched last week, there was quite a lot of reaction online.
Some greeted it with approval, others with a “wait and see” attitude.
Some disliked the elitism, as the site is limited only to the self-proclaimed “top” universities (although it is possible that research in such places, where people are likely to be well funded, may be the least creative).
But one person – notably, a journalist – exclaimed on Twitter: “propaganda!”, which led to a discussion that revealed the journalist’s notion that press releases are automatically suspect and scientists are never to be trusted and their institutions even less. That was a very anti-science sentiment from a professional science journalist, some of us thought.
This exchange reminded me of a number of prior debates between the traditional Old Media journalists and the modern New Media journalists about the very definition of ‘journalism’. The traditional journalists are fighting to redefine it in the narrowest possible way that keeps them in the position of gatekeepers (like the newly proposed shield law that defines a journalist as someone who gets paid by an Old Media organization, thus NOT protecting citizen journalists, accidental journalists, bloggers, etc.), while the new ones are observing the way the world is changing and trying to come up with new definitions that better reflect it (and often go too far in the other direction – defining everything broadcast by anyone, via any medium, to an audience of more than one person as journalism, including the crossword puzzle in a newspaper and the silliest YouTube video).
One of the frequently heard retorts in the “you’ll miss us when we’re gone” genre of defensiveness by the old guard is the sleight-of-hand in which they suddenly, in mid-stream of the discussion, redefine journalism to mean only investigative journalism. This usually comes up in the form of the “who will report from the school board meetings” question (to which the obvious answer is: “actually, the bloggers are already doing it a lot, as the old media quit decades ago”).
Of course, investigative journalism is just one of many forms under the rubric of ‘journalism’. And, if you actually go and buy a copy of your local newspaper today (it still exists in some places, on tree-derived paper, believe me), you are likely to find exactly zero examples of investigative journalism in it. Tomorrow – the same. Every now and then one appears in the paper, and then it is often well done, but the occasions are rare and getting even more rare as investigative reporters have been cut from many a newsroom over the past few decades, and even more rapidly over the last several months.
So, what is ‘Investigative Science Journalism’?
So, this train of thought brought me to the question, again, of what is ‘investigative journalism’ in science. And I was not perfectly happy with what I wrote about this question before. I had to think some more. But before doing all the thinking myself, I thought I’d try to see what others think. So I tweeted the question in several different ways and got a lot of interesting responses:

Me: What is, exactly, ‘investigative science reporting’?

@davemunger: @BoraZ To me, it means going beyond looking at a single study to really understand a scientific concept. Diff from traditional “inv. journo”

@szvan: @davemunger @BoraZ And looking at methodology, statistical analysis, etc. to determine whether claims made match what was studied.

@LeeBillings: @BoraZ Re: “investigative science reporting,” isn’t it like all other investigative reporting where you dig deep and challenge your sources?

@Melhi: @BoraZ I thnk it means, “we cut/pasted from Wiki, all by ourselves.” Seems to be what it means when “scientific” is removed from the term.

Me: @LeeBillings clarify: What’s the story about? dig deep into what? who are the sources? why are you assuming they need to be challenged?

@soychemist: @BoraZ Any instance in which a reporter tries to uncover scientific information that has been concealed or distorted, using rigorous methods

@john_s_wilkins: @BoraZ Reporting on investigative science, no doubt.

@LeeBillings: @BoraZ ?s you’re asking only make sense in context of a specific story, not in context of defining “sci investigative journalism” as a whole

@LeeBillings: @BoraZ 1/2 but typically, the goal is to find out what’s true, and communicate it. you dig into primary literature & interview tons of ppl

@LeeBillings: @BoraZ 2/2 you don’t assume they need to challenged. you *know* they need to be challenged based on your in-depth research into primary lit

Me: When futurity.org was released, a journo yelled “propaganda”! Does every press release need to be investigated? Challenged?

Me: Are scientists presumed to be liars unless proven otherwise? All of them?

@NerdyChristie: Usually. Unless you’re studying how herbal tea makes you a supergod. RT @BoraZ: Are scientists presumed to be liars unless proven otherwise?

@szvan: @BoraZ Not liars but not inherently less open to bias than anyone else. Some wrongs are lies. Some are errors.

Me: Are journalists capable of uncovering scientific misconduct at all? All of those were uncovered by other scientists, I recall…

@lippard: @BoraZ Didn’t journalist Brian Deer do the investigative work to expose Andrew Wakefield’s MMR-autism data manipulation?

@JATetro: @BoraZ To be honest, there are some very good journalists out there who can spot misconduct but without backing from a source, it’s liable.

Me: @BoraZ: @JATetro yes, they need scientists to do the actual investigating, then report on what scientists discovered – fraud, plagiarism etc.

@JATetro: @BoraZ So it’s not the journalists fault, really. They do their job as well as possible but without our help, there’s little they can do.

@LabSpaces: @JATetro @BoraZ Actual scientists cost too much.They’re a luxury, and especially in these times, it’s hard for pubs. to justify having 1

@JATetro: @LabSpaces @BoraZ Apparently it’s hard for universities to have them as well…not a prof or anything but damn it’s ridiculous.

@LabSpaces: @JATetro @BoraZ I dunno, our PR dept. does a great job interacting with scientists and getting the right info out, but I guess that’s diff.

@JATetro: @LabSpaces @BoraZ Oh, the media people at the U are great. It’s the administrators that seem to forget who keep the students comin’.

Me: Isn’t investigating nature, via experimentation, and publishing the findings in a journal = scientific investigative reporting?

@LeeBillings: @BoraZ 1/2 I’d say that’s performing peer-reviewed scientific research, not doing investigative science journalism.

@LeeBillings: @BoraZ 2/2 No room to address your ?-torrent. What are you driving at, anyway? You think sci journos can’t/don’t do investigative stuff?

@LouiseJJohnson: RT @BoraZ Isn’t investigating nature, via experimentation, & publishing findings in a journal, scientific investigative reporting?

@mcmoots: @BoraZ “Journalism” usually means you report the results of your investigations to the public; scientists report to a technical community.

Me: @BoraZ: @mcmoots does the size and expertise of audience determine what is journalism, what is not? Is it changing these days?

Me: @BoraZ: Why is investigating words called ‘investigative journalism’, but investigating reality, with much more rigorous methods, is not?

@LeeBillings: @BoraZ 1 more thing: A press release isn’t a story–it should inspire journos to look deeper. Sometimes that deeper look reveals PR to be BS

Me: @BoraZ: @LeeBillings Journal article is reporting findings of investigation. Press release is 2ndary. Journo article is 3tiary. Each diff audience.

@LeeBillings: @BoraZ Glad you raised ? of audience, since relevant to yr ? of “words” & “reality.” Words make reality for audiences, some more than others

Me: @BoraZ: Journos investigate people, parse words. Scientists investigate nature. What is more worthy?

@lippard: @BoraZ I would say that there are instances of investigative journalism that have had more value than some instances of scientific research.

Me: @BoraZ: @lippard possible, but that is investigating the rare instances of misconduct by people, not investigating the natural reality. Science?

@john_s_wilkins: @BoraZ You’re asking this of a profession that thinks it needs to “give the other side” when reporting on science, i.e., quacks

@LeeBillings: @BoraZ Twitter is useful tool, but probably not best way to interview for the story you seem to be after, as responses lack depth and nuance

@LeeBillings: @BoraZ Still looking forward to reading your resulting story, of course

Me: @BoraZ: @LeeBillings you can add longer responses on FriendFeed: http://friendfeed.com/coturnix that’s what it’s for

@1seahorse1: @BoraZ Do you mean that I have to be nostalgic about my ape tribe and life in caves ? 🙂

@TyeArnett: @BoraZ parsing data can be as dangerous as parsing words sometimes

@ccziv: @BoraZ Do not underestimate or devalue the importance of words, ever.

This shows that different people have very different ideas about what ‘investigative reporting’ is and have even more difficulty figuring out how that applies to science! Let’s go nice and slow now and explore this a little bit more.
First, I think that what Dave meant in his first tweet –

@davemunger: @BoraZ To me, it means going beyond looking at a single study to really understand a scientific concept. Diff from traditional “inv. journo”

– is not ‘investigative reporting’ but ‘news analysis’ (again, see my attempt at classification), something akin to the ‘explainers’ done occasionally by the mainstream media (think of This American Life and their ‘Giant Pool of Money‘ explainer for a great recent example). It is the equivalent of a Review Article in a scientific journal, but aimed at a broader audience and not assuming existing background knowledge and expertise.
The different worlds of journalists and scientists
This discussion, as well as many similar discussions we had in the past, uncovers some interesting differences between the way journalists and scientists think about ‘investigative’ in the context of reporting.
Journalists, when investigating, investigate people, almost exclusively. Scientists are much more open to including other things under this rubric, as they are interested in investigating the world.
Journalists focus almost entirely on words, i.e., what people say. In other words, they are interested mainly in the process and in what the words reveal as to who is winning and who is losing in some imaginary (or sometimes real) game. Scientists are interested in the results of the process, obtained by any means, only one of which is through people’s utterances – they are interested in investigating and uncovering the facts.
Journalists display an inordinate amount of skepticism – even deep cynicism – about anyone’s honesty. Everyone’s a liar unless proven not to be. Scientists, knowing themselves, knowing their colleagues, knowing the culture of science where 100% honesty and trust are the key, knowing that exposure of even the tiniest dishonesty is likely The End of a scientific career, tend to trust scientists a great deal more. On the other hand, scientists are deeply suspicious of people who do not abide by high standards of the scientific community, and The List of those who, due to track record, should be mistrusted the most is topped by – journalists.
This explains why scientists generally see Futurity.org as an interesting method of providing scientific information to the public – assuming a priori, knowing the track record of these institutions and what kind of reputation is at stake, that most or all of it will be reliable – while a journalist exclaims “propaganda”.
The Question of Trust
In this light, it is very instructive to read this post by a young science journalist, and the subsequent FriendFeed discussion of it. It is difficult for people outside of science to understand who is “inside” and thus to be trusted and who is not.
Those on the “inside”, the scientists, are already swimming in these waters and know instantly who is to be trusted and who is not. Scientists know that Lynn Margulis was outside (untrusted) at first, inside (trusted) later, and outside (untrusted) today again. Scientists know that James Lovelock or Deepak Chopra or Rupert Sheldrake are outside, always were and always will be, and are not to be trusted. Journalists can figure this out by asking, but then they need to figure out whose answer to trust! Who is inside and trusted to say who else is inside and trusted? If your first point of entry is the wrong person, all the “sources” you interview will be wackos.
Unfortunately the mistrust by journalists is often ‘schematic’ – not based on experience or on investigating the actual facts. They have a schema in their minds as to who is likely to lie, who is likely to use weaselly language, who can generally be trusted, etc. They use this rule-of-thumb when interviewing criminals, corrupt cops (“liars”), politicians, lawyers, CEOs (“weaselly words”), other journalists (“trustworthy”) and yes, scientists (“suspicious pointy-heads with hard-to-uncover financial motives”).
The automatic use of such a “rule” is why so many D.C. reporters (the so-called Village) did not understand (and some still do not understand) that the politicians, who are supposed to be in the “use weaselly language” column, should actually have been in the “lying whenever they open their mouths” column for the eight years of Bush rule (or, to be fair, the last 30 years). It did not occur to them to fact-check what Republicans said, hastily move them to the appropriate “chronic liars” category, and report appropriately. They could not fathom that someone like The President would actually straight-out lie. Every sentence. Every day. Nobody likes being shown to be naive, but nobody likes being lied to either. For many of them, the need for an appearance of savviness (the opposite of naive) overrode the need to reveal that they had been lied to and fell for it (“What are you saying? Can’t be possible. They are such nice guys when they pat my back at a cocktail party over in The Old Boys Club Cafe – they wouldn’t lie to me!”). And many in their audience are in the same mindset – finding it impossible (as that takes courage and humility) to admit to themselves that they were so naive they fell for such lies from such high places (both the ruling party and their loyal stenographers). And we all suffered because of it.
The heavy reliance on such rules or mental schemas by journalists is often due to their self-awareness about the lack of knowledge and expertise on the topic they are covering. They just don’t know whom to trust, because they are not capable of uncovering the underlying facts and thus of figuring out for themselves who is telling the truth and who is lying (not to mention that this would require, gasp, work instead of hanging out at cocktail parties). To cover up the ignorance and make it difficult for it to be revealed by the audience, they strongly resist the calls to provide links to more information and especially to their source documents.
Thus He Said She Said journalism is a great way for them to a) focus on words, people, process and ‘horse-race’ instead of facts, b) hide their ignorance of the underlying facts, c) show their savvy by “making both sides angry” which, in some sick twist, they think means they are doing a good job (no, it means all readers saw through you and are disgusted by your unprofessionalism). Nowhere does that show as clearly as when they cover science.
A more systematic investigation into ‘investigation’
Now that I have raised everyone’s ire, let me calm down again and try to use this blog post the way bloggers often do – as a way to clarify thoughts through writing. I am no expert on this topic, but I am interested, I read a lot about it, blog about it a lot, and want to hear the responses in the comments. Let me try to systematize what I think ‘investigative reporting’ is in general and then apply that to three specific cases: 1) a scientist investigating nature and reporting about it in a journal, 2) a journalist investigating scientists and their work and reporting about it in a media outlet, and 3) a science blogger investigating the first two and reporting on how good or bad a job each of them did.
A few months ago, I defined ‘investigative journalism’ like this:

Investigative reporting is uncovering data and information that does not want to be uncovered.

Let’s see how that works in practice.
Steps in Investigative Reporting:
1) Someone gets a hunch, a whiff, a tip from someone, or an intuition (or orders from the boss to take a look) that some information exists that is hidden from the public.
2) That someone then uses a whole suite of methods to discover that secret information, often against the agents that resist the idea of that information becoming available to the public.
3) That someone then puts all of the gathered information in one place and looks for patterns, overarching themes, connections and figures out what it all means.
4) That someone then writes an article, with a specific audience in mind, showing to the public the previously secret information (often including all of it – the entire raw data sets or documents or transcripts) and explaining what it means.
5) That someone then sends the article to the proper venue where it undergoes an editorial process.
6) If accepted for publication, the article gets published.
7) The article gets a life of its own – people read (or listen/view) it, comment, give feedback, or follow up with investigation digging up more information that is still not public (so the cycle repeats).
Case I: Scientist
1) Someone gets a hunch, a whiff, a tip from someone, or an intuition (or orders from the boss to take a look) that some information exists that is hidden from the public.
The keeper of the secret information is Nature herself. The researcher can get a hunch about the existence of hidden information in several different ways:
– delving deep into the literature, where it becomes apparent that there are holes – missing information that nobody has reported yet, suggesting that nobody has uncovered it yet.
– doing research and getting unexpected results points one to the fact that there is missing information needed to explain those funky results.
– going out into nature and observing something that, upon digging through the literature, one finds has not been explained yet.
– getting a photocopy of descriptions of three experiments from the last grant proposal from your PI, with the message “Do this”. This is a great method for introducing high school and undergraduate students to research, and perhaps for getting a brand-new Master’s student started (of course, regular discussions of the progress are needed). Unfortunately, some PIs continue doing this to their PhD students and even postdocs, instead of giving them freedom of creativity.
2) That someone then uses a whole suite of methods to discover that secret information, often against the agents that resist the idea of that information becoming available to the public.
The scientific method includes a variety of methods for wresting secret information out of Nature: observations, experiments, brute-force Big Science, natural experiments, statistics, mathematical modeling, etc. It is not easy to get this information from Nature as she resists. One has to be creative and innovative in designing tricks to get reliable data from her.
3) That someone then puts all of the gathered information in one place and looks for patterns, overarching themes, connections and figures out what it all means.
All the collected data from a series of observations/experiments are put together, statistically analyzed, visualized (which sometimes leads to additional statistical analyses as visualization may point out phenomena not readily gleaned from raw numbers) and a common theme emerges (if it doesn’t – more work needs to be done).
4) That someone then writes an article, with a specific audience in mind, showing to the public the previously secret information (often including all of it – the entire raw data sets or documents or transcripts) and explaining what it means.
There are three potential audiences for the findings of the research: experts in one’s field, other scientists, and the lay audience (which may include policy-makers or political-action organizations, or journalists, or teachers, or physicians, etc.).
The experts in one’s field are the most important audience for most of research. The proper venue to publish for this audience is a scientific journal of a narrow scope (usually a society journal) that is read by all the experts in the same field. The article can be dense, using the technical lingo, containing all the information needed for replication and further testing of the information and should, in principle, contain all the raw data.
The scientific community as a whole is a somewhat baffling target audience – on one hand, some of its members are also experts in the field; on the other hand, all the rest are essentially lay audience. It is neither-nor. Why target the scientific community as an audience, then? Because the venues for this are GlamourMagz, and publishing in these is good for one’s career and fame. The format in which such papers are written is great for scientists in non-related disciplines – it tells a story – but it is extremely frustrating for same-field researchers, as there is not sufficient detail (or data) to replicate, re-test or follow up on the described research. Publishing this way makes you known to a lot more scientists, but tends to alienate your closest colleagues, who are frustrated by the lack of information in your report.
The lay audience is an important audience for some types of research – ones that impact people’s personal decisions about their health or about taking care of the environment, ones that can have impact on policy, ones that are useful to know by health care providers or science educators, or ones that are so cool (e.g., new fossils of dinosaurs or, erm…Ida) that they will excite the public about science.
Many scientists are excellent and exciting communicators and can speak directly to the audience (online on blogs/podcasts/videos or offline in public lectures or science cafes), or will gladly agree to do interviews (TV, radio, newspapers, magazines) about their findings. Those researchers who know they are not exciting communicators, or do not like to be in public, or are too busy, or have been burned by previous interactions with the media, tend to leave communication with the lay audience to professionals – the press officers at their institutions.
While we have all screamed every now and then at some blatantly bad press releases (especially the titles imposed by the editors), there has been a steady, gradual improvement in their quality over the years. One of the possible explanations is that scientists who fall out of the pipeline, now that there are so many PhDs and so few academic jobs, have started replacing English majors and j-school majors in these positions. More and more institutions now have science-trained press officers who actually understand what they are writing about. Thus, there is less hype, yet more and better explanation of the results of scientific investigation. Of course, they tend to be excellent writers as well, a talent that comes with love and practice and does not necessitate a degree in English or Communications.
5) That someone then sends the article to the proper venue where it undergoes an editorial process.
The first draft of the article is usually co-written and co-edited by a number of co-authors who “peer-review” each other during the process. That draft is then (2nd peer-review) usually given to other lab-members, collaborators, friends and colleagues to elicit their opinion. Their feedback is incorporated into the improved draft which is then sent to the appropriate scientific journal where the editor sends the manuscript to anywhere between one and several experts in the field, usually kept anonymous, for the 3rd (and “official”) peer-review. This may then go through two or three cycles before the reviewers are satisfied with the edits and changes and recommend to the editor that the paper be published (or not, in which case the whole process gets repeated at lesser and lesser and lesser journals…until the paper is either finally published or abandoned or self-published on a website).
6) If accepted for publication, the article gets published.
Champagne time!
Then, next morning, back to the lab – trying to uncover more information.
7) The article gets a life of its own – people read (or listen/view) it, comment, give feedback, or follow up with investigation digging up more information that is still not public (so the cycle repeats).
After Nature closely guarded her secrets for billions of years, and after intrepid investigators snatched the secret information from her over weeks, months, years or decades of hard and creative work, the information is finally made public. The publication date is the date of birth for that information, the moment when its life begins. Nobody can predict what kind of life it will have at that point. It takes years to watch it grow and develop and mature and spawn.
People download it and read it, think about it, talk about it, interact with it, blog about it and, most importantly, try to replicate, re-test and follow up on the information in order to uncover even more information.
If that is not ‘investigative reporting’ at its best, I don’t know what is.
Case II: Science Journalist
1) Someone gets a hunch, a whiff, a tip from someone, or an intuition (or orders from the boss to take a look) that some information exists that is hidden from the public.
The hidden information, in this case, is most likely to be man-made information – documents, human actions, human words. It is especially deemed worthy of investigation if some wrong-doing is suspected.
2) That someone then uses a whole suite of methods to discover that secret information, often against the agents that resist the idea of that information becoming available to the public.
As the journalist cannot “go direct” and investigate nature directly (not having the relevant training, expertise, infrastructure, funding, manpower, equipment, etc.), the only remaining method is to investigate indirectly. The usual indirect method for journalists is to ask people – a very, very, very unreliable way of getting information.
Since investigating the facts about nature is outside the scope of journalists’ expertise, they usually investigate the behavior and conduct of scientists. This is “investigative meta-science reporting”. In a sense, there is not much difference between investigating potential misconduct of scientists and misconduct of any other group of people. The main difference is that the business of science is facts about the way the world works; thus, knowing who got the facts right and who got them wrong is important, and knowing who misrepresents lies as facts is even more important.
Unfortunately, due to the lack of scientific expertise, journalists find this kind of investigation very difficult – they have to rely on the statements of scientists as to the veracity of other scientists’ facts or claims – something they are not in a position to verify directly. If they ask the wrong person – a quack, for example – they will follow all the wrong leads.
Thus, the usual fall-back is the HeSaidSheSaid model of journalism, reporting who said what, not committing to any side, not evaluating the truth-claims of any side, and hoping that the (also science-uneducated) audience will be able to figure it out for itself.
Since they cannot evaluate the truth-claims about Nature that scientists make, journalists have to use proxy mechanisms to uncover misconduct, e.g., discover other unseemly behaviors by the same actors, unrelated to the research itself. Thus discovering instances of lying, or financial ties, is the only way a journalist can start guessing as to who can be trusted, and then hope that the person who lies about his/her finances is also lying about facts about Nature – a correlation that is hard to prove and is actually quite unlikely except in rare instances of industry/lobby scientists-for-hire.
The actual research misconduct – fudging data, plagiarism, etc – can be uncovered only by other scientists. And they do it whenever they suspect it, and they report the findings in various ways. The traditional method of sending a letter to the editor of the journal that published the suspect paper is so ridiculously difficult that many are now pursuing other venues, be it by notifying a journalist, or going direct, on a blog, or, if the journal is enlightened (COI – see my Profile), by posting comments on the paper itself.
3) That someone then puts all of the gathered information in one place and looks for patterns, overarching themes, connections and figures out what it all means.
Once all the information is gathered in one place, any intelligent person can find patterns. Scientific expertise is not usually necessary for this step. Thus, once the journalist manages to gather all the information (the hard part), he/she is perfectly capable of figuring out the story (the easy part).
4) That someone then writes an article, with a specific audience in mind, showing to the public the previously secret information (often including all of it – the entire raw data sets or documents or transcripts) and explaining what it means.
The journalist’s advantage: they tend to be good with language and at writing a gripping story. If the underlying information is correct, and the conclusions are clear, and the journalist is not afraid to state clearly who is telling the truth and who is lying, the article should be good.
5) That someone then sends the article to the proper venue where it undergoes an editorial process.
The editor who comes up with titles usually screws up this step. Otherwise, especially if nobody cuts out important parts due to length limits, the article should be fine. Hopefully, the venue targets the relevant audience – either experts (who can then police their own) or general public (who can elicit pressure on powers-that-be).
6) If accepted for publication, the article gets published.
Deadline for the next story looms. Back to the grind.
7) The article gets a life of its own – people read (or listen/view) it, comment, give feedback, or follow up with investigation digging up more information that is still not public (so the cycle repeats).
Now that the information is public, people can spread it around (e-mailing to each other, linking to it on their blogs, social networks, etc.). They bring in their own knowledge and expertise and provide feedback in various venues and some are motivated to follow up and dig deeper, perhaps uncovering more information (so the cycle repeats).
Most of science journalism is, thus, not investigative journalism. Most of it is simple reporting of the findings, i.e., second-hand reporting of the investigative reporting done by scientists (Case I). Or, as science reporters are made so busy by their editors, forced to write story after story in rapid succession, stories about many different areas of science, most science reporting in the media is actually third-hand reporting: first-hand was by scientists in journals, second-hand by press officers of the institutions, and the journalist mainly regurgitates the press releases. As in every game of Broken Telephones/Chinese Whispers, the first reporter is more reliable than the second one in line, who is more dependable than the third one, and so on. Thus a scientist “going direct” is likely to give a much more reliable account of the findings than a journalist reporting on them.
There are exceptions, of course. Every discussion of science journalism brings out commenters who shout the names of well-known and highly respected science journalists. The thing is, those people are not science reporters. They are science journalists only in the sense that ‘Science Writers’ is a subset of the set ‘Science Journalists’. This is a subset in a very privileged position – they are given the freedom to write what, when, where and how they want. Thus, over many years, they develop their own expertise.
Carl Zimmer has, over the years, read so many papers, talked to so many experts, and written so many books, articles and blog posts that he probably knows more about evolution, parasites and E. coli than biology PhDs whose focus is on other areas of biology. Eric Roston probably knows more about carbon than many chemistry PhDs. These guys are experts. And they are writers, not reporters. They do not get assignments to write many stories per week on different areas of science. They are not who I am talking about in this post at all.
Do they do investigative reporting? Sometimes they do, but they choose other venues for it. When George Will lied about climate change data in a couple of op-eds, Carl Zimmer used his blog, not the NYTimes Science section, to dig up and expose the facts about the industry and political influences, about George Will’s history on the issue, about the cowardly response by the Washington Post to the uncovering of these unpleasant facts, etc.
Rebecca Skloot did investigative journalism as well, over many years, and decided to publish the findings in the form of a book, not in a newspaper or magazine. That is not the work of a beat reporter.
Case III: Science Blogger
1) Someone gets a hunch, a whiff, a tip from someone, or an intuition (or orders from the boss to take a look) that some information exists that is hidden from the public.
Bloggers are often looking for blogging materials from two distinctly different sources: the Tables of Content of scientific journals in the fields they have expertise in, and services that serve press releases (e.g., EurekAlert, ScienceDaily, etc.). They are also usually quite attuned to the mass media, i.e., they get their news online from many sources instead of reading just the local paper.
What many bloggers do and are especially good at doing is comparing the work of Case I and Case II investigative reporters. They can access and read and understand the scientific paper and directly compare it to the press releases and the media coverage (including the writings by other bloggers). Having the needed scientific expertise, they can evaluate all the sources and make a judgment on their quality.
Sometimes the research in the paper is shoddy but the media does not realize it and presents it as trustworthy. Sometimes the paper is good, but the media gets it wrong (usually in a sensationalist kind of way). Sometimes both the paper and the media get it right (which is not very exciting to blog about).
2) That someone then uses a whole suite of methods to discover that secret information, often against the agents that resist the idea of that information becoming available to the public.
Replicating experiments and putting that on the blog is rare (but has been done). But digging through the published data and comparing that to media reports is easy when one has the necessary expertise. Consulting with colleagues, on the rare occasions when needed, is usually done privately via e-mail or publicly on places like FriendFeed or Twitter, and there is no need to include quotes in the blog post itself.
Bloggers have done investigative digging in a journalistic sense as well – uncovering unseemly behavior of people. I have gathered a few examples of investigative reporting by science bloggers before:

Whose investigative reporting led to the resignation of Deutsch, Bush’s NASA censor? Nick Anthis, a (then) small blogger (who also later reported in great detail on the Animal Rights demonstrations and counter-demonstrations in Oxford).
Who blew up the case of plagiarism in dinosaur palaeontology, the so-called Aetogate? A bunch of bloggers.
Who blew up, skewered and castrated the PRISM, the astroturf organization designed to lobby the Senate against the NIH Open Access bill? A bunch of bloggers. The bill passed.
Remember the Tripoli 6?
Who pounced on George Will and WaPo when he trotted out the long-debunked lie about global warming? And forced them to squirm, and respond, and publish two counter-editorials? A bunch of bloggers.
Who dug up all the information, including the most incriminating key evidence against Creationists that was used at the Dover trial? A bunch of bloggers.
And so on, and so on, this was just scratching the surface with the most famous stories.

3) That someone then puts all of the gathered information in one place and looks for patterns, overarching themes, connections and figures out what it all means.
This is often a collective effort of multiple bloggers.
4) That someone then writes an article, with a specific audience in mind, showing to the public the previously secret information (often including all of it – the entire raw data sets or documents or transcripts) and explaining what it means.
The target audience of most science blogs is the lay audience, but many of the readers are themselves scientists as well.
5) That someone then sends the article to the proper venue where it undergoes an editorial process.
Most blogs are self-edited. Sending a particularly ‘hot’ blog post to a couple of other bloggers asking their opinion before it is posted is something that a blogger may occasionally do.
6) If accepted for publication, the article gets published.
Click “Post”. That easy.
7) The article gets a life of its own – people read (or listen/view) it, comment, give feedback, or follow up with investigation digging up more information that is still not public (so the cycle repeats).
Feedback in comments usually comes in really fast! It is direct and straightforward, and does not follow the usual formal kabuki dance that ensures that control and hierarchy remain intact in more official venues.
Other bloggers may respond on their own blogs (especially if they disagree) or spread the link on social networks (especially if they agree).
If many bloggers raise hell about some misconduct and persist in it over a prolonged period of time, this sometimes forces the corporate media to pick up the hot-potato story despite the initial reluctance to do so. But this applies to all investigative reporting on blogs, not just science.
Also, bloggers are not bound by 20th-century journalistic rules – thus exposure by impersonation, like what the conservative activists did to ACORN, is a perfectly legitimate way of uncovering dirt in informal venues, but not legit in corporate venues.
One more point that needs to be made here. Different areas of science are different!
Biomedical science is a special case. It is huge. It has huge funding compared to other areas, yet not sufficient to feed the armies of researchers involved in it. It attracts the self-aggrandizing type disproportionately. Much is at stake: patents, contracts with the pharmaceutical industry, money, fame, Nobel prizes… Thus it is extremely competitive. It also uses laboratory techniques that are universal and fast, so it is easy to scoop and get scooped, which fosters a culture of secrecy. It suffers from CNS disease (the necessity to publish in GlamourMagz like Cell, Nature and Science). It gets an inordinate proportion of media (and blog) attention due to its relevance to human health. All those pressures make the motivation to fudge data too strong for some of the people involved – very few, for sure, out of the tens of thousands involved.
On the other end of the spectrum is, for example, palaeontology. Very few people can be palaeontologists – there are not enough positions and not enough money. There is near-zero risk of getting scooped, as everyone knows who dug what out, where, and during which digging season (Aetogate, linked above, was a special case of a person using a position of power mainly to scoop powerless students). Your fossil is yours. The resources are extremely limited and so much depends on luck. Discovering a cool fossil is not easy, and if you get your hands on one, you have to milk it for all it’s worth. You will publish not one but a series of papers. The first paper is a brief announcement of the finding with a superficial description, the second is a detailed description, the third is the phylogenetic analysis, the fourth focuses on one part of the fossil that can say something new about evolution, etc. And you hope that all of this will become well known to the general public. The palaeo community is so small, they all already know. They will quibble forever with you over the methodology and conclusions (so many assumptions have to go into methods that analyze old, broken bones). It is the lay audience that needs to be reached, by any means necessary. Many palaeontologists don’t even work as university professors but are associated with museums or nature magazines, or are freelancing. The pressure to publish in GlamourMagz is there only as a means to get the attention of the media, not to impress colleagues or rise in careers.
Most of science and most scientists, on the other hand, do not belong to one of these two fields and do not work at high-pressure universities. They do science out of their own curiosity, feel no pressure to publish a lot or in GlamourMagz, do not fear scooping, are open and relaxed and have no motivation to fudge data or plagiarize. They know that the reputation with their peers – the only reputation they can hope to get – is dependent entirely on immaculate work and behavior. Why keep them suspect because two media-prominent sub-sub-disciplines sometimes produce less-than-honest behavior? Why not trust that their papers are good, their press releases correct, their blogging honest, and their personal behavior impeccable? I’d say they are presumed innocent unless proven guilty, not the other way around.
I’d like to see an equivalent of Futurity.org for state universities and small colleges. What a delightful source of cool science that would be!
Update: blogging at its best. After a couple of hit-and-run curmudgeonly comments posted early on, this post started receiving some very thoughtful and useful comments (especially one by David Dobbs) that are edifying and are helping me learn – which is the point of blogging in the first place, isn’t it?

PLoS & Mendeley live on the Web! Science Hour with Leo Laporte & Dr. Kiki (video)

Leo Laporte and Kirsten Sanford (aka Dr. Kiki) interviewed (on Twit.tv) Jason Hoyt from Mendeley and Peter Binfield from PLoS ONE about Open Access, Science 2.0 and new ways of doing and publishing science on the Web. Well worth watching!

Research Triangle Park

My regular readers probably remember that I blogged from the XXVI International Association of Science Parks World Conference on Science & Technology Parks in Raleigh, back in June of this year.
I spent the day today at the headquarters of the Research Triangle Park, participating in a workshop about the new directions the park will take in the future. It is too early to blog about the results of this session (though the process will be open), but I thought this would be a good time to re-post what I wrote from the June conference, along with my ideas about the future of science-technology parks – under the fold:

Continue reading