been around for a long time in academia, but the pressure to publish is extreme nowadays. Academics are competing for a small number of grants in highly competitive environments, and the way they compete is by publishing in specific journals. We know academics are rewarded for publishing in those journals: it contributes to their next grant or their tenure case. And in some extreme cases, academics are financially rewarded, or at least their departments are, when they publish in these journals.
That contributes to people doing everything they can to get into them. You tend to prioritise publication in these journals over [maintaining] what might be considered more thoughtful research practices that are perhaps a bit slower.
If you have to compete to publish under extreme time pressure, there can be a [temptation] to cut corners. I think outright data fabrication is quite rare. It focuses attention on the issue, but the bigger problem is that academics are potentially not able to replicate their work. There is a good body of evidence showing that most studies cannot be replicated, and that’s not due to misconduct per se; it’s simply due to research practices that don’t enable replication.
Would you say there’s a need for a culture change in research? I think there is, and I’m quite heartened that this discussion is now one a lot of people are having. It’s happening at institutional level, in the public and the press, and at quite a high level amongst funders as well. The recurring theme is that if we want a culture where we can rely on the scientific literature, we have to do everything possible to enable academics to [create] that.
The main thing is that we have to reward them for good practices. For example, reward them for making their data available for anyone to scrutinise. Reward them for publishing the statistical programs they use to analyse their data. Reward them for good reporting practices. Right now, none of that is rewarded. The only thing people are rewarded for is publication in that one particular journal, so you can see why that creates a bias to publish there.
What are the consequences when, say, the media pick up a discredited study and run it as a story? I think the media have to be careful. None of these stories is simple, and I’m very much against scapegoating or holding people up as examples. When the media report a story, the most important thing is to ask what the institution is going to do next to make sure this doesn’t happen again. What processes do they have in place to make sure young researchers are supported? What processes do they have in place to make sure their researchers are rewarded for making their data available and their research reproducible?
And don’t forget [the] co-authors. They also need to be supported, because to some extent they are the unwitting victims in situations like this, [especially] when their research is shown to be accurate but they still suffer from the paper being retracted.
I think media attention on these papers should be a wake-up call for institutions to put their houses in order as much as possible.
Is it possible to create a universal system of checks and balances for research? Quite a few people are working on this right now. For example, there are established reporting guidelines for publishing particular types of [research on] humans. They come from a group called the EQUATOR Network. These people have come up with guidelines for reporting research in a way that makes it easy to see at least what was done. So that’s one initiative.
In the US, there’s a movement led by a number of organisations working to produce guidelines on transparency. That’s partly about how you report the data, but it’s also about making the data itself available.
In Australia, the Australian National Data Service has done a lot of work towards providing infrastructure to make data available, so many people are working together towards this. It’s about making sure academics have the ability to link into these initiatives, and are rewarded for doing so.
What can journals do to help stop these dodgy scientific practices? My interest is from an access point of view, but I am also the chair of the Committee on Publication Ethics, a group of editors who work together to produce best-practice guidelines for how editors handle these types of issues. What journals do in this area is pretty important as well.
The first thing they can do is have peer review in place, not just pre-publication but also processes for post-publication review if something comes to light afterwards. And have ways of correcting the literature, as the journal [in this recent high-profile case] did.
They can also make their editors and reviewers aware of all these guidelines, and collaborate with other journals. There is an interesting move now for journals to work together to produce good systems. In the past, that has led to things like the CrossCheck initiative for plagiarism screening, and initiatives such as CrossMark, which tells you which version of a paper you are seeing.
All these types of things are important for journals to collaborate on and there is an increasing amount of that going on nowadays. ■