WORKFORCE campusreview.com.au
Rank and guile
Debate has long raged over journal rankings systems and their sway over academic careers; here, a panel assembled from across the seniority spectrum gives arguments for and against the status quo and discusses strategies for success.
JASON SHARMAN PROFESSOR OF INTERNATIONAL RELATIONS, GRIFFITH UNIVERSITY
Whether it is for hiring, promotion, allocating scarce grant funds or any number of other purposes, there is a need to assess the quality of academics’ published research. There are other ways of allocating such rewards in academia – by family connections or friendship networks, for example – but these do not meet meritocratic criteria.
From the inception of the Howard government's Research Quality Framework until shortly before the 2012 Excellence in Research Australia (ERA) round, journal rankings were used for this purpose. These rankings aroused much controversy at the time and since, but I believe they are a reasonably fair and objective way of assessing our scholarly research, relative to the available alternatives.
This 'relative to' proviso is important. Academics will and should be assessed on their research; the question is how.
The old system in Australia, which continues in some aspects of the funding formula, was simply to count the number of publications (books counting for five articles or chapters). This system was crazy and destructive in creating strong incentives to produce a large volume of mediocre work. It doesn't pass the laugh test overseas, nor should it in Australia. Critics of journal rankings should think about whether they really want to go back to that system. In the absence of citation measures and journal rankings for many fields, ERA essentially leaves everyone (including the assessors) guessing as to what counts as a good, bad or indifferent publication. As a result, where we should have assessments that are transparent, replicable and accountable, we instead have one that is opaque, ad hoc and unaccountable.
Internationally, academics are assessed according to informal journal rankings. In the US, at least in my field (political science), academics at leading universities must publish with a few top journals and book presses to get tenure. Publications outside this charmed circle count for nothing, or are perhaps even a net negative. The UK has the Research Excellence Framework, which ostensibly depends on a committee in each discipline reading nominated publications in order to score departments. In practice, however, it is a fairly open secret that both nominating departments and assessors use journals as a proxy for quality.
People will inevitably use shortcuts in assessing the quality of academics’ publications. Given this fact, we should strive for those that are public, produced by deliberation within the field, and can be applied to all equally. The alternatives encourage mediocrity, or serve to entrench the power of privileged insiders in a position to dispense patronage.
MARK CHOU ASSOCIATE PROFESSOR OF POLITICS, AUSTRALIAN CATHOLIC UNIVERSITY
For the first few years of my academic career, I largely ignored journal rankings. My approach was simply to publish in the journals I read and respected without much regard for how they ranked. In retrospect, I did this to my own detriment.
Rankings and metrics exert a disproportionate influence over an academic's career prospects. In some ways, where you publish has become more important than what you publish. This is particularly the case for early-career academics. One of the first things a hiring committee will do is assess an applicant's publication track record against journal metrics. How many A* or Q1 publications does the applicant have? Conversely, how many of the applicant's publications are placed in unranked outlets? Even at non-Go8 universities, which have traditionally emphasised teaching over research, rankings and metrics are increasingly used to reward and punish staff.
But playing the rankings game comes with its own frustrations. For instance, anyone familiar with the now-defunct ERA journal ranking, SCImago's journal rank or the Journal Citation Reports will know that rankings tend to fluctuate, sometimes yearly. Journals can be ranked as Q1 or A one year only to be classified as Q2 or C the following year. Researchers are now encouraged to place publications in appropriately ranked outlets. Yet this becomes tricky when what's