The first experiment to fully model the dynamics of academia's fiercely competitive peer review system has concluded that it increases creative diversity and innovation, but incentivises researchers to mark their rivals down, meaning that high-quality research goes unpublished.
It adds to doubts over the current system, with one of the authors of the new paper urging journals to experiment further with post-publication peer review, where papers are assessed openly after being made public.
To test what happens when creators assess the work of their rivals, in an attempt to model the peer review system, researchers asked 144 participants in Zurich to come up with works of art on a computer, which were then judged by their fellow participants. This was repeated 30 times, and depending on how well they were rated, participants won up to SFr80 (£61).
Using sliders, participants were able to modify a picture of a face, or transform it beyond recognition into an abstract image.
Stefano Balietti, a postdoctoral researcher at the Network Science Institute at Northeastern University in Boston and one of the authors of the paper, said that participants "gradually moved towards more abstract art, a bit like what happened in the history of art".
The researchers then sourced tens of thousands of online reviews of the artworks to come up with as objective a measure of their quality as possible.
They found that when participants were in competition with each other, and had to get high scores from their competitors to win money, they produced more diverse, innovative and abstract work in an attempt to mark themselves out from competitors.
"But more diverse does not mean better," cautioned Dr Balietti. The experiment also discovered that competition between the participants made them likely to behave "strategically", marking down rivals to boost their own chances — a common fear about the peer review process.
Competition led to higher rejection rates but did not mean that higher quality art was published, the study found. Dr Balietti said this was most likely because game-playing among participants meant that otherwise good artwork was penalised.
"Our results could explain why many ground-breaking studies in science end up in lower-tier journals," concludes the paper, published in the journal PNAS.
The results feed into the debate about rejection rates, which at some high-profile journals can be extremely high, making competition for inclusion intense.
Dr Balietti said that the results of his experiment indicated that academia faces a trade-off: competition leads to a greater diversity of papers but also means that otherwise good research may not be published.
When tackling a problem such as climate change, he said, "more competition might not be the best solution. We might by mistake reject the one good solution."
The paper argues for "career schemes that tolerate early failure and focus on long-term success" as a way to encourage innovation in research. Dr Balietti also argued that more journals should "experiment" with post-publication peer review.