Fresh off a Nobel win, biologist Randy Schekman launches a high-profile boycott of the most prestigious academic journals, including Science, Nature, and Cell:
While they publish many outstanding papers, they do not publish only outstanding papers. Neither are they the only publishers of outstanding research. These journals aggressively curate their brands, in ways more conducive to selling subscriptions than to stimulating the most important research. Like fashion designers who create limited-edition handbags or suits, they know scarcity stokes demand, so they artificially restrict the number of papers they accept.
A paper can become highly cited because it is good science – or because it is eye-catching, provocative or wrong. Luxury-journal editors know this, so they accept papers that will make waves because they explore sexy subjects or make challenging claims. This influences the science that scientists do. It builds bubbles in fashionable fields where researchers can make the bold claims these journals want, while discouraging other important work, such as replication studies. In extreme cases, the lure of the luxury journal can encourage the cutting of corners, and contribute to the escalating number of papers that are retracted as flawed or fraudulent.
Tim Cross offers a measured endorsement:
Any scientist will tell you that Nature and Science publish a lot of excellent work. And Dr. Schekman is perhaps not the most disinterested observer: as he mentions in the article, he is the editor of eLife, an “open-access” journal (that is, one that does not charge readers) with ambitions to rival the top dogs. But working researchers will also tell you, perhaps after a few drinks, that Dr. Schekman is far from alone in thinking that the relentless focus on publishing in “high-impact” journals causes big distortions in how science is done. Many are reluctant to speak up, fearful of the damage they might cause to their careers by rocking the boat.
Going a big step further, Michael Eisen, co-founder of the open-access publisher PLOS, would do away with journals entirely:
[W]e need to listen to Schekman. Indeed we need to go one step further. While I admire what the new journal eLife is doing, where Schekman is the editor, it still rejects a lot of good papers that don’t meet the reviewers’ and editors’ standards of significance. This significance is determined by a scientist’s peers (in the case of eLife) along with editors of journals (in the case of Nature, Science, Cell and most other journals). But peer review isn’t devoid of problems. I believe that we need to dispense entirely with journals and with the idea that a few reviewers – no matter how wise – can decide how significant a work is at the time.
But Chris Lee doesn’t buy the argument:
Yes, the high-impact journals publish a lot of stuff that seems to lack substance. But the more specialist journals seem to publish a lot of rubbish, too. Maybe not so many overwrought claims of stunning importance, but there are plenty of statements of the bleeding obvious. Yes, if you find the right journal, you too can publish the finding that water is wet as original science.
The problem here runs deeper than the publishing industry and its unholy coupling to the business of doing science. It comes down to our definitions of science and scientific output. For instance, if I take the laws of physics and calculate the behavior of a light wave in some new material, that counts as science. But it’s really just a calculation. If I measure the behavior of light in the material and find that it agrees with the calculation, that counts as science. But it’s actually just finding that the laws of nature are still true. The reasons for doing these two things usually turn out to be good, and the experiments and calculations are useful. But the findings aren’t exactly startling stuff. This is the mundane nature of much of science. It has to be done, it has a good and useful purpose—but it’s not the science that we envision when we’re at school. Yet due to the way in which scientific output is measured, each and every one of those calculations and measurements has to be published.
This means that we need to have a large number of publishing venues, and since counting publications isn’t enough, they have to be ranked. Therein lies the problem. Boycotting a journal or a system is not going to help. As long as we try to reduce the sum total of scientific performance to a few numbers, the system will be gamed, and everyone will try to optimize it. It’s unavoidable.