The case for crowd peer review

The email arrived midmorning on a Friday. “Thank you for your participation in our crowd reviewing (CR) program!” it read.
Cheery and succinct, the message notified more than 80 expert reviewers that a manuscript was waiting for them, ready to be critiqued and considered for publication in the journal Synlett. A link within the email led the potential reviewers to a website called Filestage. There, they could click on any portion of the manuscript to pop up a box to compose comments.
Half an hour after the message hit reviewers’ in-boxes, pale green circles dotted the screen where eager readers left suggestions and edits. Over the next four days, more than a dozen of the original 80 invitees anonymously debated the new chemical synthesis presented in the manuscript. They commented on the reaction’s novelty, questioned the molecular mechanisms proposed by the authors, suggested control experiments, pointed out spectra of samples with subpar purity, flagged typos, and most crucially, recommended rejection or acceptance. Once the review period closed, an editor weighed the reviewers’ comments and decided the manuscript’s fate.
In traditional peer review, a manuscript is distributed to just a handful of anonymous reviewers, and each volunteer writes a formal, independent critique of the paper. That process can take several months; crowd review can take as little as one week.
“I’m absolutely convinced crowd reviewing could revolutionize scientific manuscript and grant-proposal reviewing in the future,” says Synlett’s editor in chief, Benjamin List, who developed the novel crowd-reviewing system about two years ago in conjunction with Denis Höfler, his former Ph.D. student at the Max-Planck-Institut für Kohlenforschung.
Since launching the experimental review system in the spring of 2017, Synlett has crowd reviewed 115 manuscripts and accepted 70 of them, making it one of the few chemistry journals, along with Thieme sister journal SynOpen and Copernicus Publications’ Atmospheric Chemistry & Physics, to move away from traditional peer review. Some chemists have embraced the shift, opting to have their manuscripts reviewed by the crowd. Authors tend to appreciate the speed with which crowd review can accept a paper, while reviewers like how the system lessens the burden on individual participants. Nevertheless, many researchers aren’t so sure about it, and few are clamoring to abandon traditional peer review. For most chemists who have never experimented with any type of alternative peer review, the concept of crowd review sounds alien, mysterious, and perhaps unnecessary.
To shed some light on crowd review, C&EN observed papers being evaluated firsthand and talked with editors, reviewers, and authors who have participated to get their take on whether the new system could change chemistry publishing.
At Synlett, the process starts with selecting a crowd. When launching the new system, List and other editors initially invited more than 100 synthetic chemists from their trusted networks to join. Since then, the editors have refreshed the crowd twice a year, replacing inactive members with new ones, some of whom asked to join, intrigued by the experiment. For particular manuscripts, editors sometimes add guest reviewers with special expertise to the crowd to help with the evaluation.
Overall, Synlett’s reviewer group now hovers around 80 members and includes scientists in industry, academic faculty of all levels, postdoctoral researchers, research associates, and even graduate students. Graduate students traditionally aren’t invited to review manuscripts, though many do reviews as a training exercise or as behind-the-scenes substitutes for busy faculty. Manuel van Gemmeren, a chemist at the University of Münster who recently took over from Höfler as Synlett’s crowd-review editor, says that the journal’s crowd of reviewers is 81% male and 19% female. The crowd’s career levels suggest the average member is about 35 years old, he says.
One crowd reviewer, a research associate at a U.S. university, says the biggest benefit of being a crowd reviewer is the reduced workload. The two to four days a crowd reviewer gets to evaluate a manuscript is a shorter window than at other journals (even “rapid communication” journals, like the American Chemical Society’s Organic Letters, have five-to-seven-day review periods), but a reviewer who is busy can simply pass on a manuscript, the reviewer says. C&EN agreed not to disclose the identities of Synlett reviewers to preserve the anonymity of the reviewer pool.
Manuscripts for crowd review arrive about once a week, and by the time a reviewer gets to a manuscript, it may already have lots of comments, the U.S. research associate says. Commenters who get to a paper first often leave longer comments; editors encourage those who follow to simply write “agree” instead of rephrasing the same point. In that case, later reviewers might focus instead on another part of the paper or the supplementary material, essentially splitting up the work. The researcher estimates that they spend about one to two hours total reviewing each manuscript.
“Overall, the thoroughness seems about right,” says another crowd reviewer, who is an organic chemistry lecturer based in the U.K. The lecturer estimates that they comment on about half of all incoming manuscripts.
One advantage of the crowd is that it helps deter unfair reviews, which can occasionally happen in traditional peer review, the U.K. lecturer says. The crowd acts as a “free market of ideas” and generally comes to a sensible consensus, the U.K. lecturer adds. Disagreement regularly crops up over reviews, but it’s generally polite, likely because comments are open to the crowd rather than private. In the past two years, Höfler says, only two reviewers have been removed for leaving unconstructive feedback.
The general feedback from the crowd, authors say, isn’t so different from reviews received via traditional peer review. Oliver Reiser, an organic chemist at the University of Regensburg, submitted a manuscript last year that was refereed by crowd review. The crowd’s comments had about the same level of scientific detail as those obtained from traditional review, Reiser says. The responses were just shorter and less formal.
Earlier this year, synthetic chemist J. D. Tovar of Johns Hopkins University opted for combined crowd and traditional review, another option offered by Synlett. He was surprised that some crowd-review comments provided more detail than those from a typical review. Commenters flagged small errors, such as switched numbers, that copyeditors wouldn’t necessarily have caught, he says.
That the crowd is limited to vetted experts and not just anyone with an internet connection is what convinced Tovar to try crowd review, he explains. Both Reiser and Tovar say that curiosity influenced their decision to test the crowd-reviewing option.
Christopher Vanderwal, an organic chemist at the University of California, Irvine, whose paper went through Synlett’s crowd review unintentionally because he forgot to opt out before the deadline, is slightly more skeptical of the process.
In addition to four traditional review reports, Vanderwal received a 76-page document compiling the crowd’s comments on his manuscript and supplementary material. Stunned, he emailed the editor managing his manuscript about what was required. The editor assured him that the journal didn’t expect a response to every single critique. After getting over the shock of the huge PDF, Vanderwal had a positive overall experience, he says, largely because the editor handled the situation well and the comments ended up improving the paper.
List, Synlett’s editor in chief, says authors are expected to make a reasonable effort to improve their manuscripts, and sometimes that may mean many changes. But editors are careful not to overburden authors, he says, and to help them prioritize the most important changes. With crowd review, “there’s a fair amount of governance and moderation by the editors,” says the University of Reading’s Laurence Harwood, editor in chief of SynOpen, an open access synthetic chemistry journal that began offering crowd peer review as the default refereeing option in August. The journal uses the same Filestage platform as Synlett but draws on a separate, smaller crowd of about 35 experts.
Harwood, who is also an editor at Synlett, was dubious about crowd review at first. It wasn’t until he saw the system in action that he was won over by the responsiveness of the crowd. A longtime journal editor, Harwood says that many times he would receive conflicting reviewer recommendations, sometimes with only a line or two of justification. “Then I’ve got to decide which way to jump or send the manuscript out to more referees,” he says, which takes even more of his time. With crowd review, Harwood says, editors can see the general consensus while disregarding outliers if appropriate.
Philippa Cranwell, also at the University of Reading and SynOpen’s crowd-review editor, makes the initial decision to accept or reject a manuscript on the basis of the crowd’s comments. She says the ideal window for gathering comments from the crowd is two days: one day isn’t enough time for substantive comments to accumulate, whereas four days can leave authors with an overwhelming number of comments. Synlett’s van Gemmeren says four-day review periods are typical when weekends are included.
Another open access journal, Atmospheric Chemistry & Physics (ACP), has combined crowd review with traditional review since 2001. Editors send manuscripts to two reviewers and simultaneously post the articles online for open discussion for eight weeks. Anyone who registers on the website can comment. The reviewers’ reports are also published online, sometimes signed with the reviewer’s name, a practice many researchers in the life sciences advocate.
A 2012 study reported that about one in five ACP articles receives open-discussion comments, but because a handful of articles get lots of comments and skew the results, a typical paper’s comment rate is likely lower (Front. Comput. Neurosci. 2012, DOI: 10.3389/fncom.2012.00033). The most commented paper on the site, for instance, is a 2016 study on climate change, which received more than 100 comments. Of the 20,000 comments that had been posted on the site over the years, only a handful had been deleted because of inappropriate wording, according to the 2012 study. ACP Chief Executive Editor Ulrich Pöschl says that unlike a blog’s comment section, which can get overrun by trolls, ACP’s more formal commenting system, which generates a separate document for each comment, deters such scenarios.
Elizabeth A. Stone, an analytical chemist at the University of Iowa, has published 20 papers in ACP. She says the crowd-review comments she’s received have typically offered relevant and tangible feedback, most likely from someone working in the same field. While Stone says ACP’s main draw is the fact that it’s open access, she also likes being able to read her reviewers’ reports as soon as they come online. The real-time reviews let her start thinking about how to address the critiques as soon as they’re posted instead of waiting for all the reviews to be collected by an editor. Other researchers agree that open access plays a larger role than the mixed peer review in their decision to publish with ACP. They add that the posted review reports can provide useful additional information for scientists who are particularly interested in the paper.
But some chemists still seem unsure whether alternatives to traditional peer review are necessary. While the existing system isn’t perfect, Tovar says, the reviews he’s received have generally been sensible. Vanderwal agrees that traditional reviews have been mostly fair over his career but admits that others may feel differently. The U.K. lecturer, for instance, believes that traditional peer review is not working.
It may be too early to tell whether crowd review will catch on. Authors who spoke with C&EN emphasized that they had been through Synlett’s crowd-review experience just once. Reiser says he’d be open to submitting another manuscript to the crowd in the future, citing the fast turnaround as the main benefit. Though Vanderwal still has doubts, he appreciates that people are trying new approaches. “We’re all experimentalists at heart,” he says.
Crowd-review editors say the system could potentially work for any journal, even interdisciplinary ones, as long as reviewers with the right background are in the crowd. “Ultimately, the goal is to publish better papers,” List says. He concedes that shifting attitudes may take a long time, perhaps another generation. Right now he asks that those with reservations keep an open mind: “I would say they should just give it a chance. Try it with one manuscript, and you will be surprised how well this works.”