When David Moher, the founding director of the Centre for Journalology at the Ottawa Hospital Research Institute (OHRI), proposed creating the organization, he felt that most researchers were unaware of best publication practices. Seven years on, that hasn’t changed.
▸ Established: 2014
▸ Located: Ottawa Hospital Research Institute
▸ Number of team members: 15
▸ Director: David Moher, senior scientist at the Ottawa Hospital Research Institute and professor at the University of Ottawa’s School of Epidemiology and Public Health
▸ Research areas: Predatory journals, research reporting, and researcher assessment
▸ Partnerships: Enhancing the Quality and Transparency of Health Research (EQUATOR) Network, Committee on Publication Ethics (COPE)
In 2014, the OHRI launched the center with some internal funding, aiming to shed light on metascience, the use of scientific techniques to study the research process itself. The center is now home to an array of experts with various backgrounds, including preclinical researchers, clinical epidemiologists, social psychologists, methodologists, behavioral scientists, and metrics literacy specialists. The organization has three pillars: performing research on the subfields of metascience, developing a course to teach the subject, and conducting outreach to stakeholders such as universities, researchers, journals, and policy makers.
Moher, an epidemiologist, says he first recognized the importance of training researchers on scientific best practices when he was working on systematic reviews in medicine and had to leave out large numbers of poor-quality papers.
In response, the Centre for Journalology aims to improve science by making scientific publishing more robust and rigorous.
In late 2019, for instance, the center unveiled a definition of predatory publishing. To develop it, center researchers compiled a review of papers about predatory journals to identify the common characteristics of such publications. The center then brought together 43 experts, who engaged in 12 h of discussion followed by two rounds of revision and feedback to reach consensus: “Predatory journals and publishers are entities that prioritize self-interest at the expense of scholarship and are characterized by false or misleading information, deviation from best editorial and publication practices, a lack of transparency, and/or the use of aggressive and indiscriminate solicitation practices.”
The definition “is quite important and a fairly impactful starting point for predatory journal research because in the absence of a definition of what these journals are, it’s really difficult to study them,” says Kelly Cobey, a social psychologist who is an investigator at the center.
The center is creating an online resource that it hopes will help people involved in publishing gather information about predatory journals. The center is also building a digital tool that researchers can use to authenticate journals they are considering publishing in, reading, or citing. “It would give them that information to make a more complete decision,” Cobey says.
Moher and colleagues have also played a key part in developing the Consolidated Standards of Reporting Trials, or CONSORT, statement, which contains a 25-item checklist for reporting how randomized controlled clinical trials were designed, analyzed, and interpreted. The CONSORT statement aims to address inadequate reporting of clinical trials, a long-standing issue in academia and industry. It “has definitely improved the situation, but it is certainly not perfect,” Moher says.
Moher and colleagues have also helped create other reporting guidelines, including the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA); many such guidelines are cataloged by the Enhancing the Quality and Transparency of Health Research (EQUATOR) Network, one of the center’s partners. “There’s a ton of evidence that inadequate reporting of randomized trials is associated with biasing the estimates of how good a drug or an intervention is,” Moher says. “That’s why we do it, because we don’t want to give patients and clinicians biased results.”
The guidelines have earned the endorsement of a number of journals and institutions. “Hundreds and hundreds of journals have changed their instructions to tell people to use reporting guidelines,” Moher says. “That’s an impact.”
The center has also analyzed the core competencies required of biomedical editors to evaluate papers, and it has examined policies and characteristics of different preprint servers in the medical and biomedical sciences. It also played a role in the development of the Hong Kong Principles for fostering research integrity.
Still, Moher acknowledges that the center has not fully engaged with policy makers—the research funders, journal editors, and institutional leaders who have the power to change norms. He cites the need to gather evidence before suggesting any changes to the status quo. “We have the evidence in lots of areas now, and I think what we need to do is actually try to work much closer with policy people now,” he says. Part of that effort involves putting together an open-science dashboard to help institutions keep track of their own accomplishments and shortcomings in how accessible their research is.
The center is also building collaborations with institutions to help them implement open-science practices. An example, Moher says, is the Montreal Neurological Institute, where the Centre for Journalology is auditing data-sharing practices and introducing an educational program to make data sharing the norm. If that effort is successful, “we’ll slowly start to introduce other open-science practices,” he says.
Moher says the COVID-19 pandemic has drawn attention to how much time, money, and other resources in scholarly research are wasted. Cobey adds that the pandemic has also highlighted the poor implementation of open-science policies that already exist at the federal, institutional, and journal levels, largely because those policies are not audited and researchers are not trained to follow them. “People, I think, are well-intended, but they’re not aware of how to do these things better,” she says.
The pandemic has also pinpointed the need to engage patients, who are a key part of the medical and health research ecosystem. “Patients use knowledge to inform their decision-making,” Moher says. “If that knowledge is rubbish, biased, problematic, unclear, and not transparent—that, then, is a problem.”
Going forward, Moher and Cobey say, the biggest hurdle is attracting funding in this area of research, despite the fact that metascience could improve how science is conducted across many fields. They believe that agencies should be more willing to fund such work in the interest of getting the most bang for their buck. Moher asks, “Why wouldn’t a funder want to know whether what they’re funding is of the highest possible quality?”
Dalmeet Singh Chawla is a freelance writer based in the UK.