Many scholarly publishers still don’t have policies on whether researchers can submit papers written by artificial intelligence chatbots like ChatGPT.
Jeremy Y. Ng, a metascientist who works at the Centre for Journalology at the Ottawa Hospital Research Institute, and colleagues audited the publicly available policies of 163 members of the International Association of Scientific, Technical, and Medical Publishers (STM). The researchers report in a preprint article—published before peer review—that of those 163 members, only 56 had a policy on whether authors can submit papers written by AI chatbots (medRxiv 2024, 10.1101/2024.06.19.24309148). Forty-nine of those 56 publishers required authors to declare when they used chatbots, and none allowed researchers to list AI tools like ChatGPT as an author—which has happened in some cases.
“The use of AI chatbots in academic publishing is a new and rapidly evolving space,” Ng says. “The absence of industry-wide standards or guidelines also contributes to the slow adoption, as publishers may be hesitant to implement policies without a clear framework to follow.”
Four of the publishers surveyed had an outright ban on using chatbots. But one of those, the American Association for the Advancement of Science, which publishes Science, has since reversed its ban.
The analysis revealed that 19 surveyed publishers said researchers should not cite chatbots as primary sources. Eighteen allowed the use of chatbots in research methods such as for data organization, and 33 publishers permitted researchers to use chatbots to help write non-methods sections, including the background and introduction of manuscripts. Fourteen publishers said authors could use AI to generate images, while 15 allowed authors to use chatbots to proofread manuscripts.
The American Chemical Society and the Royal Society of Chemistry, both of which are STM members, don’t permit chatbots to be listed as authors but require researchers to acknowledge when they use chatbots in preparing manuscripts. (ACS also publishes C&EN but is not involved in editorial decisions.)
“Every publisher should have a policy on the attribution and use of these automated tools because they are attractive to researchers, but their use has so many caveats,” says Matt Hodgkinson, ethics adviser at MyRA, a generative AI–powered app that suggests key themes from uploaded texts such as interview transcripts.
But Hodgkinson questions whether the numbers in the study are accurate. Some academic publishers aren’t STM members, he says, and many STM members are trade bodies and vendors rather than publishers and so won’t have policies on AI chatbots. “The authors [of the study] need to check every included ‘journal publisher,’ and accounting for these issues will seriously shift the percentages,” he says.
Hodgkinson says a key missing part of the study is the use of generative AI by peer reviewers or editors. Some studies have already found evidence that peer reviewers may be using ChatGPT. “This is a major worry for the validity and rigor of peer review,” he says.
In previous work, Ng and his team found that many authors may not be familiar with AI chatbots, perhaps because of a lack of training. Ng suspects that as a result, researchers may also be unfamiliar with publishers’ policies on generative AI. “Continuous work in this field is crucial to develop evidence-based policies that ensure ethical standards, transparency, and quality are maintained,” Ng says.
This article was updated on Aug. 14, 2024, to correct the description of the MyRA app. It was not built by the firm Clarivate.