Scientific journals ban ChatGPT from being listed as a co-author

Publishers of thousands of scientific journals have banned or restricted contributors from using ChatGPT out of concern that it could “ruin” scholarly literature with flawed, even fabricated research.

ChatGPT, the cutting-edge chatbot developed by the artificial intelligence company OpenAI, has impressed users with poems, short stories, essays, and even personal advice since its launch in November 2022.

However, while ChatGPT has proved a remarkable success, there are still concerns that the program can also produce fake scientific summaries convincing enough to fool reviewers.

ChatGPT has already been credited as a co-author on a number of articles, and this prompted publishers to step in. According to The Guardian, Holden Thorp, Editor-in-Chief of the leading American scientific journal Science, announced a new editorial policy on February 2, banning the use of text from ChatGPT and clarifying that the tool cannot be listed as an author or co-author.

“Given the frenzy forming around the matter, we will not allow ChatGPT to be an author or to have its text used in articles,” Mr. Thorp said.

According to Mr. Thorp, leading scientific journals often require authors to sign a statement taking responsibility for their contribution to the research. Since ChatGPT cannot do this, it cannot be an author.

The editor-in-chief of Science believes that even using ChatGPT to prepare a scientific paper is problematic. ChatGPT makes many mistakes, he said, and if scientists rely on the AI program to prepare manuscripts, review the literature, or summarize findings, the relevant context of the work and the careful scrutiny of results could be lost.

Other publishers have made similar changes. Springer Nature, which publishes nearly 3,000 journals, updated its guidelines earlier this week to state that ChatGPT cannot be listed as an author. But Springer Nature does not ban ChatGPT outright: it and similar tools can still be used in the preparation of articles, as long as full details are disclosed in the manuscript.

“The sudden appearance of the ChatGPT tool as a co-author made it very important for us to respond,” said Magdalena Skipper, Editor-in-Chief of the multidisciplinary scientific journal Nature.

However, Ms. Skipper believes that ChatGPT and similar AI tools can benefit science with the right safeguards, especially by leveling the playing field for researchers whose native language is not English.

Elsevier, which publishes about 2,800 medical and scientific journals, takes a similar stance to Springer Nature. Elsevier’s guidelines allow AI tools to be used “to improve the readability and language of a research paper, but not to replace the main tasks an author should perform, such as interpreting data or drawing scientific conclusions. Authors must declare whether they have used AI tools and how they were used.”

According to Mr. Michael Eisen, Editor-in-Chief of the scientific journal eLife, ChatGPT cannot be an author, but its use is inevitable. “I think the better question is not whether to allow ChatGPT or not, but how to manage the fact that it is being widely used. The most important thing, at least for now, is that authors must be upfront about its use and describe how it was used.”

Meanwhile, Sandra Wachter, professor of technology at the University of Oxford (UK), said: “It’s good to see publishers take action. ChatGPT will be especially worrying if its suggested content goes unchecked. This can lead to misinformation and create a junk-science environment. I think many other areas, such as education, the arts, and journalism, will have to consider similar steps, because they face similar challenges.”
