Evolving journal guidelines for use of AI



Artificial Intelligence (AI) tools can be used at different stages of academic or creative writing to edit, format, correct, and even create content. Generative AI can create text, images, other media, and synthetic data. Multiple generative AI tools exist today, including OpenAI's GPT, Google's Bard, and Meta's Llama. GPT has several versions; one of them, GPT-3.5, powers the widely known chatbot ChatGPT.

Although AI models can be great writing tools, most scientific journal publishers, including Springer Nature, Elsevier, Taylor & Francis, JAMA, and MDPI, preclude AI authorship.1–5 The primary concern across journals is accountability: authorship carries responsibility for the work, and AI tools cannot be held accountable for a manuscript. When AI tools are used, most publishers ask that their use be properly disclosed in the manuscript. Most also state that images created by generative AI should not be used in manuscripts; however, images based on non-generative AI, and generative AI-based images in some instances, will be reviewed on a case-by-case basis, with proper disclosure of any use of generative AI.1

Editors of the Journal of Product Innovation Management participated in a survey in March 2023. Although most disagreed with AI authorship, some supported it, while others were unsure.6 If other publications share similar perspectives, we might see changes to authorship rules for generative AI in the future.

One significant concern about using AI tools in writing pertains to the confidentiality of the information provided in prompts. Most AI tools store the information entered in prompts and use it as training material for their models. For this reason, some companies have banned the use of AI tools in their workplaces.7 OpenAI only recently introduced an option that prevents ChatGPT from being trained on the information users input. "If you have classified information — something that you don't want people reading, do not stick it into one of these models," said Sil Hamilton, a machine learning engineer and AI researcher-in-residence at Hacks/Hackers, in a webinar organized by the Knight Center.8

Some publishers, like Elsevier, also instruct editors not to upload any part of a manuscript to generative AI tools, as doing so can breach confidentiality, privacy, and proprietary rights.2

Simply put, AI language models can be thought of as having two parts: a knowledge base and a language base, and it is best to rely only on the language base. When people try to engage the knowledge base, the models can produce wrong information, usually because of 'hallucinations'. "Hallucinations occur when the model stretches this knowledge a little bit past what it really fundamentally knows … it's when GPT tries to fill in the blanks of its knowledge when there are gaps in its knowledge, and it doesn't quite understand that something relates in one way or another way, and it tries to just come up with a way that they might relate," explained Hamilton. Hallucinations can therefore produce false information even though the tool is not deliberately misleading us.8,9 Such issues are best addressed by avoiding content creation with AI tools altogether or, when they are used, by thoroughly fact-checking the AI-generated content.7

Like text, generative AI tools can create pictures and videos. Because image and video generation are not yet perfect, AI-generated visuals are still relatively easy to distinguish from real ones; AI-generated text, however, is becoming increasingly difficult to identify. Expert views vary: some are hopeful about finding reliable ways to identify AI-generated content, while others believe that current detection metrics will become less effective as the technology advances. Watermarking AI-generated content has also been proposed, although it is not an infallible solution. Detection will remain an ever-evolving area, and distinguishing AI-generated content may always be hard.8,10

Despite their negative stance on AI authorship and AI-driven content creation, most publishers allow authors to use AI tools to improve the language and brevity of a manuscript, with proper disclosure. This is immensely helpful for non-native speakers of the language in which the paper is written and can often help reduce the word count.

Beyond editing and refining, AI tools can offer several other benefits. These models can generate thorough summaries once they are provided with a write-up. They can also help structure an outline specific to a type of publication, or even create example text for a particular kind of writing when authors are unfamiliar with that genre. They can even help translate text from one language to another.9,11 The sketch below illustrates how such a summary might be requested programmatically.
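For readers who prefer to work with these models through an API rather than a chat interface, here is a minimal sketch of requesting a summary. It assumes OpenAI's Python client (openai version 1.x) and an API key in the OPENAI_API_KEY environment variable; the model name, prompt wording, and placeholder text are illustrative, not a recommendation of any specific tool or setting.

```python
from openai import OpenAI

# The client reads the OPENAI_API_KEY environment variable by default.
client = OpenAI()

abstract = "Paste the abstract or section you want summarized here."

response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # illustrative; any chat-capable model works
    messages=[
        {"role": "system", "content": "You summarize academic text faithfully and concisely."},
        {"role": "user", "content": f"Summarize the following text in three sentences:\n\n{abstract}"},
    ],
)

# The summary is returned as the first choice's message content.
print(response.choices[0].message.content)
```

As discussed above, any summary produced this way should still be fact-checked against the original text, and its use disclosed if it contributes to the manuscript.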

Instead of only typing prompts into a chat box, as with ChatGPT, certain AI tools also let us upload documents. We can upload a PDF to ChatPDF or Anthropic's Claude and ask questions about its contents. This is highly useful when searching for specific information in a research paper: simply uploading the paper and asking the right questions can dramatically reduce the time spent on it. A rough sketch of the same workflow done programmatically follows.
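The same question-answering workflow can be approximated in code by extracting the PDF text and passing it to a model along with a question. This is a minimal sketch, assuming the pypdf and openai Python packages; the file name, the question, and the crude truncation to fit the model's context window are placeholders. The confidentiality caveat above still applies: only upload documents you are permitted to share.

```python
from pypdf import PdfReader
from openai import OpenAI

# Extract plain text from every page of the paper (file name is a placeholder).
reader = PdfReader("paper.pdf")
paper_text = "\n".join(page.extract_text() or "" for page in reader.pages)

client = OpenAI()
question = "What sample size and statistical tests do the authors report?"

response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # illustrative model choice
    messages=[
        {
            "role": "user",
            # Crude truncation so the text fits the context window; a real tool
            # would chunk the document and retrieve relevant passages instead.
            "content": f"Answer using only the paper below.\n\nQuestion: {question}\n\nPaper:\n{paper_text[:12000]}",
        },
    ],
)

print(response.choices[0].message.content)
```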

Finally, the following tips could be used as a checklist when leveraging AI in academic writing:

  • Acknowledge AI Usage: When using AI tools like ChatGPT, clearly disclose their use in your manuscript, as you would for any other software, to ensure transparency.

  • Avoid AI Authorship: Due to accountability concerns, most journals do not accept AI as authors. 

  • Preserve Confidentiality: Refrain from inputting confidential information into AI models as they may store prompts for training. 

  • Fact-Check AI-Generated Content: Verify content created by AI tools and avoid content generation whenever possible. 

  • Enhance Language and Brevity: One of the best uses of AI tools is to improve the language quality and flow of the text, which is especially helpful for non-native speakers.

  • Leverage AI for Summaries: AI models can generate comprehensive summaries and help structure outlines for different writing genres. 

  • Document Search Using AI: Certain AI tools allow uploading documents for information retrieval. They can expedite research by analyzing PDFs. 

  • Stay Updated on Guidelines: As AI tools continue to evolve, stay informed about changing guidelines from publishers. 

References: 

1. Artificial Intelligence (AI) | Nature Portfolio. https://www.nature.com/nature-portfolio/editorial-policies/ai. 

2. Publishing ethics | Elsevier policy. https://beta.elsevier.com/about/policies-and-standards/publishing-ethics.

3. Flanagin, A., Kendall-Taylor, J. & Bibbins-Domingo, K. Guidance for Authors, Peer Reviewers, and Editors on Use of AI, Language Models, and Chatbots. JAMA 330, 702–703 (2023). 

4. MDPI's Updated Guidelines on Artificial Intelligence and Authorship. https://www.mdpi.com/about/announcements/5687. 

5. Robinson, M. Taylor & Francis Clarifies the Responsible use of AI Tools in Academic Content Creation. Taylor & Francis Newsroom https://newsroom.taylorandfrancisgroup.com/taylor-francis-clarifies-the-responsible-use-of-ai-tools-in-academic-content-creation/ (2023). 

6. From the Editors: Engaging with generative artificial intelligence technologies in innovation management research—Some answers and more questions. doi:10.1111/jpim.12689. 

7. Writing the rules in AI-assisted writing. Nat Mach Intell 5, 469–469 (2023). 

8. Generative AI: What journalists need to know about ChatGPT and other tools. (2023). 

9. The synergy of human expertise and AI: Maximizing academic writing efficiency and impact. Editage Insights https://www.editage.com/insights/the-synergy-of-human-expertise-and-ai-maximizing-academic-writing-efficiency-and-impact (2023). 

10. Tools such as ChatGPT threaten transparent science; here are our ground rules for their use. Nature 613, 612–612 (2023). 

11. Generative AI in Academic Writing. The Writing Center • University of North Carolina at Chapel Hill https://writingcenter.unc.edu/tips-and-tools/generative-ai-in-academic-writing/. 

 

 

Published on: Aug 31, 2023

