Q: I have used ChatGPT to polish the English language quality of my paper. Do I need to mention this to the journal?

Answer:

Thank you for this important question.

Most journals allow the use of AI tools like ChatGPT to polish the English language quality of a manuscript, but you should always inform the journal that you have done so, and you should be cautious when working with such tools. Generative AI tools can "make up" information, a phenomenon known as "hallucination": when the tool does not understand how two pieces of information relate, it fills the gap with generated content without verifying its accuracy. This can introduce misleading statements into your text. Hence, you should always carefully review all content that has been edited using AI tools before submitting your manuscript to a journal.

With the advent of various AI tools, the extent to which they may be used in academic publications is often unclear. AI tools cannot be held accountable for a manuscript's content, so they cannot be credited as authors. Another concern is the confidentiality of the information supplied to the tool: some tools store input data for training, which can be a serious issue in certain contexts, such as when a manuscript contains unpublished data or sensitive participant information. You can check an article here to learn more about journal guidelines for using AI tools.

In conclusion, the judicious use of AI tools like ChatGPT can be beneficial: you can check grammar, improve concision, and even reduce the word count to meet the journal's submission guidelines. This can be especially useful for non-native speakers of English. It remains the authors' responsibility, however, to disclose such usage to the journal at the time of manuscript submission to ensure transparency.
