When a researcher is invited to review a colleague’s manuscript, they are expected to treat it as a strictly confidential document. Reviewers must not upload any portion of the submission into generative AI tools: doing so risks violating the authors’ confidentiality and intellectual property rights, and, if the manuscript contains personally identifiable information, it may also breach data privacy regulations.
This confidentiality obligation extends to the peer review report itself. Because the report may contain sensitive details about the manuscript or its authors, reviewers should not use generative AI tools to refine its language or improve its readability.
Peer review is a cornerstone of the scientific ecosystem, and Elsevier upholds the highest standards of integrity throughout the process. Evaluating a scientific manuscript is a human responsibility, one that calls for critical judgment and nuanced analysis. These demands exceed the capabilities of current AI tools, which may produce flawed, incomplete, or biased assessments. Reviewers must therefore rely on their own expertise and remain fully accountable for the content of their review.