Acta Natura et Scientia Generative AI Use Policy
Generative AI and AI-assisted technologies, such as large language models (LLMs), chatbots, and image generators, do not meet Acta Natura et Scientia’s authorship criteria. These tools cannot be credited as authors or co-authors, nor may AI tools be cited as the authors of referenced sources.
Authors who use AI-assisted technologies for research, manuscript writing, or presentations must disclose this in the cover letter and in the acknowledgments section. The Methods section should identify the AI tool used, its version, and the specific prompts employed. Authors bear full responsibility for the accuracy, originality, and integrity of their work; they must ensure appropriate citation and mitigate potential AI-induced biases. Editors reserve the right to reject a manuscript if AI use is deemed inappropriate.
These writing-related requirements apply to the writing process only; they do not extend to AI tools used to analyze or draw insights from data as part of the research methodology, which are addressed separately below.
Authors may use AI-assisted technologies solely to improve readability and language. Such use must remain under human oversight, with careful review and editing, since AI-generated content can be incorrect, incomplete, or biased. Authors remain fully responsible for the final version of their work.
AI use in manuscript preparation must be disclosed and will be stated in the published work to ensure transparency and trust among authors, readers, reviewers, and editors. AI tools must not be listed as authors, as authorship entails human responsibilities, including ensuring integrity, approving final versions, and adhering to ethical publishing standards.
LLMs such as ChatGPT do not meet authorship criteria because accountability for content cannot be attributed to AI. Any substantive use of an AI tool should be documented in the Methods section; AI-assisted copy editing limited to grammar, spelling, and readability improvements does not require disclosure. AI must not be used for autonomous content creation or editorial work, and human authors remain fully accountable for all text revisions.
This policy will be continuously monitored and adjusted as necessary to reflect advancements in AI and publishing ethics.
Generative AI or AI-assisted tools must not be used to create, modify, or enhance images in submitted manuscripts. This includes enhancing, obscuring, moving, or removing specific features within an image, as well as introducing new elements. Adjustments to brightness, contrast, and color balance are acceptable as long as they do not obscure or remove original information. Image forensic tools may be used to detect inconsistencies.
An exception applies if AI is integral to the research design, such as AI-assisted imaging in biomedical research. In such cases, AI usage must be transparently described in the Methods section, specifying the tool, version, and application details. Authors may be required to provide pre-adjusted images for editorial assessment.
Generative AI is not permitted for artwork production, including graphical abstracts and journal cover art, unless prior editorial approval is granted. Legal, ethical, and copyright considerations must be addressed in such cases.
Given the novel copyright and integrity challenges posed by AI-generated imagery, Acta Natura et Scientia adheres to current legal and ethical publication standards. This policy will be reviewed and updated as necessary.
Authors must explicitly declare AI use in their ethical disclosures. Examples of acceptable statements include:
“The authors confirm that no generative AI was used in writing this manuscript or in creating images, tables, or graphics.”
“AI-assisted technology was not used in the preparation of this work, except for grammar and spelling checks.”
For further reference, authors should consult the guidelines of the Committee on Publication Ethics (COPE).
Peer reviewers must treat manuscripts as confidential. Uploading manuscripts or portions of them to AI tools is strictly prohibited, as this may breach confidentiality and data privacy rights.
Likewise, peer review reports must not be processed through AI tools, even for language refinement. Reviewers are responsible for the originality and accuracy of their evaluations, and generative AI must not be used to analyze or critique manuscripts, as scientific judgment requires human expertise. If an AI-assisted tool nonetheless contributes to a review in any capacity, this must be transparently disclosed in the peer review report.
Acta Natura et Scientia may use AI tools internally for tasks such as plagiarism detection and reviewer matching. These tools comply with ethical guidelines and data privacy standards.
Editors must maintain the confidentiality of submitted manuscripts and related correspondence. Uploading manuscripts or editorial communications into AI tools is prohibited due to potential breaches of confidentiality and proprietary rights.
Manuscript evaluation and decision-making require human judgment, and generative AI must not be used in these processes. Editors bear full responsibility for their decisions and communications with authors.
If editors suspect policy violations regarding AI use, they should notify the publisher. Acta Natura et Scientia employs AI-assisted technologies in compliance with ethical and legal standards for plagiarism checks and editorial efficiency.
This policy will be updated as AI capabilities evolve, ensuring responsible and ethical integration of AI in scientific publishing.