Generative AI and AI-assisted tools ("AI Tools") can enhance research efficiency by helping authors synthesise literature, explore research trends, identify gaps, and improve writing clarity. While these technologies can support manuscript preparation, their use must remain ethical and transparent. AI Tools cannot replace human expertise, judgement, or responsibility. IJCB authors are fully accountable for their work and must observe the requirements set out below.
Responsible Use of AI Tools
Authors must review the terms and conditions of any AI Tool to protect the confidentiality of their data, including unpublished manuscripts. Personally identifiable or sensitive information should never be shared with such systems. AI-generated images must not imitate copyrighted works, real individuals, or branded products. Authors are responsible for validating all generated content and assessing it for bias or inaccuracies.
AI Tools should access user data only to provide the requested service. Authors must ensure that no rights are granted to use their materials for AI training and that no publication restrictions are imposed on AI-generated outputs.
Disclosure of AI Use
All AI involvement in manuscript preparation must be disclosed in a dedicated statement at submission, which will also appear in the published paper. This statement should identify the AI Tool, describe how it was used, and clarify the level of human oversight. Transparent reporting promotes trust and compliance with ethical publishing standards.
Routine grammar or spelling checks do not require disclosure, but any use of AI in the research itself must be described in the methods section.
Authorship Responsibilities
AI Tools cannot be listed or cited as authors. Authorship requires accountability, approval of the final version, and consent for submission; these are duties that only humans can perform. Authors must ensure originality, respect for third-party rights, and adherence to ethical standards. All listed authors must meet the authorship criteria and accept responsibility for the integrity of the work.