Policy on the use of generative AI

The journal supports the transparent and responsible use of Generative Artificial Intelligence (GenAI) tools in scientific research and publications.

The use of such tools is permitted provided that:

  • the principles of academic integrity are strictly observed;
  • the use of AI is disclosed transparently;
  • authors retain full responsibility for the research results.

In cases where generative AI has been used, authors are required to disclose this in the manuscript.

To ensure transparency, the journal recommends the use of GAIDeT (Generative AI Delegation Taxonomy) — an approach that clearly records tasks delegated to generative AI while maintaining full human responsibility for the outcomes.

The declaration must include:

  • identification of the AI tool used (name, version);
  • description of tasks delegated to AI;
  • confirmation that the authors remain fully responsible for the final results.

The declaration must be placed in the manuscript before the References section.

The journal recommends using the GAIDeT Declaration Generator for standardized reporting:
https://panbibliotekar.github.io/gaidet-declaration/

It is also recommended to cite the following publication:
Suchikova, Y., Tsybuliak, N., Teixeira da Silva, J. A., & Nazarovets, S. (2025). GAIDeT (Generative AI Delegation Taxonomy): A taxonomy for humans to delegate tasks to generative artificial intelligence in scientific research and publishing. Accountability in Research.
https://doi.org/10.1080/08989621.2025.2544331

Guidelines for authors and editors on transparent disclosure of AI contributions (GAIDeT):
https://doi.org/10.5281/zenodo.16941301


Example of a declaration:

The authors declare the use of generative AI in the research and writing process. In accordance with the GAIDeT taxonomy (2025), the following tasks were delegated to GenAI tools under full human supervision: literature search and systematization; data analysis; translation; ethical risk analysis.

The GenAI tool used: ChatGPT-5.

The authors take full responsibility for the final version of the manuscript. GenAI tools are not listed as authors and do not bear responsibility for the final outcomes.

Generative AI tools:

  • cannot be listed as co-authors;
  • cannot be held responsible for the content of publications;
  • cannot replace scientific interpretation of results.

The use of AI does not exempt authors from responsibility for:

  • data accuracy;
  • validity of conclusions;
  • compliance with ethical standards.

Peer review must be conducted exclusively by human experts.

The use of generative AI for preparing peer reviews is not permitted because:

  • it may compromise confidentiality;
  • it reduces expert responsibility;
  • it does not ensure adequate scientific evaluation.

The journal considers generative AI a supportive research tool rather than an independent research entity.

Its use must be:

  • transparent;
  • human-controlled;
  • not a substitute for authorial contribution;
  • free from risks to the reliability of scientific results.