Abstract:
Evidence synthesists are ultimately responsible for their evidence synthesis, including the decision to use artificial intelligence (AI) and automation and for ensuring adherence to legal and ethical standards.
Cochrane, the Campbell Collaboration, JBI, and the Collaboration for Environmental Evidence support the aims of the Responsible use of AI in evidence SynthEsis (RAISE) recommendations, which provide a framework for ensuring responsible use of AI and automation across all roles within the evidence synthesis ecosystem.
Evidence synthesists developing and publishing syntheses with Cochrane, the Campbell Collaboration, JBI, and the Collaboration for Environmental Evidence can use AI and automation as long as they can demonstrate that doing so will not compromise the methodological rigor or integrity of their synthesis.
AI and automation in evidence synthesis should be used with human oversight.
Any use of AI or automation that makes or suggests judgements should be fully and transparently reported in the evidence synthesis report.
AI tool developers should proactively ensure that their AI systems or tools adhere to the RAISE recommendations, so that clear, transparent, and publicly available information is available to inform decisions about whether an AI system or tool could and should be used in evidence synthesis.