Purpose

This document outlines the guidelines necessary to ensure students in the School of Rehabilitation Science (SRS) comply with established expectations for the responsible, ethical, effective, and safe use of Artificial Intelligence (AI) technologies during clinical placements. While AI may be used to enhance learning, it must be used in a way that protects patient privacy and complies with the code of ethics of the relevant health profession. Importantly, the goal of clinical placements is to develop clinical reasoning, problem-solving, and decision-making skills; students must not rely on AI to perform these functions. AI must never replace student clinical reasoning. 

Guiding Principles

  • Patient-Centered Care: AI technologies should only be used to enhance care quality, respecting patient well-being, preferences, and values 
  • Ethical Standards: Use of AI must adhere to professional ethics, including autonomy, beneficence, nonmaleficence, justice, and consent
  • Accountability: Students are expected to communicate with and be accountable to their Clinical Preceptor(s) for all AI use   
  • Data Privacy and Security: No patient identifiers or confidential data should be entered into AI tools. Students must comply with privacy laws and review the AI tool’s terms of use 
  • Bias and Fairness: Students must critically evaluate AI outputs for bias, inaccuracies, and lack of transparency 
  • Clinical Oversight: AI should support, not replace, clinical judgment. All AI-generated content must be reviewed and validated by the student before use 
  • Consent: If AI is used in any way that involves patient-related information, explicit consent must be obtained and documented 

AI tools may be used for idea generation or writing assistance, provided outputs are critically reviewed and comply with professional standards. 

AI Use in Patient Documentation/Charting

AI scribe technology may be used for charting only if approved by the Clinical Preceptor and only after the student verifies the accuracy of the output before it is submitted to the permanent medical record. 

Student Responsibilities

  • Ensure compliance with SRS and clinical site policies and/or guidelines, professional standards, and privacy legislation 
  • Ensure no confidential patient information is entered into AI tools
  • Critically evaluate AI-generated content for accuracy, bias, and appropriateness
  • Obtain patient consent when applicable 
  • Remain accountable to the Clinical Preceptor for all AI use   
  • Maintain transparency and integrity in all academic and clinical work 

Review and Revision

This policy will be reviewed annually and updated as needed to reflect advancements in AI technology and clinical practice standards. 

Definitions

  • Artificial Intelligence (AI): Technology designed to simulate human-like intelligence and decision-making (e.g., ChatGPT, Gemini, Claude, Microsoft Copilot) 
  • Artificial General Intelligence (AGI): AI capable of learning and applying knowledge across diverse tasks, similar to human cognition