Approval & Support

AI enablement and risk support

AI use at Waterloo is evaluated through a coordinated process involving Information Systems & Technology (IST), Legal & Immigration Services (LIS), departments, and Information Stewards. Together, these groups assess risks, ensure compliance, and provide guidance on the adoption of AI tools.

When University data is involved, departments and Information Stewards determine whether a tool or use case is appropriate, supported, or requires additional safeguards. The level of review depends on the data classification and the nature of the intended use. IST and LIS contribute to this evaluation by conducting risk and security assessments and providing privacy and contractual guidance.

Because AI use touches multiple governance areas—including data classification, intellectual property, privacy, security, and procurement—these considerations should be reviewed together when deciding whether and how an AI tool may be used with University data.

The review process below ensures that any AI tool used with University data is assessed for security, privacy, and contractual risks, whether the tool is standalone, embedded, or part of an AI-enabled system.

Initiation and scoping

Define the intended use of the tool, including:

  • The data classification involved (Policy 46)
  • Purpose and business or academic needs
  • User groups and access roles
  • Data storage, processing locations, and residency

Engage the appropriate Information Steward early to confirm data classification and assess whether the proposed use is appropriate.

Information Risk Assessment and Privacy Assessment (IRA/PIA)

Submit an intake request for an Information Risk Assessment (IRA) / Privacy Impact Assessment (PIA). The form will be submitted to IST's Information Security Services (ISS) and Legal & Immigration Services (LIS). These assessments evaluate:

  • Security controls, threat exposure, and vendor practices
  • Data retention, deletion, and subcontractor arrangements
  • Privacy risks, including how data is used for training or analytics
  • Compliance with institutional, legal, and contractual obligations

A PIA is required whenever personal or identifiable information is involved.

Contracting and procurement

If University data is used, Procurement & Contract Services will negotiate and establish a contract that includes:

  • Appropriate data protection and privacy terms
  • Security and compliance obligations
  • Restrictions on data use, training, and storage

Visit the Procurement & Contract Services site to learn more.

Approval and onboarding

Once required assessments and contracting steps are complete:

  • If the use case is limited, ISS and LIS will provide a risk assessment, and the requesting unit will determine approval or provide direction for limited use.
  • If this is an enterprise-level initiative, IST will provide guidance and documentation on their website.
  • Configuration and onboarding are completed to ensure secure use.

Learn more about the IRA/PIA processes.

Ongoing review and revalidation

Approved tools are reviewed periodically to ensure continued compliance. Revalidation occurs:

  • Every five years or upon contract renewal
  • When major vendor, feature, or data-handling changes occur

Tools may be reclassified or restricted if risk profiles change.

FAQs

What is an “Approved AI tool”?

An Approved tool:

  • has a University of Waterloo contract in place, and
  • has completed an Information Risk Assessment (IRA) and, where required, a Privacy Impact Assessment (PIA)

'Approved' tools may be used with Confidential and Restricted data, and with Highly Restricted data only with explicit Information Steward approval.

Tools that have an IRA/PIA but no contract are 'Reviewed for Limited Use' tools and may be used with data that has been made public by the source.

Tools with no review are 'Unreviewed' tools and not suitable for university-owned data that has not been made public by the source.

Tools identified as unsafe or high-risk are 'Problematic' tools.

What can I use Reviewed for Limited Use tools or Unreviewed tools for?

'Reviewed for Limited Use' tools are appropriate only for the specific use case for which the tool was reviewed, or for data that has been made public by the source.

'Unreviewed' tools carry unknown risks and are not appropriate for university-owned data (i.e. Confidential, Restricted, or Highly Restricted data).
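The rules above can be summarized as a simple lookup by tool status and Policy 46 classification. The sketch below is illustrative only: the status names and classifications come from this guidance, but the function and its conservative handling of 'Problematic' tools (no data use at all) are assumptions, not an official University of Waterloo decision tool.

```python
# Policy 46 confidentiality levels, from least to most sensitive
PUBLIC = "Public"
CONFIDENTIAL = "Confidential"
RESTRICTED = "Restricted"
HIGHLY_RESTRICTED = "Highly Restricted"


def permitted(tool_status: str, data_class: str, steward_approval: bool = False) -> bool:
    """Illustrative check: may data of this class be used with a tool of this status?"""
    if tool_status == "Approved":
        if data_class == HIGHLY_RESTRICTED:
            # Only with explicit Information Steward approval
            return steward_approval
        return True  # Public, Confidential, and Restricted data
    if tool_status == "Reviewed for Limited Use":
        # Only data that has been made public by the source
        # (or the specific reviewed use case, not modelled here)
        return data_class == PUBLIC
    if tool_status == "Unreviewed":
        # Risks unknown: no university-owned data that is not public
        return data_class == PUBLIC
    # 'Problematic' (unsafe or high-risk) tools: assume no use is appropriate
    return False
```

In practice, consult your Information Steward rather than relying on a mechanical check; the reviewed use case for a 'Reviewed for Limited Use' tool, which this sketch does not model, also constrains what is appropriate.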

How do I classify my data?

Use Policy 46 confidentiality levels: 

  • Public
  • Confidential
  • Restricted
  • Highly Restricted

When uncertain, treat the information as Confidential until confirmed.

How is research data treated?

Apply the Research Data Risk Classification Framework to determine the sensitivity level of research data (low / medium / high / very high).

Ensure that safeguards are consistent with the Policy 46 confidentiality level and any ethics or sponsor requirements. The two frameworks align in their protection expectations, but their categories are not a one-to-one mapping.

What is the approval path to make an AI tool “Approved”?

Information Steward consultation → IRA/PIA with IST & LIS → Procurement & Contract Services (contract) → IST security onboarding → Unit approval

What is the difference between Microsoft 365 Copilot and Copilot Chat?

  • Copilot Chat is the conversational interface that lets you interact with your UW Microsoft 365 data through chat-style prompts.
  • Microsoft 365 Copilot (a separately licensed product from Copilot Chat) integrates AI features directly into Word, Excel, Outlook, Teams, and other applications within the UW tenant.

Both operate within UW’s secure Microsoft environment and are approved for use when you are signed in with your UW credentials.

Can I use Microsoft 365 Copilot or Copilot Chat with University data?

Yes, when using your UW Microsoft 365 account.

  • Public and Confidential data: Permitted when the required data-use permissions have been obtained.
  • Restricted data: Permitted when the required data-use permissions have been obtained and the tool is used in accordance with University security and privacy requirements. 
  • Highly Restricted data: Not permitted unless explicit authorization has been granted for that specific data use. 

Do not use personal or consumer Microsoft accounts for University data.

Can I use ChatGPT Free, Plus, or Team (personal workspace)?

These tools are 'Reviewed for Limited Use' tools and may be used only with data that has been made public by the source.

Do not enter university-owned data that has not been made public (i.e. Confidential, Restricted, or Highly Restricted data).

Does paying for ChatGPT (e.g., Plus) make it approved?

No. Personal paid plans remain 'Reviewed for Limited Use' tools suitable for data that has been made public by the source unless: 

  • UW completes an IRA/PIA for the specific tool variant, and
  • UW establishes a contract with the vendor.

What about ChatGPT Enterprise or ChatGPT EDU?

If UW procures and completes an IRA/PIA: 

  • Confidential and some Restricted data may be permitted within the approved scope
  • Highly Restricted data remains prohibited unless explicitly approved by an Information Steward 

Scope depends on the configuration and contract.

May I paste de-identified samples into Reviewed for Limited Use tools or Unreviewed tools?

Only if: 

  • the data is Public, or
  • the information is fully de-identified and the Information Steward confirms that the re-identification risk is negligible.

Where do I ask for help?

  • IRA/PIA and tool review: Submit through the IST Help Portal or contact LIS (Privacy Office).
  • Research data questions: Contact the Office of Research or your research ethics board.
  • General guidance: Consult your Information Steward or local IT support.