Generative AI Tools

AI Tools Within and Beyond U of T’s Protected Environment 

When deciding whether and how to integrate generative AI into classroom environments, it is valuable to consider which tools are institutionally supported at the University of Toronto.

The resource below differentiates between two categories of generative AI tools:

  1. Inside the Walled Garden: Generative AI tools that operate within U of T’s protected digital environment, ensuring institutional support and compliance with university policies.
  2. Outside the Walled Garden: Generative AI tools that fall beyond the University’s officially supported toolkit, which may require additional consideration before use.

Within the Walled Garden

Protected Environment Supported by U of T

Outside the Walled Garden

Applications Beyond the University’s Supported Toolkit
 

Features (Within the Walled Garden):

  • Guardrails that prioritize security, privacy and vendor/provider accountability 
  • Once you are logged in with your U of T account, the protected version of Microsoft Copilot does not collect information from your prompts for training purposes
  • Ease of use/access due to institutional support, configuration for teaching and learning context
  • Equitable access for all students without user subscription fees
  • Greater predictability of services provided

Features (Outside the Walled Garden):

  • Most recent developments available from a range of platform providers
  • Access to discipline- and task-specific tools and resources
  • Additional customization possible for innovative teaching with individual instructor investment in build and support
  • Potential for embedding within other tools/platform contexts
  • Enhanced or specialized services on a subscription basis

Risks and Concerns (Within the Walled Garden):

  • Limited to the platforms and configuration options that are institutionally supported and/or vetted
  • Reduced opportunity for customization and integration

Risks and Concerns (Outside the Walled Garden):

  • Content and data privacy concerns when sharing with the platform/provider for model training or third-party use
  • Students must be provided with an alternative to using a platform that is not one of the University of Toronto’s supported tools
  • Fee for service limits equitable access
  • Unsupported tools may disappear or have terms of service or functionality changed during the term.
  • Instructor responsible for support, including compliance with all U of T policies and practices on appropriate use of information technology and information privacy and security risks

 

Example: Microsoft Copilot

 

Example: ChatGPT | OpenAI services
Visit the Other Tools tab to read more about using ChatGPT to build a course chatbot.


This content is available for download for use in a range of instructor support contexts.

This work is licensed under a Creative Commons BY-NC-SA 4.0 International License.

Current Approved Tools

Microsoft Copilot

Currently, Microsoft Copilot is the recommended generative AI tool to use at U of T. When a user signs in using University credentials, Microsoft Copilot conforms to U of T’s privacy and security standards (i.e., does not share any data with Microsoft or any other company) for use with up to level 3 data.

It is also free to use. Microsoft Copilot uses OpenAI’s GPT-4 model and performs comparably to ChatGPT. For more information about Copilot, refer to our Copilot Tool Guide. 

Microsoft Copilot is a general-purpose chatbot. With thoughtful prompting, it can be used for many teaching and learning purposes (e.g., see Ethan and Lilach Mollick’s More Useful Things site for a comprehensive prompt library). However, many other educational generative AI chatbots and tools are being developed that incorporate pre-defined prompts out of the box. 

Contact North

Contact North is a not-for-profit, Government of Ontario-supported organization that has developed two generative AI teaching and learning tools (also based on OpenAI’s GPT-4 model), both of which conform to U of T’s privacy and security standards for use with level 1 and level 2 data: 

  • AI Tutor Pro is a resource for students to build and test their knowledge 

Other tools that conform to U of T’s privacy and security standards will be referenced here as they become available.

Artificial Intelligence Virtual Tutor Initiative

The Artificial Intelligence Virtual Tutor Initiative includes faculty development activities to engage instructors in the development and administration of a Virtual Tutor in the form of a chatbot. The Virtual Tutor chatbots created through this program are tailored to course materials and leverage generative artificial intelligence (GenAI) to answer student questions, supporting teaching and learning in courses offered in the fall and winter terms of 2024-2025.

Participating instructors are engaging with support staff and a cohort of peers during the design, deployment and evaluation phases of their project to extend our shared understanding of this new domain.  

Features of the Virtual Tutor Initiative include: 
  • Participation in a virtual orientation session that includes content preparation training.
  • Interim checkpoint surveys and a final project report describing the challenges and opportunities observed during project implementation.
  • Participation in focus group and evaluation activities to provide insight on experiences that will inform institutional planning. 
  • Availability of infrastructure and staff support resources for each project team’s Virtual Tutor. 
The cohort leads for the initiative include: 
  • Michelle Arnot, Professor, Teaching, Department of Pharmacology and Toxicology, Temerty Faculty of Medicine 
  • Charlene Chu, Assistant Professor, Lawrence Bloomberg Faculty of Nursing 
  • Joseanne Cudjoe, Assistant Professor, Teaching Stream, Department of Arts, Culture and Media, University of Toronto Scarborough 
  • Emily Ho, Assistant Professor, Teaching Stream, Department of Occupational Science and Occupational Therapy, Temerty Faculty of Medicine 
  • Nohjin Kee, Associate Professor, Teaching Stream, Department of Physiology, Temerty Faculty of Medicine 


This initiative is sponsored by the Centre for Teaching Support & Innovation (CTSI) and Information Technology Services (ITS). 

If you have any questions, contact: ctsi.teaching@utoronto.ca 

Engaging with Non-Supported Generative AI Tools

Introduction

The University of Toronto recognizes the growing interest among instructors in engaging with academic uses of generative AI tools other than those currently supported or vetted by the University. Instructors may be eager to experiment with tools from OpenAI, Anthropic, Google, and others that are customized for specific teaching applications, such as chatbots grounded in course content. The aim of this guide is to outline considerations when engaging with tools that fall outside of the University’s “walled garden” of institutional support, including maximizing data privacy and security, as well as respecting intellectual property and copyright. The precautions outlined in the Tools Beyond Quercus section of the CTSI website remain a relevant framework for considering the risks and implications of engaging with non-supported tools, and this guide highlights some key points to consider for generative AI tools specifically.

Considerations

Rationale and Pedagogical Value 
  • Clarify how the non-supported AI tool supports learning outcomes or instructional strategies that cannot be achieved using supported tools alone (e.g., Microsoft Copilot).
  • For custom course chatbots, consider how your “system prompt” will allow you to customize the instructional strategy and student experience. For example, if you want a custom chatbot that will act like an expert tutor to help students build their conceptual knowledge, consider adapting Lilach and Ethan Mollick’s “Updated Tutoring Prompt”. 
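To make this concrete, a system prompt for a course-specific tutor might begin along the following lines (illustrative wording only; the course code, rules, and scope are placeholders to adapt to your own context):

```
You are a patient, encouraging tutor for ECO101 (Introductory Microeconomics).
Help students build their own understanding: ask guiding questions before
giving full answers, and check comprehension with short follow-up questions.
Discuss only topics covered in the uploaded course materials; if a question
falls outside the course, say so and refer the student to the instructor.
Do not provide answers to graded assignments.
```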
Cost and Sustainability 
  • The University does not directly cover the costs associated with the use of non-supported tools. However, faculty members may consider using funds such as the Professional Expense Reimbursement Allowance (PERA). 
  • Since tool functionality and availability are changing rapidly, consider the impact on your course design should the cost no longer be sustainable, or should the tool no longer be available.  
  • Students cannot be required to use or pay for non-supported tools.   
  • Institutional educational technology staff do not provide end-user support for non-supported tools. Instructors are responsible for supporting such tools that they choose to include in their course, including training and support for TAs and students.   
Equity, Transparency, and Accessibility 
  • Students cannot be compelled to create accounts on non-University systems or services.  
  • Non-supported tools that are to be used by students as part of course activities must be explicitly listed on the course syllabus: 
    • Clarify whether students are required to register or create an account, and let students know not to use the same password as the one they use to access UTORid-enabled services. 
    • Provide students with the context for what they are expected to do using the non-supported tool (e.g., for use as an optional, supplementary course tutor).  
    • Provide links to vendor/product statements regarding privacy and use of information, including the end-user license agreement (EULA), if applicable.  
    • If a non-supported tool is used as part of a required course activity or assessment, a viable alternative must be made available for students who do not consent to participate.  
    • Investigate whether the tool presents any accessibility barriers to students with disabilities. Consider consulting with U of T’s Accessibility for Ontarians with Disabilities Act (AODA) Office to assess whether the tool will meet your pedagogical goals for all students.  
Privacy and Security 
  • Do not share personal information or other data that resides within levels 2 to 4 of the Data Classification Standard with generative AI systems unless that sharing has been specifically authorized. 
  • Investigate if and how user data will be used by the tool’s supplier (e.g., for training AI models, for marketing, or for sharing with third-party vendors). 
  • If applicable, provide instructions for how students can engage with the tool without sharing their interaction history. For example, if you are using OpenAI’s GPT Builder, show students how to engage with ChatGPT using its temporary chat mode, and how to opt out of sharing conversation data completely. 
  • Ensure that you have reviewed and are aware of the implications of the tool’s EULA. Consider the following questions:  
    • Is personal information shared and with whom? 
    • What is the privacy policy? 
    • Where is data stored? 
  • Remind students to avoid including any personal details in prompts.  
  • Refer to the U of T Information Security department’s “Use artificial intelligence intelligently” resource for more information.  
Content Preparation 
  • If you are uploading course content or other documents to a generative AI tool (e.g., for a custom course chatbot), the content should be extensive, accurate, and well-organized. High-quality source material helps the tool generate responses that are as relevant and accurate as possible.  
  • For detailed guidance on content preparation, refer to CTSI’s “Preparing Content for Custom AI Chatbots” resource.  
Copyright and Intellectual Property 
  • Do not include any copyrighted material (including full-text library licensed e-resources) without authorization. 
  • For custom AI tools that have a share link for student use, assume that they can potentially be accessed by any user, with query results exposing your intellectual property outside of the University. Consider whether this aligns with your personal view on sharing course or domain related materials. 
  • See the U of T Library’s Generative AI tools and Copyright Considerations resource for up-to-date guidelines on copyright and intellectual property.   
Testing for Accuracy, Consistency and Bias 
  • Although uploading curated content into a custom AI tool will generally reduce the rate of factually inaccurate outputs (“hallucinations”), AI-generated text and code are not guaranteed to be correct.  
  • Test whether the tool can grasp the context of questions, understanding linguistic nuances and domain-specific elements.  
  • Test the tool with real-world questions and scenarios that students might encounter. Assess the responses and determine if they are within acceptable standards for your use case. This helps to identify the tool’s ability to provide accurate and contextually appropriate responses. Consider testing using the prompts included in CTSI’s “AI Virtual Tutor – Effective Prompting Strategies” resource.  
  • Consider testing how the tool responds to the same topic/question when phrased slightly differently. This will help assess the consistency of the responses.  
  • Through your testing, assess potential biases in the AI tool relevant to your field and plan to address them – e.g., by uploading additional content or adjusting your system prompt.  
  • Try some prompts that purposely attempt to “trick” the tool into providing inappropriate responses, producing “hallucinations”, or going off topic, to see how it responds. 
  • Some tools may have a setting to enable content moderation/filtering for internet-enabled searches (e.g., block profanity, hate language, etc.). 
  • Students should be reminded that AI tools should not be treated as an “authoritative source” – for example, students should continue to refer to the course syllabus directly for key dates and course policies.  
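The consistency checks above can be scripted. The sketch below is a minimal Python illustration; the `ask` function is a hypothetical stand-in for however your tool is actually called, and the word-overlap score is a deliberately crude proxy for agreement between responses:

```python
def ask(prompt: str) -> str:
    """Stand-in for your chatbot; replace with a real call to your tool."""
    canned = {
        "elastic": "Demand is elastic when quantity demanded responds "
                   "strongly to a change in price.",
    }
    for keyword, answer in canned.items():
        if keyword in prompt.lower():
            return answer
    return "I'm not sure; please check with your instructor."

def overlap(a: str, b: str) -> float:
    """Crude word-overlap score between two responses (0.0 to 1.0)."""
    wa, wb = set(a.lower().split()), set(b.lower().split())
    return len(wa & wb) / max(len(wa | wb), 1)

# The same question phrased three ways; a consistent tutor should
# give broadly similar answers to each.
paraphrases = [
    "What does it mean for demand to be elastic?",
    "Explain the idea of elastic demand.",
    "When is demand considered elastic?",
]
responses = [ask(p) for p in paraphrases]
for question, response in zip(paraphrases, responses):
    print(f"{question}\n  -> {response}")

# Flag any answer that diverges sharply from the first one.
for response in responses[1:]:
    if overlap(responses[0], response) < 0.5:
        print("Inconsistent response detected:", response)
```

A harness like this will not judge correctness for you, but it makes rephrasing tests repeatable as you refine your instructions and uploaded content.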

Example – Creating a Virtual Tutor using GPT Builder

The following is a step-by-step guide on creating and configuring a custom AI chatbot that acts as a virtual tutor using OpenAI’s GPT Builder:  

Step 1: Access GPT Builder 
  • Create or log in to your ChatGPT Plus account at https://chat.openai.com. 
  • In the left sidebar, click on “Explore GPTs”. 
  • Under “My GPTs”, click the “Create a GPT” button. 
Step 2: Provide Initial Instructions 
  • In the “Create” tab of GPT Builder, you will be asked “What would you like to make?” – this initiates a dialogue to determine specifics about your GPT’s purpose and behavior based on your initial prompt.
  • For example, your initial prompt might be: “Create an expert tutor to help students in ECO101 understand basic microeconomic concepts”. 
  • GPT Builder will then respond with specific questions about the name, logo, and purpose of the GPT. Iterating several times with the GPT Builder through natural conversation is recommended to provide the model with a clear understanding of what it should be doing.   
Step 3: Refine Your Instructions 
  • Switch to the “Configure” tab for more advanced customization options.  
  • Under “Instructions”, review the virtual tutor instructions that were generated by your dialog with the GPT Builder to ensure that it is consistent with your instructional objectives.  
  • Consider adapting Lilach and Ethan Mollick’s “Updated Tutoring Prompt”. 
  • You may also want to add to the instructions; e.g., “Do not answer any questions or engage in dialogue about any topic other than…” 
Step 4: Upload Knowledge Files 
  • Under “Knowledge”, you can upload up to 20 files containing information you want your GPT to reference.  
  • Do not include any copyrighted material (including full-text library licensed e-resources) without authorization. 
  • Refer to the U of T Library’s Generative AI tools and Copyright Considerations for more information about copyright.   
  • Do not include material that contains any sensitive personal information about individuals.  
  • In the “Instructions” section of GPT Builder, include “When referencing materials from the “Knowledge” section, provide a full citation to ensure proper attribution.”  
Step 5: Enable Relevant Capabilities with Caution 
  • In the “Capabilities” section, you have the option to enable “Web Browsing”, “DALL·E Image Generation”, and “Code Interpreter & Data Analysis”. 
  • Disabling “Web Browsing” will prevent the virtual tutor from accessing up-to-date information on the internet that is not included in its training data or your uploaded content. This may or may not be desirable, depending on your objective.  
  • Be aware the code interpretation capability will allow users to download your uploaded content in its native format.  
Step 6: Configure Data Sharing Settings 
  • In the “Configure” tab of GPT Builder, scroll down to “Additional Settings”. 
  • You may uncheck the box for “Use conversation data in your GPT to improve our models”. This will prevent the content of user conversations with your virtual tutor from being used by OpenAI for model training and improvement.  
  • It is important to note that the privacy settings of your virtual tutor users will take priority over the setting above: if a user chooses to share their chat history to improve model training and development, this will expose their own chat history with your virtual tutor regardless of the setting you choose.   
Step 7: Test and Refine 
  • Use the “Preview” panel to test out your virtual tutor and see how it responds. 
  • Iteratively update your instructions and settings based on its performance. 
  • Monitor for potential biases or inaccuracies and refine your instructions or uploaded files if necessary. For example, you might notice through testing your virtual tutor that a lecture transcript you uploaded contains errors.  
Step 8: Publish Your Virtual Tutor 
  • Once finalized, select the “Save” button and choose your sharing settings. 
  • Selecting the “Anyone with the link” option will allow students to access your chatbot but won’t enable public browsing.  
  • Selecting “GPT Store” will publish your virtual tutor to OpenAI’s public “Explore GPTs” page.  
  • Select “Confirm” to complete the publishing process.

How Generative AI Works

Are you interested in learning how generative AI works? A good introduction is provided by the Schwartz Reisman Institute for Technology and Society’s “What are LLMs and generative AI? A beginner’s guide to the technology turning heads”. 

The responses you receive from a generative AI tool depend on the prompts you enter, and the further refining of these prompts, which takes practice. As Ethan Mollick said, “The lesson is that just using AI will teach you how to use AI.” (Working with AI: Two Paths to Prompting) To get started, we recommend you consider the following. 

Be clear about what you want. Include detailed information in your prompt, including the desired format: “Write a paragraph about…”, “Create an image containing…”. Suggest a particular style (e.g., an academic essay or lab report) and specify the information you want included (e.g., provide an outline or ordered steps in the prompt).    

  • If you’re not sure how to describe the style you want to emulate, the Wharton School at the University of Pennsylvania suggests pasting in a text example you like and asking the tool to describe the style. Use that description in your own prompt for style.   
  • To learn more about prompt writing, see our Tool Guide under “How can I prompt with Copilot?”.
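Putting these suggestions together, a prompt that specifies format, style, and content might look like the following (an illustrative example only; the topic and structure are placeholders):

```
Write a 150-word introduction to an academic essay on the causes of urban
air pollution. Structure it as follows:
1. One sentence stating the thesis.
2. Three sentences, each naming one cause (traffic emissions, industrial
   activity, residential heating) with a brief explanation.
3. A closing sentence previewing the essay's argument.
```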

Be critical.

  • Does the tool output meet your needs? What additional information is required? Generative AI is an interactive tool. Try different options and prompts to gauge the results, clear the prompt screen and try again. You will learn to refine your prompts and better discern what is most effective with practice.   
  • Generative AI tools can provide quick results that may appear correct, but looks can be deceiving. Tools such as ChatGPT can produce hallucinations or misleading and factually incorrect text. As with any text or visual analysis, we need to examine the results with a critical eye.