draft-genai-tools-sessions-examples
Note: The elements in this draft are not fully checked or sanitized. Please be mindful when reusing.
Recording: Previewing Cogniti: Creating AI Virtual Tutors for Your Courses
On June 19, 2025, the Academic, Research & Collaborative Technologies team (ARC) hosted an online session introducing Cogniti. The session demonstrated Cogniti's capabilities and Quercus integration for instructors interested in AI tutoring.
Watch the Session Recording | Download the Presentation Slides
AI Tools Within and Beyond U of T’s Protected Environment
When deciding whether and how to integrate generative AI into classroom environments, it is valuable to consider which tools are institutionally supported at the University of Toronto.
The resource below differentiates between two categories of generative AI tools:
- Inside the Walled Garden: Generative AI tools that operate within U of T’s protected digital environment, ensuring institutional support and compliance with university policies.
- Outside the Walled Garden: Generative AI tools that fall beyond the University’s officially supported toolkit, which may require additional consideration before use.
Within the Walled Garden
Features:
- Guardrails that prioritize security, privacy, and vendor/provider accountability
- Once logged into your U of T account, the protected version of Microsoft Copilot does not collect information from users’ prompts for training purposes
- Ease of use/access due to institutional support and configuration for the teaching and learning context
- Equitable access for all students without user subscription fees
- Greater predictability of services provided
Risks and Concerns:
- Limited to the platform and configuration options provided by institutionally supported and/or vetted tools
- Reduced opportunity for customization and integration
Outside the Walled Garden
Features:
- Most recent developments available from a range of platform providers
- Access to discipline- and task-specific tools and resources
- Additional customization possible for innovative teaching, with individual instructor investment in build and support
- Potential for embedding within other tools/platform contexts
- Enhanced or specialized services on a subscription basis
Risks and Concerns:
- Content and data privacy concerns when sharing with the platform/provider for model training or third-party use
- Students must be provided with an alternative to using a platform that is not one of the University of Toronto’s supported tools
- Fee-for-service models limit equitable access
- Unsupported tools may disappear, or have their terms of service or functionality changed, during the term
- Instructor is responsible for support, including compliance with all U of T policies and practices on appropriate use of information technology and on information privacy and security risks
Example: ChatGPT | OpenAI services (visit the Other Tools tab to read more about using ChatGPT to build a course chatbot)
This content is available for PDF download for use in a range of instructor support contexts.
Current Approved Tools
Microsoft Copilot
Currently, Microsoft Copilot is the recommended generative AI tool to use at U of T. When a user signs in using University credentials, Microsoft Copilot conforms to U of T’s privacy and security standards (i.e., does not share any data with Microsoft or any other company) for use with up to level 3 data.
It is also free to use. Microsoft Copilot uses OpenAI’s GPT-4 model and performs comparably to ChatGPT. For more information, refer to our Copilot Tool Guide.
Microsoft Copilot is a general-purpose chatbot. With thoughtful prompting, it can be used for many teaching and learning purposes (e.g., see Ethan and Lilach Mollick’s More Useful Things site for a comprehensive prompt library). However, many other educational generative AI chatbots and tools are being developed that incorporate pre-defined prompts out of the box.
ChatGPT Edu
ChatGPT Edu is an advanced version of OpenAI’s ChatGPT, offering enhanced security and privacy protections. It is available to the University of Toronto through an enterprise agreement with OpenAI, and the contract ensures that U of T data is stored safely and not used for training OpenAI models. Users have access to the latest ChatGPT model, including features like Advanced Data Analysis, the GPT Builder tool, and DALL·E image creation/editing.
Please note that ChatGPT Edu is a separate paid service, and that free ChatGPT is NOT in protected mode. Ensure you have confirmed paid access through the Licensed Software Office (LSO) before using the U of T version of ChatGPT Edu.
For more information, refer to our ChatGPT Edu Tool Guide.
Contact North
Contact North is a not-for-profit, Government of Ontario-supported organization that has developed two generative AI teaching and learning tools (also based on OpenAI’s GPT-4 model), both of which conform to U of T’s privacy and security standards for use with level one and level two data:
- AI Teacher’s Assistant Pro guides instructors through AI-assisted syllabus and teaching resource creation
- AI Tutor Pro is a resource for students to build and test their knowledge
M365 Copilot
Microsoft Copilot is a brand of generative artificial intelligence developed by Microsoft, currently based on OpenAI technology. Within Microsoft 365 (M365) applications, M365 Copilot is a virtual agent that can generate contextual content when prompted, based in part on the documents in OneDrive and SharePoint to which the user has access.
Please note that the M365 Copilot version of Copilot is different from the Protected Edition of Copilot Search and requires an additional fee-for-service licence.
Teams Premium
Teams Premium offers an AI meeting assistant for staff and instructors. Those interested in using this resource may submit a request via the ESC.
For more information, please read this overview from Microsoft: How does Teams Premium compare to Teams?
Azure AI Foundry
Azure AI Foundry is Microsoft’s integrated platform designed to streamline the development, deployment, and management of AI applications. Azure AI Foundry can be deployed as part of an AI landing zone, which includes all the tools and services needed to develop, train, and deploy AI models.
For more information, please read this overview from Microsoft: What is Azure AI Foundry?
Web of Science (WoS) Research Assistant Tool
The University of Toronto Libraries has licensed the Web of Science (WoS) Research Assistant Tool as an add-on, and it has been enabled in the U of T subscription account for University community access.
To use the tool, navigate to the Web of Science database and select the Research Assistant tab from the top menu bar. The Research Assistant (RA), a generative-AI-powered tool based on the corpus of the Web of Science database, can retrieve articles, organize content, and formulate responses to questions based on the corpus within WoS, including references to the source material. The tool also offers visualizations and guided tasks.
Scopus AI
The University of Toronto Libraries has subscribed to Scopus AI, enabling access for the University community as part of its Scopus database subscription. Scopus AI is an intuitive and intelligent search tool powered by generative AI that provides enhanced search results from the Elsevier catalogue for materials from 2013 onward.
It can cite referenced texts, provide article summaries, suggest “go deeper” questions, generate concept maps, and more. Scopus AI is designed to help researchers efficiently find relevant information, explore topics, and facilitate effective collaboration across disciplines.
Other tools that conform to U of T’s privacy and security standards will be referenced here as they become available.
Artificial Intelligence Virtual Tutor Pilot Initiative
The Artificial Intelligence Virtual Tutor Pilot Initiative includes faculty development activities that engage instructors in the development and administration of a Virtual Tutor in the form of a chatbot. The Virtual Tutor chatbots created as part of this program are tailored to course materials and leverage generative artificial intelligence (GenAI) to respond to student questions, supporting teaching and learning in courses offered in 2024-2025.
Participating instructors are engaging with support staff and a cohort of peers during the design, deployment and evaluation phases of their project to extend our shared understanding of this new domain.
Features of the Virtual Tutor Pilot Initiative include: 
- Participation in a virtual orientation session that includes content preparation training.
- Interim checkpoint surveys and a final project report describing the challenges and opportunities observed during project implementation.
- Participation in focus group and evaluation activities to provide insight on experiences that will inform institutional planning.
- Availability of infrastructure and staff support resources for each project team’s Virtual Tutor.
The cohort leads for the Virtual Tutor Pilot Initiative include:
- Michelle Arnot, Professor, Teaching, Department of Pharmacology and Toxicology, Temerty Faculty of Medicine
- Charlene Chu, Assistant Professor, Lawrence Bloomberg Faculty of Nursing
- Joseanne Cudjoe, Assistant Professor, Teaching Stream, Department of Arts, Culture and Media, University of Toronto Scarborough
- Emily Ho, Assistant Professor, Teaching Stream, Department of Occupational Science and Occupational Therapy, Temerty Faculty of Medicine
- Nohjin Kee, Associate Professor, Teaching Stream, Department of Physiology, Temerty Faculty of Medicine
This initiative is sponsored by the Centre for Teaching Support & Innovation (CTSI) and Information Technology Services (ITS).
If you have any questions, contact: ctsi.teaching@utoronto.ca
Engaging with Non-Supported Generative AI Tools
Introduction
The University of Toronto recognizes the growing interest among instructors in academic uses of generative AI tools other than those currently supported or vetted by the University. Instructors may be eager to experiment with tools from OpenAI, Anthropic, Google, and others that are customized for specific teaching applications, such as chatbots grounded in course content. The aim of this guide is to outline considerations when engaging with tools that fall outside of the University’s “walled garden” of institutional support, including maximizing data privacy and security, as well as respecting intellectual property and copyright. The precautions outlined in the Tools Beyond Quercus section of the CTSI website remain a relevant framework for considering the risks and implications of engaging with non-supported tools, and this guide highlights some key points to consider for generative AI tools specifically.
Considerations
Rationale and Pedagogical Value
- Clarify how the non-supported AI tool supports learning outcomes or instructional strategies that cannot be achieved using supported tools alone (e.g., Microsoft Copilot).
- For custom course chatbots, consider how your “system prompt” will allow you to customize the instructional strategy and student experience. For example, if you want a custom chatbot that will act like an expert tutor to help students build their conceptual knowledge, consider adapting Lilach and Ethan Mollick’s “Updated Tutoring Prompt” (a minimal code sketch follows this list).
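For instructors who are comfortable with a small amount of code, the same idea can be prototyped outside of a chatbot builder. Below is a minimal sketch in Python using the OpenAI Chat Completions API; the course (ECO101), model name, and tutoring instructions are illustrative placeholders rather than a recommended configuration, and the privacy and copyright cautions elsewhere in this guide still apply.

    # Minimal sketch: a course-specific "expert tutor" persona defined by a system prompt.
    # Assumes the OpenAI Python SDK (pip install openai) and an OPENAI_API_KEY environment variable.
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    # Hypothetical system prompt; adapt it to your own instructional strategy
    # (e.g., a Socratic tutor that asks guiding questions rather than giving answers).
    SYSTEM_PROMPT = (
        "You are a friendly tutor for ECO101 (introductory microeconomics). "
        "Help students build conceptual understanding: ask one guiding question at a time, "
        "give hints before answers, and keep the discussion to course topics only."
    )

    def ask_tutor(student_question: str) -> str:
        """Send a single student question to the tutor persona and return its reply."""
        response = client.chat.completions.create(
            model="gpt-4o",  # placeholder model name; use whichever model is available to you
            messages=[
                {"role": "system", "content": SYSTEM_PROMPT},
                {"role": "user", "content": student_question},
            ],
        )
        return response.choices[0].message.content

    print(ask_tutor("Why does a price ceiling cause a shortage?"))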
Cost and Sustainability
- The University does not directly cover the costs associated with the use of non-supported tools. However, faculty members may consider using funds such as the Professional Expense Reimbursement Allowance (PERA).
- Since tool functionality and availability are changing rapidly, consider the impact on your course design should the cost no longer be sustainable, or should the tool no longer be available.
- Students cannot be required to use or pay for non-supported tools.
- Institutional educational technology staff do not provide end-user support for non-supported tools. Instructors are responsible for supporting such tools that they choose to include in their course, including training and support for TAs and students.
Equity, Transparency, and Accessibility
- Students cannot be compelled to create accounts on non-University systems or services.
- Non-supported tools that are to be used by students as part of course activities must be explicitly listed on the course syllabus:
- Clarify whether students are required to register or create an account, and let students know not to use the same password as the one they use to access UTORid-enabled services.
- Provide students with the context for what they are expected to do using the non-supported tool (e.g., for use as an optional, supplementary course tutor).
- Provide links to vendor/product statements regarding privacy and use of information, including the end-user license agreement (EULA), if applicable.
- If a non-supported tool is used as part of a required course activity or assessment, a viable alternative must be made available for students who do not consent to participate.
- Investigate whether the tool presents any accessibility barriers to students with disabilities. Consider consulting with U of T’s Accessibility for Ontarians with Disabilities Act (AODA) Office to assess whether the tool will meet your pedagogical goals for all students.
Privacy and Security
- Do not share personal information or other data that resides within levels 2 to 4 of the Data Classification Standard with generative AI systems unless that sharing has been specifically authorized.
- Investigate if and how user data will be used by the tool’s supplier (e.g., for training AI models, for marketing, or for sharing with any third-party vendors).
- If applicable, provide instructions for how students can engage with the tool without sharing their interaction history. For example, if you are using OpenAI’s GPT Builder, show students how to engage with ChatGPT using its temporary chat mode, and how to opt out of sharing conversation data completely.
- Ensure that you have reviewed and are aware of the implications of the tool’s EULA. Consider the following questions:
- Is personal information shared and with whom?
- What is the privacy policy?
- Where is data stored?
- Remind students to avoid including any personal details in prompts.
- Refer to the U of T Information Security department’s “Use artificial intelligence intelligently” resource for more information.
Content Preparation
- If you are uploading course content or other documents to a generative AI tool (e.g., for a custom course chatbot), the content should be extensive, accurate, and well organized. High-quality reference material helps the tool generate responses that are as relevant and accurate as possible (a minimal clean-up sketch follows this list).
- For detailed guidance on content preparation, refer to CTSI’s “Preparing Content for Custom AI Chatbots” resource.
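As a simple illustration of this kind of preparation, the sketch below (Python, standard library only) tidies a hypothetical lecture-transcript text file by stripping timestamps and filler words, then splits it into clearly titled sections ready for upload. The file name and the “## ” heading convention are assumptions for illustration only.

    # Minimal sketch: tidy a lecture transcript before uploading it to a custom chatbot.
    # Assumes a plain-text file whose section titles appear on lines starting with "## ".
    import re
    from pathlib import Path

    raw = Path("week03_lecture_transcript.txt").read_text(encoding="utf-8")  # hypothetical file

    # Remove timestamps such as [00:14:32] and common spoken filler.
    cleaned = re.sub(r"\[\d{2}:\d{2}:\d{2}\]", "", raw)
    cleaned = re.sub(r"\b(um|uh|you know)\b", "", cleaned, flags=re.IGNORECASE)
    cleaned = re.sub(r"[ \t]+", " ", cleaned)

    # Split on "## " headings so each topic becomes a self-contained, clearly titled chunk.
    sections = re.split(r"^## ", cleaned, flags=re.MULTILINE)
    for i, section in enumerate(s for s in sections if s.strip()):
        title, _, body = section.partition("\n")
        out = Path(f"week03_section_{i:02d}_{title.strip().replace(' ', '_')}.txt")
        out.write_text(f"{title.strip()}\n\n{body.strip()}\n", encoding="utf-8")
        print(f"Wrote {out}")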
Copyright and Intellectual Property
- Do not include any copyrighted material (including full-text library licensed e-resources) without authorization.
- For custom AI tools that have a share link for student use, assume that they can potentially be accessed by any user, with query results exposing your intellectual property outside of the University. Consider whether this aligns with your personal views on sharing course- or domain-related materials.
- See the U of T Library’s Generative AI tools and Copyright Considerations resource for up-to-date guidelines on copyright and intellectual property.
Testing for Accuracy, Consistency and Bias
- Although uploading curated content into a custom AI tool will generally reduce the rate of factually inaccurate outputs (“hallucinations”), AI-generated text and code are not guaranteed to be correct.
- Test whether the tool can grasp the context of questions, understanding linguistic nuances and domain-specific elements.
- Test the tool with real-world questions and scenarios that students might encounter. Assess the responses and determine if they are within acceptable standards for your use case. This helps to identify the tool’s ability to provide accurate and contextually appropriate responses. Consider testing using the prompts included in CTSI’s “AI Virtual Tutor – Effective Prompting Strategies” resource.
- Consider testing how the tool responds to the same topic or question when phrased slightly differently. This will help assess the consistency of the responses (a minimal test-harness sketch follows this list).
- Through your testing, assess potential biases in the AI tool relevant to your field and plan to address them – e.g., by uploading additional content or adjusting your system prompt.
- Try some prompts that purposely attempt to “trick” the tool into providing inappropriate responses, “hallucinations”, or going off topic to see how it responds.
- Some tools may have a setting to enable content moderation/filtering for internet-enabled searches (e.g., block profanity, hate language, etc.).
- Students should be reminded that AI tools should not be treated as an “authoritative source” – for example, students should continue to refer to the course syllabus directly for key dates and course policies.
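To make the testing suggestions above repeatable, a small script can run the same checks before each term. The sketch below (Python, OpenAI Chat Completions API) sends paraphrased versions of a question to a tutor persona and flags replies that miss expected key terms; the questions, key terms, model name, and system prompt are illustrative placeholders, and a keyword check is only a rough proxy for human review of accuracy and bias.

    # Minimal sketch: send paraphrased test questions to a custom tutor and do a rough keyword check.
    # Assumes the OpenAI Python SDK and an OPENAI_API_KEY environment variable.
    from openai import OpenAI

    client = OpenAI()
    SYSTEM_PROMPT = "You are a tutor for ECO101. Keep answers grounded in introductory microeconomics."

    # Each test case: paraphrases of the same question plus terms a sound answer should mention.
    TEST_CASES = [
        {
            "paraphrases": [
                "What happens to quantity demanded when the price of a good rises?",
                "If a good gets more expensive, how does the amount people want to buy change?",
            ],
            "expected_terms": ["law of demand", "decrease"],
        },
    ]

    for case in TEST_CASES:
        for question in case["paraphrases"]:
            reply = client.chat.completions.create(
                model="gpt-4o",  # placeholder model name
                messages=[
                    {"role": "system", "content": SYSTEM_PROMPT},
                    {"role": "user", "content": question},
                ],
            ).choices[0].message.content.lower()
            missing = [t for t in case["expected_terms"] if t not in reply]
            status = "OK" if not missing else f"check manually (missing: {', '.join(missing)})"
            print(f"{question[:60]!r}: {status}")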
Example – Creating a Virtual Tutor using GPT Builder
The following is a step-by-step guide on creating and configuring a custom AI chatbot that acts as a virtual tutor using OpenAI’s GPT Builder:
Step 1: Access GPT Builder
- Create or log in to your ChatGPT Plus account at https://chat.openai.com.
- In the left sidebar, click on “Explore GPTs”.
- Under “My GPTs”, click the “Create a GPT” button.
Step 2: Provide Initial Instructions
- In the “Create” tab of GPT Builder, you will be asked “What would you like to make?” This initiates a dialogue to determine the specifics of your GPT’s purpose and behavior based on your initial prompt.
- For example, your initial prompt might be: “Create an expert tutor to help students in ECO101 understand basic microeconomic concepts”.
- GPT Builder will then respond with specific questions about the name, logo, and purpose of the GPT. Iterating several times with the GPT Builder through natural conversation is recommended to provide the model with a clear understanding of what it should be doing.
Step 3: Refine Your Instructions
- Switch to the “Configure” tab for more advanced customization options.
- Under “Instructions”, review the virtual tutor instructions generated by your dialogue with GPT Builder to ensure that they are consistent with your instructional objectives.
- Consider adapting Lilach and Ethan Mollick’s “Updated Tutoring Prompt”.
- You may also want to add to the instructions; e.g., “Do not answer any questions or engage in dialogue about any topic other than…”
Step 4: Upload Knowledge Files
- Under “Knowledge”, you can upload up to 20 files containing information you want your GPT to reference.
- Do not include any copyrighted material (including full-text library licensed e-resources) without authorization.
- Refer to the U of T Library’s Generative AI tools and Copyright Considerations for more information about copyright.
- Do not include material that contains any sensitive personal information about individuals.
- In the “Instructions” section of GPT Builder, include: “When referencing materials from the ‘Knowledge’ section, provide a full citation to ensure proper attribution.”
Step 5: Enable Relevant Capabilities with Caution
- In the “Capabilities” section, you have the option to enable “Web Browsing”, “DALL·E Image Generation”, and “Code Interpreter & Data Analysis”.
- Disabling “Web Browsing” will prevent the virtual tutor from accessing up-to-date information on the internet that is not included in its training data or your uploaded content. This may or may not be desirable, depending on your objective.
- Be aware that the code interpreter capability will allow users to download your uploaded content in its native format.
Step 6: Configure Data Sharing Settings
- In the “Configure” tab of GPT Builder, scroll down to “Additional Settings”.
- You may uncheck the box for “Use conversation data in your GPT to improve our models”. This will prevent the content of user conversations with your virtual tutor from being used by OpenAI for model training and improvement.
- It is important to note that the privacy settings of your virtual tutor users will take priority over the setting above: if a user chooses to share their chat history to improve model training and development, this will expose their own chat history with your virtual tutor regardless of the setting you choose.
Step 7: Test and Refine
- Use the “Preview” panel to test out your virtual tutor and see how it responds.
- Iteratively update your instructions and settings based on its performance.
- Monitor for potential biases or inaccuracies and refine your instructions or uploaded files if necessary. For example, you might notice through testing your virtual tutor that a lecture transcript you uploaded contains errors.
Step 8: Publish Your Virtual Tutor
- Once finalized, select the “Save” button and choose your sharing settings.
- Selecting the “Anyone with the link” option will allow students to access your chatbot but won’t enable public browsing.
- Selecting “GPT Store” will publish your virtual tutor to OpenAI’s public “Explore GPTs” page.
- Select “Confirm” to complete the publishing process.
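For comparison, the sketch below shows a heavily simplified, code-based alternative to the GPT Builder workflow above: course notes are read from local text files and pasted into the system prompt with each question. This is not how GPT Builder’s “Knowledge” retrieval works internally, the file locations, model name, and prompts are illustrative assumptions, and the approach only suits small amounts of text.

    # Minimal sketch: a "virtual tutor" grounded in local course notes via the Chat Completions API.
    # Unlike GPT Builder's Knowledge feature, this simply pastes the notes into the prompt.
    # Assumes the OpenAI Python SDK and an OPENAI_API_KEY environment variable.
    from pathlib import Path
    from openai import OpenAI

    client = OpenAI()

    # Hypothetical course notes prepared as plain-text files (see "Content Preparation" above).
    notes = "\n\n".join(
        p.read_text(encoding="utf-8") for p in sorted(Path("course_notes").glob("*.txt"))
    )

    SYSTEM_PROMPT = (
        "You are a virtual tutor for ECO101. Answer only using the course notes provided below, "
        "mention which section you used, and say so if the notes do not cover the question.\n\n"
        "COURSE NOTES:\n" + notes
    )

    def answer(question: str) -> str:
        """Answer one student question, grounded in the pasted course notes."""
        response = client.chat.completions.create(
            model="gpt-4o",  # placeholder model name
            messages=[
                {"role": "system", "content": SYSTEM_PROMPT},
                {"role": "user", "content": question},
            ],
        )
        return response.choices[0].message.content

    print(answer("How does the course define consumer surplus?"))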
How Generative AI Works
- Getting Started with Generative AI
- Opportunities and Challenges of Generative AI in Classroom Learning
Getting Started with Generative AI
Generative AI refers to artificial intelligence that can generate new content, including text, images, and other media, based on predictive modeling. It uses machine learning algorithms, specifically neural networks, to process and learn from large datasets. Large language models (LLMs) are one class of generative AI, and they have the ability to generate human-like text.
Are you interested in learning how generative AI works? A good introduction is provided by the Schwartz Reisman Institute for Technology and Society’s “What are LLMs and generative AI? A beginner’s guide to the technology turning heads”. You could also consider:
The responses you receive from a generative AI tool depend on the prompts you enter, and the further refining of these prompts, which takes practice. As Ethan Mollick said, “The lesson is that just using AI will teach you how to use AI.” (Working with AI: Two Paths to Prompting) To get started, we recommend you consider the following.
Be clear about what you want. Include detailed information in your prompt, including the desired format (e.g., “Write a paragraph about…” or “Create an image containing…”). Suggest a particular style (e.g., an academic essay or lab report) and include specific information you want incorporated (e.g., provide an outline or ordered steps in the prompt).
- If you’re not sure how to describe the style you want to emulate, the Wharton School at the University of Pennsylvania suggests pasting in a text example you like and asking the tool to describe its style. Use that description in your own prompt (a short sketch of this two-step approach follows).
- To learn more about prompt writing, see our Tool Guide under “How can I prompt with Copilot?”
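The two-step style approach described above can also be scripted. The sketch below (Python, OpenAI Chat Completions API) first asks the model to describe the style of a pasted sample, then reuses that description in a follow-up prompt; the model name, sample text, and prompts are illustrative placeholders.

    # Minimal sketch: extract a style description from a sample text, then reuse it in a new prompt.
    # Assumes the OpenAI Python SDK and an OPENAI_API_KEY environment variable.
    from openai import OpenAI

    client = OpenAI()

    def complete(prompt: str) -> str:
        """One-off completion helper."""
        response = client.chat.completions.create(
            model="gpt-4o",  # placeholder model name
            messages=[{"role": "user", "content": prompt}],
        )
        return response.choices[0].message.content

    sample_text = "..."  # paste a passage whose style you want to emulate

    # Step 1: ask the model to describe the style of the sample.
    style = complete(
        f"Describe the writing style of the following passage in 3-4 bullet points:\n\n{sample_text}"
    )

    # Step 2: reuse that description when asking for new text in the same style.
    draft = complete(
        f"Write a one-paragraph summary of today's lecture in this style:\n\n{style}"
    )
    print(draft)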
Be critical.
- Does the tool output meet your needs? What additional information is required? Generative AI is an interactive tool. Try different options and prompts to gauge the results, clear the prompt screen and try again. You will learn to refine your prompts and better discern what is most effective with practice.
- Generative AI tools can provide quick results that may appear correct, but looks can be deceiving. Tools such as ChatGPT can produce hallucinations or misleading and factually incorrect text. As with any text or visual analysis, we need to examine the results with a critical eye.
Before deciding whether and how to integrate Generative AI into your course, it is essential to have a solid understanding of its capabilities in supporting learning, as well as the challenges it presents. This section provides a basic overview of generative AI as it relates to teaching and learning.
Opportunities and Challenges of Generative AI in Classroom Learning
Tools that leverage generative artificial intelligence (GenAI) and large language models (LLMs) to generate new code or text (e.g., Copilot, ChatGPT, Claude, Gemini) are becoming increasingly available and are likely to have long-term impacts on what and how we teach.
Higher education has faced similar disruptions with previous technology innovations, including calculators, Google search, and Wikipedia. While these innovations can be disruptive to our practices of teaching and assessment, incorporating them into our teaching practice is also an opportunity to prepare our learners to live and thrive in a changing world. When intentionally leveraged for classroom instruction, generative AI technologies may also provide new possibilities for enhancing accessibility and engagement for students with varied learning needs.
When integrating GenAI into your courses, there are several opportunities to foster an inclusive learning environment:
- Encourage AI literacy and future readiness. As generative AI tools develop in capabilities, they will continue to shape what distinctive human skills are prioritized across disciplines and fields of work. By incorporating opportunities within courses to explore, use, and assess generative AI tools, students are better prepared to strategically engage with and critically reflect on these emerging technologies.
- Encourage metacognition, creativity, and critical thinking. By designing carefully constructed assessments and forms of active engagement, students have the opportunity to explore different viewpoints, self-reflect, and engage in analysis and knowledge synthesis.
- Address barriers to equity and accessibility. Generative AI tools can be leveraged to provide multiple options for motivating and engaging learners, for representing information, and for inviting learners to express and communicate. This can create learning experiences that are personalized to students’ diverse needs and abilities.
When integrating GenAI into your courses, there are several considerations to ensure equitable and responsible use, including:
- Availability. While many generative AI tools are currently freely available, their availability could change at any time. For any third-party software that is not approved by the University or your division, there are several issues related to privacy, security, and student intellectual property that should be considered before asking your students to use non-approved generative AI tools.
- Accuracy and bias. Text created by generative AI technology may be biased and may not be correct.
- Academic integrity. The University is discouraging the use of tools that claim to be able to detect AI generated text. See Generative Artificial Intelligence in the Classroom: FAQ’s for more information.
- Privacy and security. A version of Microsoft Copilot is currently available to the public (and to U of T students); however, the public version does not have full privacy and data protections in place. U of T has access to the enterprise edition of Microsoft Copilot, which, unlike the public version, conforms to the University’s privacy and data protections. Note that other publicly available generative AI chatbots, like ChatGPT, may not offer such privacy and data protections.
- Copyright and intellectual property. It is important to be mindful of what content is entered into generative AI platforms that do not have institutional data protections in place. Never input confidential information or intellectual property for which you do not have the rights or permissions. All content entered may become part of the tool’s dataset and may inadvertently resurface in response to other prompts. See the U of T Library’s Generative AI tools and Copyright Considerations for more information.
You can learn more about strategies to plan for and integrate generative AI into courses by visiting the Teach with Generative AI section of this website.
Artificial Intelligence Virtual Tutor Initiative
Overview
The Artificial Intelligence Virtual Tutor Initiative supports instructors in designing and implementing course-specific virtual tutors powered by generative AI. These chatbots are tailored to course materials and provide students with responsive, on-demand help to support teaching and learning in participating courses.
Throughout the initiative, instructors collaborate with support staff and a cohort of peers during the design, deployment, and evaluation phases. This collaborative approach helps build a shared understanding of effective virtual tutoring practices in higher education.

Sponsored by the Centre for Teaching Support & Innovation (CTSI) and Information Technology Services (ITS), the initiative offers:
- A virtual orientation session with training on content preparation
- Ongoing support through interim checkpoint surveys and a final project report to capture challenges and opportunities
- Participation in evaluation activities to inform future institutional planning
- Dedicated infrastructure and staff support for each project team’s virtual tutor
If you have any questions, contact: ctsi.teaching@utoronto.ca
Cogniti Pilot (2025-present)
As part of the current phase of the Virtual Tutor Initiative, the University of Toronto is transitioning to Cogniti: a generative AI chatbot platform, similar to ChatGPT or Microsoft Copilot, but specifically tailored to course content and integrated within Quercus. Cogniti is designed to support learning by providing helpful responses to questions related to course content.
The University has thoroughly vetted the platform for data privacy and security. Please note that the accuracy and relevance of the answers will vary depending on the prompt used and the performance of the AI Virtual Tutor, which may evolve over the term as the technology advances and we refine the settings of the model.
If you are interested in participating in the Virtual Tutor Initiative, please complete the expression of interest form.
Recording - Previewing Cogniti: Creating AI Virtual Tutors for Your Courses
A recording is now available of a hands-on session introducing Cogniti, the generative AI platform piloted by the University of Toronto as part of the Virtual Tutor Initiative. The session shows how instructors can create AI virtual tutors tailored to their courses, gain insights into student engagement, and integrate Cogniti seamlessly with Quercus. Designed for accessibility and ease of use across disciplines, this recording is ideal for anyone curious about AI in education or interested in piloting AI tutors.
Instructor Resources
To support instructors in integrating virtual tutors in their courses, CTSI has developed resources available for download:
- AI Virtual Tutors – Effective Prompting Strategies
Offers evidence-based prompting techniques and sample questions to help students get the most out of AI virtual tutors, supporting personalized learning, critical thinking, and deeper understanding of course material.
Web page | PDF | Word
- Preparing Content for Custom AI Chatbots
Guidance on how to organize, clean, and format course materials for use with custom AI chatbots, ensuring accurate, relevant, and efficient AI responses.
Web page | PDF | Word
Initial Pilot with Microsoft Copilot Studio (2024-2025)
The Virtual Tutor initiative, launched in Fall 2024 using Microsoft Copilot Studio, began with four courses and 573 students. By Winter 2025, the pilot expanded to nine courses, reaching a total of 832 students.
Key Insights:
- Instructors explored how AI virtual tutors could support innovative teaching methods and create more personalized learning experiences. The tutors also helped instructors identify specific areas where students needed additional support, such as understanding complex theories and developing AI literacy skills.
- Early observations indicate students asked increasingly higher-order questions involving analysis and evaluation, potentially suggesting deeper engagement and conceptual understanding over time.
- Diverse learners leveraged the virtual tutor’s personalized scaffolding to structure their learning tasks at their own pace, potentially developing their capacity to meet and positively adapt to adversity.
The virtual tutor “has the potential to foster equity in the learning environment if used by novice learners, allowing them to learn at their own pace.”
– Emily Ho, Assistant Professor, Occupational Science & Occupational Therapy
Cohort Leads:
- Michelle Arnot, Department of Pharmacology and Toxicology, Temerty Faculty of Medicine
- Charlene Chu, Lawrence Bloomberg Faculty of Nursing
- Joseanne Cudjoe, Department of Arts, Culture and Media, University of Toronto Scarborough
- Emily Ho, Department of Occupational Science and Occupational Therapy, Temerty Faculty of Medicine
- Nohjin Kee, Department of Physiology, Temerty Faculty of Medicine
These early findings highlight the potential of virtual tutors to advance teaching innovation, student engagement, and equitable learning support.
Engaging with Non-Supported Generative AI Tools
Introduction
The University of Toronto recognizes the growing interest among instructors to engage with academic uses of generative AI tools other than those currently supported or vetted by the University. Instructors may be eager to experiment with tools from OpenAI, Anthropic, Google, and others that are customized for specific teaching applications, such as chatbots grounded in course content. The aim of this guide is to outline considerations when engaging with tools that fall outside of the University’s “walled garden” of institutional support, including maximizing data privacy and security, as well as respecting intellectual property and copyright. The precautions outlined in the Tools Beyond Quercus section of the CTSI website remains a relevant framework to consider risks and implications of engaging with non-supported tools, and this guide highlights some key points to consider for generative AI tools specifically.
Considerations
Rationale and Pedagogical Value
- Clarify how the non-supported AI tool supports learning outcomes or instructional strategies that cannot be achieved using supported tools alone (i.e., Microsoft Copilot).
- For custom course chatbots, consider how your “system prompt” will allow you to customize the instructional strategy and student experience. For example, if you want a custom chatbot that will act like an expert tutor to help students build their conceptual knowledge, consider adapting Lilach and Ethan Mollick’s “Updated Tutoring Prompt”.
Cost and Sustainability
- The University does not directly cover the costs associated with the use of non-supported tools. However, faculty members may consider using funds such as the Professional Expense Reimbursement Allowance (PERA).
- Since tool functionality and availability is changing rapidly, consider the impact to your course design should the cost no longer be sustainable, or should the tool no longer be available.
- Students cannot be required to use or pay for non-supported tools.
- Institutional educational technology staff do not provide end-user support for non-supported tools. Instructors are responsible for supporting such tools that they choose to include in their course, including training and support for TAs and students.
Equity, Transparency, and Accessibility
- Students cannot be compelled to create accounts on non-University systems or services.
- Non-supported tools that are to be used by students as part of course activities must be explicitly listed on the course syllabus:
- Clarify whether students are required to register or create an account, and let students know not use the same password as the one they use to access UTORid-enabled services.
- Provide students with the context for what they are expected to do using the non-supported tool (e.g., for use as an optional, supplementary course tutor).
- Provide links to vendor/product statements regarding privacy and use of information, including the end-user license agreement (EULA), if applicable.
- If a non-supported tool is used as part of a required course activity or assessment, a viable alternative must be made available for students who do not consent to participate.
- Investigate wither the tool presents any accessibility barriers to students with disabilities. Consider consulting with U of T’s Accessibility for Ontarians with Disabilities Act (AODA) Office to assess whether the tool will meet your pedagogical goals for all students.
Privacy and Security
- Do not share personal information or other data that resides within levels 2 to 4 of the Data Classification Standard with generative AI systems unless that sharing has been specifically authorized.
- Investigate if and how user data will be used by the tool’s supplier (e.g. for training AI models, for marketing, or shared with any third-party vendors)
- If applicable, provide instructions for how students can engage with the tool without sharing their interaction history. For example, if you are using OpenAI’s GPT Builder, show students how to engage with ChatGPT using its temporary chat mode, and how to opt out of sharing conversation data completely.
- Ensure that you have reviewed and are aware of the implications of the tool’s EULA. Consider the following questions:
- Is personal information shared and with whom?
- What is the privacy policy?
- Where is data stored?
- Remind students to avoid including any personal details in prompts.
- Refer to the U of T Information Security department’s “Use artificial intelligence intelligently” resource for more information.
Content Preparation
- If you are uploading course content or other documents to a generative AI tool (e.g., for a custom course chatbot), the content should be extensive, accurate, and well-organized. High-quality training material ensures that the tool can generate responses that are as relevant and accurate as possible.
- For detailed guidance on content preparation, refer to CTSI’s “Preparing Content for Custom AI Chatbots” resource.
Copyright and Intellectual Property
- Do not include any copyrighted material (including full-text library licensed e-resources) without authorization.
- For custom AI tools that have a share link for student use, assume that they can potentially be accessed by any user, with query results exposing your intellectual property outside of the University. Consider whether this aligns with your personal view on sharing course or domain related materials.
- See the U of T Library’s Generative AI tools and Copyright Considerations resource for up-to-date guidelines on copyright and intellectual property.
Testing for Accuracy, Consistency and Bias
- Although uploading curated content into a custom AI tool will generally reduce the rate of factually inaccurate outputs (“hallucinations”), AI-generated text and code are not guaranteed to be correct.
- Test whether the tool can grasp the context of questions, understanding linguistic nuances and domain-specific elements.
- Test the tool with real-world questions and scenarios that students might encounter. Assess the responses and determine if they are within acceptable standards for your use case. This helps to identify the tool’s ability to provide accurate and contextually appropriate responses. Consider testing using the prompts included in CTSI’s “AI Virtual Tutor – Effective Prompting Strategies” resource.
- Consider testing how the tool responds to the same topic/question when phrased slightly differently. This will help assess the consistency of the responses.
- Through your testing, assess potential biases in the AI tool relevant to your field and plan to address them – e.g., by uploading additional content or adjusting your system prompt.
- Try some prompts that purposely attempt to “trick” the tool into providing inappropriate responses, “hallucinations”, or going off topic to see how it responds.
- Some tools may have a setting to enable content moderation/filtering for internet-enabled searches (e.g., block profanity, hate language, etc.).
- Students should be reminded that AI tools should not be treated as an “authoritative source” – for example, students should continue to refer to the course syllabus directly for key dates and course policies.
Example – Creating a Virtual Tutor using GPT Builder
The following is a step-by-step guide on creating and configuring a custom AI chatbot that acts as a virtual tutor using OpenAI’s GPT Builder:
Step 1: Access GPT Builder
- Create or log in to your ChatGPT Plus account at https://chat.openai.com.
- In the left sidebar, click on “Explore GPTs”.
- Under “My GPTs”, click the “Create a GPT” button.
Step 2: Provide Initial Instructions
- In the “Create” tab of GPT Builder, it will ask you “What would you like to make?” – this will initiate a dialogue to determine specifics about your GPT’s purpose and behavior based on your initial prompt.
- For example, your initial prompt might be: “Create an expert tutor to help students in ECO101 understand basic microeconomic concepts”.
- GPT Builder will then respond with specific questions about the name, logo, and purpose of the GPT. Iterating several times with the GPT Builder through natural conversation is recommended to provide the model with a clear understanding of what it should be doing.
Step 3: Refine Your Instructions
- Switch to the “Configure” tab for more advanced customization options.
- Under “Instructions”, review the virtual tutor instructions that were generated by your dialog with the GPT Builder to ensure that it is consistent with your instructional objectives.
- Consider adapting Lilach and Ethan Mollick’s “Updated Tutoring Prompt”
- You may also want to add to the instructions; e.g., “Do not answer any questions or engage in dialogue about any topic other than…”
Step 4: Upload Knowledge Files
- Under “Knowledge”, you can upload up to 20 files containing information you want your GPT to reference.
- Do not include any copyrighted material (including full-text library licensed e-resources) without authorization.
- Refer to the U of T Library’s Generative AI tools and Copyright Considerations for more information about copyright.
- Do not include material that contains any sensitive personal information about individuals.
- In the “Instructions” section of GPT Builder, include “When referencing materials from the “Knowledge” section, provide a full citation to ensure proper attribution.”
Step 5: Enable Relevant Capabilities with Caution
- In the “Capabilities” section, you have the option to enable “Web Browsing”, “DALL·E Image Generation”, and “Code Interpreter & Data Analysis”.
- Disabling “Web Browsing” will prevent the virtual tutor from accessing up-to-date information on the internet that is not included in its training data or your uploaded content. This may or may not be desirable, depending on your objective.
- Be aware the code interpretation capability will allow users to download your uploaded content in its native format.
Step 6: Configure Data Sharing Settings
- In the “Configure” tab of GPT Builder, scroll down to “Additional Settings”
- You may uncheck the box for “Use conversation data in your GPT to improve our models”. This will prevent the content of user conversations with your virtual tutor from being used by OpenAI for model training and improvement.
- It is important to note that the privacy settings of your virtual tutor users will take priority over the setting above: if a user chooses to share their chat history to improve model training and development, this will expose their own chat history with your virtual tutor regardless of the setting you choose.
Step 7: Test and Refine
- Use the “Preview” panel to test out your virtual tutor and see how it responds.
- Iteratively update your instructions and settings based on its performance.
- Monitor for potential biases or inaccuracies and refine your instructions or uploaded files if necessary. For example, you might notice through testing your virtual tutor that a lecture transcript you uploaded contains errors.
Step 8: Publish Your Virtual Tutor
- Once finalized, select the “Save” button and choose your sharing settings.
- Selecting the “Anyone with the link” option will allow students to access your chatbot but won’t enable public browsing.
- Selecting “GPT Store” will publish your virtual tutor to OpenAI’s public “Explore GPTs” page.
- Select “Confirm” to complete the publishing process.
How Generative AI Works
- Getting Started with Generative AI
- Opportunities and Challenges of Generative AI in Classroom Learning
Getting Starting with Generative AI
Generative AI refers to artificial intelligence that can generate new content, including text, images, and other media, based on predictive modeling. It uses machine learning algorithms, specifically neural networks, to process and learn from large datasets. Large language models (LLMs) are one class of generative AI, and they have the ability to generate human-like text.
Are you interested in learning how generative AI works? A good introduction is provided by the Schwartz Reisman Institute for Technology and Society’s “What are LLMs and generative AI? A beginners’s guide to the technology turning heads”. You could also consider:
The responses you receive from a generative AI tool depend on the prompts you enter, and the further refining of these prompts, which takes practice. As Ethan Mollick said, “The lesson is that just using AI will teach you how to use AI.” (Working with AI: Two Paths to Prompting) To get started, we recommend you consider the following.
Be clear about what you want. Include detailed information in your prompt, including the desired format. “Write a paragraph about…” “Create an image containing…” Suggest a particular style (e.g., an academic essay or lab report) and include specific information you want to include (e.g., provide an outline or ordered steps for the prompt).
- If you’re not sure how to describe the style you want to emulate, the Wharton School at the University of Pennsylvania suggests pasting in a text example you like and asking the tool to describe the style Use that description in your own prompt for style.
- To learn more on prompt writing, see our Tool Guide under “How can I prompt with Copilot?.”
Be critical.
- Does the tool output meet your needs? What additional information is required? Generative AI is an interactive tool. Try different options and prompts to gauge the results, clear the prompt screen and try again. You will learn to refine your prompts and better discern what is most effective with practice.
- Generative AI tools can provide quick results that may appear correct, but looks can be deceiving. Tools such as ChatGPT can produce hallucinations or misleading and factually incorrect text. As with any text or visual analysis, we need to examine the results with a critical eye.
Before deciding whether and how to integrate Generative AI into your course, it is essential to have a solid understanding of its capabilities in supporting learning, as well as the challenges it presents. This section provides a basic overview of generative AI as it relates to teaching and learning.
Opportunities and Challenges of Generative AI in Classroom Learning
Tools that leverage generative artificial intelligence (GenAI) and large language models (LLMs) to generate new code or text (e.g., Copilot, ChatGPT, Claude, Gemini, etc.) are becoming increasingly available and are likely to have long-term impacts and on what and how we teach.
Higher education has faced similar disruptions with previous technology innovations, including calculators, Google search, and Wikipedia. While these innovations can be disruptive to our practices of teaching and assessment, incorporating them into our teaching practice is also an opportunity to prepare our learners to live and thrive in a changing world. When intentionally leveraged for classroom instruction, generative AI technologies may also provide new possibilities for enhancing accessibility and engagement for students with varied learning needs.
When integrating GenAI into your courses, there are several opportunities to foster an inclusive learning environment:
- Encourage AI literacy and future readiness. As generative AI tools develop in capabilities, they will continue to shape what distinctive human skills are prioritized across disciplines and fields of work. By incorporating opportunities within courses to explore, use, and assess generative AI tools, students are better prepared to strategically engage with and critically reflect on these emerging technologies.
- Encourage metacognition, creativity, and critical thinking. By designing carefully constructed assessments and forms of active engagement, students have the opportunity to explore different viewpoints, self-reflect, and engage in analysis and knowledge synthesis.
- Address barriers to equity and accessibility. Generative AI tools can be leveraged to provide multiple options for motivating and engaging learners, for representing information, and for inviting learners to express and communicate. This can create learning experiences that are personalized to students’ diverse needs and abilities.
When integrating GenAI into your courses, there are several considerations to ensure equitable and responsible use, including:
- Availability. While many generative AI tools are currently freely available, their availability could change at any time. For any third-party software that is not approved by the University or your Division, there are several considerations related to privacy, security and student intellectual property that should be considered before asking your students to use non-approved generative AI tools.
- Accuracy and bias. Text created by generative AI technology may be biased and may not be correct.
- Academic integrity. The University discourages the use of tools that claim to be able to detect AI-generated text. See Generative Artificial Intelligence in the Classroom: FAQs for more information.
- Privacy and security. A version of Microsoft Copilot is available to the public (including U of T students); however, the public version does not have full privacy and data protections in place. U of T has access to the enterprise edition of Microsoft Copilot, which, unlike the public version, conforms to the University’s privacy and data protections. Note that other publicly available generative AI chatbots, such as ChatGPT, may not offer comparable privacy and data protections.
- Copyright and intellectual property. It is important to be mindful of what content is entered into generative AI platforms that do not have institutional data protections in place. Never input confidential information or intellectual property for which you do not have the rights or permissions. All content entered may become part of the tool’s dataset and may inadvertently resurface in response to other prompts. See the U of T Library’s Generative AI tools and Copyright Considerations for more information.
You can learn more about strategies to plan for and integrate generative AI into courses by visiting the Teach with Generative AI section of this website.
AI Tools Within and Beyond U of T’s Protected Environment
When deciding whether and how to integrate generative AI into classroom environments, it is valuable to consider which tools are institutionally-supported at the University of Toronto.
The below resource differentiates between two categories of generative AI tools:
- Inside the Walled Garden: Generative AI tools that operate within U of T’s protected digital environment, ensuring institutional support and compliance with university policies.
- Outside the Walled Garden: Generative AI tools that fall beyond the University’s officially supported toolkit, which may require additional consideration before use.
Within the Walled Garden: Protected Environment Supported by U of T
Features:
- Guardrails that prioritize security, privacy, and vendor/provider accountability
- Once logged into your U of T account, the protected version of Microsoft Copilot does not collect information from users’ prompts for training purposes
- Ease of use and access due to institutional support and configuration for the teaching and learning context
- Equitable access for all students without user subscription fees
- Greater predictability of services provided
Risks and Concerns:
- Limited to the platform and configuration options provided by institutionally supported and/or vetted tools
- Reduced opportunity for customization and integration
Example: Microsoft Copilot
Outside the Walled Garden: Applications Beyond the University’s Supported Toolkit
Features:
- Most recent developments available from a range of platform providers
- Access to discipline- and task-specific tools and resources
- Additional customization possible for innovative teaching, with individual instructor investment in build and support
- Potential for embedding within other tools/platform contexts
- Enhanced or specialized services on a subscription basis
Risks and Concerns:
- Content and data privacy concerns when sharing with the platform/provider for model training or third-party use
- Students must be provided with an alternative to using a platform that is not one of the University of Toronto’s supported tools
- Fees for service limit equitable access
- Unsupported tools may disappear, or have their terms of service or functionality changed, during the term
- The instructor is responsible for support, including compliance with all U of T policies and practices on appropriate use of information technology and on information privacy and security risks
Example: ChatGPT | OpenAI services. See the GPT Builder for AI Virtual Tutors guide below to read more about using ChatGPT to build course chatbots.
This content is available for download in several formats for use in a range of instructor support contexts: PDF Format, Word Format.
These documents are licensed under a Creative Commons BY-NC-SA 4.0 International License.
Current Approved Tools
Currently, Microsoft Copilot is the recommended generative AI tool to use at U of T. When a user signs in using University credentials, Microsoft Copilot conforms to U of T’s privacy and security standards (i.e., does not share any data with Microsoft or any other company).
It is also free to use. Microsoft Copilot uses OpenAI’s GPT-4 model and performs comparably to ChatGPT. For more information about Copilot, refer to our Copilot Tool Guide.
Microsoft Copilot is a general-purpose chatbot. With thoughtful prompting, it can be used for many teaching and learning purposes (e.g., see Ethan and Lilach Mollick’s More Useful Things site for a comprehensive prompt library). However, many other educational generative AI chatbots and tools are being developed that incorporate pre-defined prompts out of the box.
Contact North is a not-for-profit, Government of Ontario-supported organization that has developed two generative AI teaching and learning tools (also based on OpenAI’s GPT-4 model), both of which conform to U of T’s privacy and security standards for use with level one and level two data:
- AI Teacher’s Assistant Pro guides instructors through AI-assisted syllabus and teaching resource creation
- AI Tutor Pro is a resource for students to build and test their knowledge
Other tools that conform to U of T’s privacy and security standards will be referenced here as they become available.
Artificial Intelligence Virtual Tutor Program
Successful applicants have now been contacted for a new Artificial Intelligence Virtual Tutor Initiative. This pilot program offers an opportunity to co-develop a Virtual Tutor for use in the fall 2024 term.
The Virtual Tutor will be tailored to instructors’ course materials and will leverage AI to provide responses to students based on these materials. Successful applicants will have the operational costs for their Virtual Tutor and staff time for co-development covered by this program for one full production semester.
For additional information on this initiative and a link to the application form, please visit the Artificial Intelligence Virtual Tutor Program information page.
For more information contact ctsi.teaching@utoronto.ca.
GPT Builder for AI Virtual Tutors
July 2024
The University of Toronto recognizes the growing interest among instructors in creating artificial intelligence virtual tutors. While the University has launched a pilot Artificial Intelligence Virtual Tutor Initiative, there is not yet a university-wide platform available for all instructors to use. We recognize that some instructors may be eager to experiment with tools like OpenAI’s GPT Builder and other options using their personal accounts in the absence of a supported tool. The aim of this guide is to describe how to build virtual tutors with GPT Builder, including considerations for maximizing data privacy and security for students, as well as respecting intellectual property and copyright. While this guide focuses on GPT Builder, the considerations outlined apply to any solution that falls outside of the University’s “walled garden” of educational technology, which is vetted through a procurement process that includes a full privacy and security review.
What is a Virtual Tutor?
Large language models (LLMs) have already become part of our society and are having profound impacts on teaching and learning. These models are pre-trained on an enormous amount of material and are able to carry out a broad array of tasks, generating text, code, and image outputs as they interact with users. However, many of these models lack specific domain knowledge, or are not easily tuned to the specific needs of particular classes or programs. While technical experts could fine-tune models to solve this problem, OpenAI (the creator of ChatGPT) has addressed it by enabling customization through custom GPTs, which lets individuals create custom chatbots tailored to their specific interests with no computer coding needed. A virtual tutor is a custom chatbot that is tailored to course materials and provides responses to students based on these materials.
Considerations Before You Begin
It is important to note that while free ChatGPT accounts can use a custom GPT, creating and maintaining one with GPT Builder requires a paid ChatGPT Plus account. The University does not directly cover the cost of a personal ChatGPT Plus account; however, you may consider using your Professional Expense Reimbursement Allowance (PERA) to cover the cost. It is also important to be aware that access, terms of service, and functionality of the tool may change without notice.
Privacy and Security
As ChatGPT has not been reviewed for privacy or security by the University, the precautions outlined in the Tools Beyond Quercus section of the CTSI website remain a relevant framework for considering risks and implications. Specific considerations for virtual tutors created using GPT Builder include:
- Provide instructions for how students can engage with ChatGPT without sharing their conversations. For example, show them how to engage with ChatGPT using its temporary chat mode, and how to opt out of sharing conversation data completely.
- Ensure that students are aware of ChatGPT’s Privacy Policy, and that they need to create an account to use your virtual tutor.
- Remind students to avoid inputting any personal details into prompts.
Student Equity and Transparency
- Students cannot be compelled to create accounts on non-University systems or services.
- Environments external to U of T that are to be used by students as part of course activities must be explicitly listed in the course syllabus.
- If an external virtual tutor is used as part of a course assessment, a viable alternative must be made available for students who do not consent to participation.
Accuracy
- Although uploading course materials into GPT Builder’s knowledge section will generally reduce the rate of hallucinations (i.e., factually inaccurate outputs), AI-generated text and code are not guaranteed to be correct.
- Students should be reminded that a virtual tutor should not be treated as an “authoritative source” – for example, students should continue to refer to the course syllabus directly for key dates and course policies.
Copyright and Intellectual Property
- Be mindful of what content is entered into your virtual tutor – never include confidential information or significant portions of intellectual property for which you do not have the rights or permissions.
- All content entered may become part of the tool’s dataset and may inadvertently resurface in response to other prompts.
- Copyright and intellectual property guidelines should be shared with students as well.
- See the U of T Library’s Generative AI tools and Copyright Considerations for more information.
Setting Up a Virtual Tutor using GPT Builder
The following is a step-by-step guide to setting up and configuring a virtual tutor using GPT Builder (a brief programmatic sketch follows the guide):
Step 1: Access GPT Builder
- Create or log in to your ChatGPT Plus account at https://chat.openai.com.
- In the left sidebar, click on “Explore GPTs”.
- Under “My GPTs”, click the “Create a GPT” button.
Step 2: Provide Initial Instructions
- In the “Create” tab of GPT Builder, you will be asked “What would you like to make?” – your initial prompt will start a dialogue that determines the specifics of your GPT’s purpose and behavior.
- For example, your initial prompt might be: “Create an expert tutor to help students in ECO101 understand basic microeconomic concepts”.
- GPT Builder will then respond with specific questions about the name, logo, and purpose of the GPT. Iterating several times with the GPT Builder through natural conversation is recommended to provide the model with a clear understanding of what it should be doing.
Step 3: Refine Your Instructions
- Switch to the “Configure” tab for more advanced customization options.
- Under “Instructions”, review the virtual tutor instructions generated by your dialogue with GPT Builder to ensure that they are consistent with your instructional objectives. For example, if you want your chatbot to act like an expert tutor that helps students build their conceptual knowledge, consider adapting Lilach and Ethan Mollick’s “Updated Tutoring Prompt”.
- You may also want to add to the instructions; e.g., “Do not answer any questions or engage in dialogue about any topic other than…”
Step 4: Upload Knowledge Files
- Under “Knowledge”, you can upload up to 20 files containing information you want your GPT to reference. For example, upload lecture slides and recorded transcripts, spreadsheets, problem sets, etc. Your GPT will use this information to provide more accurate and contextual responses.
- Be aware that your GPT can potentially be accessed by any user, with query results exposing your intellectual property outside of the University – evaluate if this aligns with your personal view on sharing course or domain related materials.
- Do not include any copyrighted material (including full-text library licensed e-resources) without authorization.
- Materials licensed under Creative Commons are generally acceptable to include with attribution (e.g., CC-BY-SA), though there may be some exceptions depending on the specific CC license – refer to the U of T Library’s Generative AI tools and Copyright Considerations for more information.
- Only include material that does not contain any sensitive personal information about individuals.
- In the “Instructions” section of GPT Builder, include: “When referencing materials from the ‘Knowledge’ section, provide a full citation to ensure proper attribution.”
Step 5: Enable Relevant Capabilities with Caution
- In the “Capabilities” section, you have the option to enable “Web Browsing”, “DALL·E Image Generation”, and “Code Interpreter & Data Analysis”.
- Disabling “Web Browsing” will prevent the virtual tutor from accessing up-to-date information on the internet that is not included in its training data or your uploaded content. This may or may not be desirable, depending on your objective.
- Be aware that the “Code Interpreter & Data Analysis” capability will allow users to download your uploaded content in its native format.
Step 6: Configure Data Sharing Settings
- In the “Configure” tab of GPT Builder, scroll down to “Additional Settings”
- You may uncheck the box for “Use conversation data in your GPT to improve our models”. This will prevent the content of user conversations with your virtual tutor from being used by OpenAI for model training and improvement.
- It is important to note that the privacy settings of your virtual tutor users will take priority over the setting above: if a user chooses to share their chat history to improve model training and development, this will expose their own chat history with your virtual tutor regardless of the setting you choose.
Step 7: Test and Refine
- Use the “Preview” panel to test out your virtual tutor and see how it responds.
- Iteratively update your instructions and settings based on its performance.
- Try some prompts that purposely attempt to “trick” the chatbot into saying bad things, making things up, or going off topic to see how it responds.
- Monitor for potential biases or inaccuracies and refine your instructions or uploaded files if necessary. For example, you might notice through testing your virtual tutor that a lecture transcript you uploaded contains errors.
Step 8: Publish Your Virtual Tutor
- Once finalized, click the “Save” button and choose your sharing settings.
- Selecting the “Anyone with the link” option will allow students to access your chatbot but won’t enable public browsing.
- Selecting “GPT Store” will publish your virtual tutor to OpenAI’s public “Explore GPTs” page.
- Click “Confirm” to complete the publishing process.
This guide was inspired by Custom GPTs at MIT Sloan: A Comprehensive Guide, updated with current information, and adapted to the University of Toronto context.
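For instructors who are comfortable with code, roughly the same kind of tutor can also be sketched programmatically. The example below is a minimal, hypothetical illustration using OpenAI’s Assistants API rather than the no-code GPT Builder workflow above: the file name, instructions, and model are placeholders, exact SDK method names and parameters may differ between versions, and the same privacy, copyright, and equity considerations apply because this approach also falls outside the University’s walled garden.

```python
# Hypothetical sketch only: a course "virtual tutor" built with the OpenAI Python SDK
# instead of the GPT Builder interface. File names, instructions, and the model are
# placeholders; SDK method names and parameters may differ between versions.
from openai import OpenAI

client = OpenAI()  # assumes an OPENAI_API_KEY environment variable is set

# Upload a course file for the tutor to reference (analogous to GPT Builder's "Knowledge" files).
# Only upload material you have the rights to share and that contains no personal information.
course_file = client.files.create(
    file=open("eco101_lecture_notes.pdf", "rb"),
    purpose="assistants",
)

# Create the tutor with instructions analogous to GPT Builder's "Instructions" field.
tutor = client.beta.assistants.create(
    name="ECO101 Virtual Tutor",
    model="gpt-4o",
    instructions=(
        "You are an expert tutor for ECO101 (introductory microeconomics). "
        "Guide students with questions and hints rather than full solutions, "
        "cite the uploaded course materials when you use them, and do not "
        "answer questions unrelated to ECO101."
    ),
    tools=[{"type": "file_search"}],
)

# Start a conversation thread and ask a sample question, attaching the course file.
thread = client.beta.threads.create()
client.beta.threads.messages.create(
    thread_id=thread.id,
    role="user",
    content="Can you help me understand price elasticity of demand?",
    attachments=[{"file_id": course_file.id, "tools": [{"type": "file_search"}]}],
)

# Run the tutor on the thread and print its most recent reply.
run = client.beta.threads.runs.create_and_poll(thread_id=thread.id, assistant_id=tutor.id)
if run.status == "completed":
    messages = client.beta.threads.messages.list(thread_id=thread.id)
    print(messages.data[0].content[0].text.value)
```

As with a GPT published through GPT Builder, test such a tutor with adversarial and off-topic prompts before sharing it with students, and keep the privacy, equity, and copyright considerations above in mind.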
How Generative AI Works
Are you interested in learning how generative AI works? A good introduction is provided by the Schwartz Reisman Institute for Technology and Society’s “What are LLMs and generative AI? A beginner’s guide to the technology turning heads”. You could also consider:
The responses you receive from a generative AI tool depend on the prompts you enter, and the further refining of these prompts, which takes practice. As Ethan Mollick said, “The lesson is that just using AI will teach you how to use AI.” (Working with AI: Two Paths to Prompting) To get started, we recommend you consider the following.
Be clear about what you want. Include detailed information in your prompt, including the desired format: “Write a paragraph about…”, “Create an image containing…”. Suggest a particular style (e.g., an academic essay or lab report) and spell out the specific content you want included (e.g., provide an outline or ordered steps in the prompt). A brief scripted sketch of these principles follows this list.
- If you’re not sure how to describe the style you want to emulate, the Wharton School at the University of Pennsylvania suggests pasting in a text example you like and asking the tool to describe the style. Use that description in your own prompt for style.
- To learn more about prompt writing, see our Copilot Tool Guide under “How can I prompt with Copilot?”
Be critical.
- Does the tool output meet your needs? What additional information is required? Generative AI is an interactive tool. Try different options and prompts to gauge the results, clear the prompt screen and try again. You will learn to refine your prompts and better discern what is most effective with practice.
- Generative AI tools can provide quick results that may appear correct, but looks can be deceiving. Tools such as ChatGPT can produce hallucinations or misleading and factually incorrect text. As with any text or visual analysis, we need to examine the results with a critical eye.
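For those who prefer to experiment in a script rather than a chat window, the same principles (state the task, format, style, and constraints explicitly, then read the output critically and iterate) can be sketched with a simple API call. This is a hypothetical illustration only: the model name, outline text, and prompt wording are placeholders, and scripted access of this kind is not part of U of T’s protected Copilot environment.

```python
# Hypothetical sketch: the "be clear" and "be critical" advice above expressed as a
# scripted chat request, so the prompt can be saved, compared, and refined over time.
# The model name, outline, and prompt wording are placeholders.
from openai import OpenAI

client = OpenAI()  # assumes an OPENAI_API_KEY environment variable is set

outline = "1. Supply and demand  2. Elasticity  3. Market equilibrium"

# Be clear: name the task, the desired format, and the style explicitly.
prompt = (
    "Write a 150-word study summary of the following lecture outline in the style "
    "of a lab report abstract, as three short paragraphs (purpose, key concepts, "
    "open questions). If a point is ambiguous, flag it as an open question rather "
    f"than guessing.\n\nOutline: {outline}"
)

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": prompt}],
)

# Be critical: read the output closely, check it against the source outline,
# then adjust the prompt and run it again.
print(response.choices[0].message.content)
```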
GenAI Sessions and Workshops
Explore: GenAI Workshop Series, GenAI Reading Group, GenAI Dialogue Series, or past webinars below.
Getting Started with Generative AI Tools at U of T
Every second Tuesday from August 27th to December 7th, 2pm-3pm
Join us for an interactive virtual drop-in session designed for U of T instructors and staff. We will provide an overview of the University’s approved generative AI tools with hands-on demonstrations. You can expect:
- A comparison of the various approved generative AI tools at the University of Toronto.
- A walkthrough of the secure login process for accessing U of T–approved generative AI tools.
- Live demonstrations of key features across various generative AI platforms.
- Open Q&A to address your questions, concerns, or ideas.
Drop in anytime between 2:00 PM and 3:00 PM—stay for a few minutes or the full session!
Upcoming GenAI Sessions and Workshops
Visit CTSI Events for more U of T teaching and learning events. If you have an event you would like to promote on this calendar, please complete this online form.
Plug & Play GenAI: Integrate GenAI Literacy into Your Course Today
August 14, 11am-12pm
Curious about how to help your students navigate the world of generative AI? Join us for a hands-on introduction to the new GenAI Literacy Course Modules – a flexible, ready-to-integrate set of resources designed to boost student AI literacy across disciplines at the University of Toronto. This session will walk you through the module content and demonstrate options for importing and customizing the modules in your Canvas (Quercus) course shell, followed by an open office hour to help you integrate them with your course activities and assessments.
Fall Series: Adapting Teaching & Assessment with GenAI in Mind
This three-part series is designed to help instructors navigate the evolving landscape of generative AI (GenAI) in university teaching and learning. Grounded in the University of Toronto AI Task Force’s principles and informed by current research, the series emphasizes human-centred and integrity-driven approaches.
Each session builds on the last, supporting instructors in adapting their practices to foster meaningful learning, uphold academic integrity, and develop AI literacy.
Part 1: What’s Next with GenAI?: Practical Considerations for Teaching and Learning
September 18, 1pm – 2 pm (online) – Completed
This session introduces participants to the key findings and recommendations of the University of Toronto AI Task Force Report’s Teaching and Learning Working Group, highlighting the structural changes emerging across teaching and learning due to generative AI. This report provides a comprehensive framework for understanding how AI is reshaping educational practices and offers evidence-based guidance for U of T’s institutional response.
Participants will explore current student practices with GenAI, examining both the opportunities it presents for personalized learning and the challenges it raises regarding academic integrity, effortful learning, and course competency development. The session will help instructors reflect on how these changes affect their teaching and will provide practical guidance for aligning course goals and classroom practices with GenAI’s evolving role. Participants will also explore how to build AI literacy into their curriculum so that students can use AI tools thoughtfully and responsibly.
Part 2: Adapting Assessments with Generative AI in Mind
October 21, 11am – 12:30pm (online) – Completed
In the second part of the series, participants will reflect on how assessment practices can evolve to provide meaningful checkpoints on student learning in the age of GenAI. Through hands-on activities, instructors will consider how to adapt existing assignments, ensuring that feedback and evaluation are intentionally aligned with both explicit and implicit learning outcomes and clear success criteria.
Additionally, the session will explore approaches for upholding academic integrity, including designing assessments that foster independent thought, require authentic demonstration of learning, and measure human-centred skills. Participants will discuss the importance of transparency in communicating expectations and norms for AI use, and will consider both responsible integration of AI tools into assessments and strategies to address risks related to unauthorized use. By the end of the session, instructors will be better equipped to support authentic and equitable assessments in their own teaching contexts.
Part 3: Developing AI-Literacy Activities for Meaningful Learning
November 20, 12pm – 1:30 pm – Completed
The final session focuses on equipping students with the AI literacy skills needed to engage with AI technologies critically, ethically, and effectively. Participants will discuss actionable, incremental ways to develop and integrate discipline-specific AI literacy activities into courses, while ensuring effective connections to assessments and broader course goals.
The workshop will address how to help students become both informed evaluators and responsible users of these technologies, guiding them to critically evaluate AI-generated content and understand the ethical and social implications of AI systems. By integrating AI literacy activities throughout the curriculum – whether students are actively using AI tools, analyzing AI outputs, or examining AI’s broader societal impact – instructors can create meaningful and equitable learning experiences that foster students’ future readiness in both professional and personal lives.
August: Tune into Teaching Series
CTSI’s Tune into Teaching series is for instructors preparing to teach in the upcoming academic year. These sessions are open to all new and returning faculty and librarians.
Critically Engaging with AI Literacy in Teaching and Learning
August 25, 1:30pm-3pm (online)
We will unpack key concepts and skills necessary for AI literacy and consider how it relates to traditional information literacy principles. Through a range of practical examples, participants will explore strategies to promote critical thinking and ethical use of AI tools with their students and consider how to incorporate AI literacy into their own learning activities and assessments. Participants will also have opportunities to engage in small group discussions to share experiences and collaboratively explore implementation strategies with colleagues from across disciplines.
This webinar is ideal for those in the U of T teaching and learning community across all disciplines who are interested in enhancing their own AI literacy skills and integrating AI literacy thoughtfully into their teaching practices.
Top Things to Know about Teaching and Generative AI at U of T
August 26, 1pm-2pm (online)
Generative AI is reshaping teaching and learning, but navigating this landscape at U of T doesn’t have to be overwhelming. This focused one-hour online session gives you exactly what you need to get started: which AI tools are officially supported, University guidelines, and practical strategies for setting clear expectations with your students.
You’ll walk away with concrete examples of AI-integrated assessments, ready-to-use conversation starters for discussing AI with your students, and a clear understanding of the resources available to support your teaching. Whether you’re AI-curious or already experimenting, this session provides the institutional knowledge and practical tools to help you make informed decisions about generative AI in your courses.
Generative AI Sandbox: Exploring Course Uses for Instructors
August 27, 1pm-3pm (in-person)
Join us for a hands-on workshop exploring how instructors can make use of generative AI tools available within the University of Toronto environment to support teaching and learning. This workshop is intended for those who are intrigued by AI but may feel cautious about trying it out in their courses. We will demonstrate and practice with AI tools available to instructors and students for use in course work in the fall 2024 semester. Learn to navigate AI platforms while protecting your and your students’ personal data, as we generate examples and explore use cases to support your course learning objectives.
“Can I use this content in this tool?”: Navigating the Nuances of Copyright and Generative AI
August 28, 2pm-3pm (online)
This workshop will introduce instructors to questions that arise when integrating approved and non-approved GenAI tools into teaching, using the metaphor of the “Walled Garden” to frame U of T’s protected digital environment. From there, the workshop will segue into a more focused discussion on copyright, using case studies to guide the conversation.
By participating in the workshop, instructors will gain a better understanding of the copyright and licensing concerns that govern the use of GenAI tools, as well as the library eResources that are available to support GenAI use.
Join the CTSI GenAI Reading Group
This reading group is part of the broader CTSI GenAI in Teaching and Learning Commons, a Microsoft Teams online community where you can share insights, access resources, and discuss current approaches and challenges related to generative AI in education.
GenAI Dialogue Series
The GenAI Dialogue Series offers a space for faculty and staff to exchange ideas, reflect on experiments, and share both successes and challenges with generative AI in teaching. Each 30-minute session, co-led by an instructor or staff colleague, focuses on a specific theme.
The GenAI Works in Progress Series
This is a CTSI/DLI programming effort designed to encourage an ongoing, open institutional-level conversation about generative AI in teaching and learning and to create a space for experimentation, sharing, and problem-solving. These one-hour presentations are targeted at a broad teaching and learning audience, where presenters can share current questions, ideas, inquiries, or works in progress with a community of peers. Understanding that many of us are still in the early stages of navigating the realm of generative AI, we want to emphasize that these sessions are not intended to showcase definitive answers or practices; rather, we are interested in the questions, challenges, and learning currently being explored.
There are currently no upcoming sessions scheduled. Please see below for recordings of past workshops and presentations from the series.
Past CTSI GenAI Workshops
Expand each accordion panel below to review past content. Visit the Past CTSI Workshop Recordings page for more materials.
U of T Teaching Examples
Assessment & Activity Examples
AI-Integrated Assessments
Jessica Hill, Molecular Genetics
Noa Yaari, History
Nazanin Khazra, Economics
Nazanin Khazra, Economics
Robert Bentley, Kinesiology & Physical Education
Morris Manolson, Dental Sciences
Alexandra MacKay, Rotman Management
Kenneth Yip, Cell & Systems Biology
Alexandra MacKay, Rotman Management
AI-Integrated Learning Activities
Alexandra MacKay, Rotman Management
Kenneth Yip, Cell & Systems Biology
Instructor Profiles

Elaine Khoo, Associate Professor, Teaching Stream; Centre for Teaching and Learning, English Language Development Support Coordinator, UTSC
Supporting ELL Students

Steve Easterbrook, Director, School of the Environment; Professor, Department of Computer Science, Faculty of Arts & Science, UTSG
Syllabus Statements and GenAI

Dan Zingaro, Associate Professor, Teaching Stream and Associate Chair (CSC), Department of Mathematical and Computational Sciences, UTM
GenAI in Programming

Noa Yaari, Communication Instructor, Institute for Studies in Transdisciplinary Engineering Education and Practice (ISTEP), UTSG
GenAI and Creativity

Jessica Hill, Associate Professor, Teaching Stream; Department of Molecular Genetics, Temerty Faculty of Medicine, UTSG
GenAI and Critical Thinking

Nazanin Khazra, Assistant Professor, Teaching Stream, Department of Economics, Faculty of Arts & Science, UTSG
Supporting Deeper Engagement

Robert Bentley, Assistant Professor, Kinesiology & Physical Education, UTSG
Enhancing Critical Reading & Interpretation

Alexandra MacKay, Professor, Teaching Stream, Joseph L. Rotman School of Management, UTSG
Supporting Ethical AI Use
Submit an Assessment or Learning Activity
We are asking U of T instructors how they engage with generative AI tools in their teaching. As this section grows, we will include brief profiles and examples of assessments and learning activities.
If you would like to share an assessment that uses generative AI, or an example of how you and your students engage with generative AI in your course, please complete this online form.
There are a growing number of generative AI tools available, and the capabilities of these tools are evolving at a rapid rate. Currently, Microsoft Copilot is the recommended generative AI tool to use at U of T. When a user signs in using University credentials, Microsoft Copilot conforms to U of T’s privacy and security standards (i.e., does not share any data with Microsoft or any other company). In addition, Contact North AI Tutor Pro and Contact North AI Teacher’s Assistant Pro conform to U of T’s privacy and security standards. Please be aware that any other generative AI tool that has not been vetted for privacy or copyright concerns should be used with caution within a U of T course or organization. If you would like to learn more about the tools available in U of T’s academic toolbox, please visit ARC’s New Tools.