Teaching with Generative AI

This resource offers guides, explainers, and idea generators related to planning for and integrating generative AI into courses, structured according to four key teaching phases: before the course starts, during the first week, week-to-week, and at the end of the term.

Navigating Generative AI: Six Suggestions for Every Instructor 


The following suggestions have been prepared by the Centre for Teaching Support & Innovation based on engagement with U of T instructors and current recommended practice.  As you consider the impacts of generative AI on your teaching, you may wish to respond by: 

  1. Clarifying expectations with your students by discussing and providing guidelines around the use of generative AI tools in your course. Add clear language to your syllabus and assignments regarding allowable use.
  2. Preparing for a conversation with your students about responsible use of generative AI for learning in relation to your course and discipline.
  3. Rethinking both learning outcomes and corresponding assessments with the potential impacts of student use of generative AI in mind. Take time for critical consideration of teaching with generative AI.
  4. Talking to your TAs about expectations for use of generative AI in relation to their role and to your expectations for appropriate use/non-use by students in the course. Consider sharing TATP’s TA-focused resource on generative AI with your TAs.
  5. Familiarizing yourself with tools that align with the University’s privacy and data protections. If leveraging the capability of generative AI, you can use Microsoft Copilot in Protected Mode to protect your data and privacy.
  6. Exploring applications of generative AI tools and their outputs to gain a better understanding of their capabilities and limitations. There are a number of workshops and resources available through the Centre for Teaching Support & Innovation.  


For more information:
 
https://ai.utoronto.ca/faculty/ 

This content is available for download in several formats for use in a range of instructor support contexts: PDF, Word, and PowerPoint.

Introduction to Generative AI

Before deciding whether and how to integrate Generative AI into your course, it is essential to have a solid understanding of its capabilities in supporting learning, as well as the challenges it presents. This section provides a basic overview of generative AI as it relates to teaching and learning.


What is Generative AI?

Tools that leverage generative artificial intelligence (GenAI) and large language models (LLMs) to generate new code or text (e.g., Copilot, ChatGPT, Claude, Gemini, etc.) are becoming increasingly available and are likely to have long-term impacts on what and how we teach. This section is intended to offer some guidance around how to approach the use of such tools in your teaching practice. 

Generative AI refers to artificial intelligence that can generate new content, including text, images, and other media, based on predictive modeling. It uses machine learning algorithms, specifically neural networks, to process and learn from large datasets. Large language models (LLMs) are one class of generative AI, and they have the ability to generate human-like text. 

You can learn more about the University of Toronto’s institutionally-approved generative AI tools, including Microsoft Copilot, by visiting the Generative AI Tools section of this website.  

What Opportunities and Challenges Does Generative AI Present for Classroom Instruction & Learning?

Higher education has faced similar disruptions with previous technology innovations, including calculators, Google search, and Wikipedia. While these innovations can be disruptive to our practices of teaching and assessment, incorporating them into our teaching practice is also an opportunity to prepare our learners to live and thrive in a changing world. When intentionally leveraged for classroom instruction, generative AI technologies may also provide new possibilities for enhancing accessibility and engagement for students with varied learning needs. 

When integrating GenAI into your courses, there are several opportunities to foster an inclusive learning environment:

  • Encourage AI literacy and future readiness. As generative AI tools develop in capabilities, they will continue to shape what distinctive human skills are prioritized across disciplines and fields of work. By incorporating opportunities within courses to explore, use, and assess generative AI tools, students are better prepared to strategically engage with and critically reflect on these emerging technologies. 
  • Encourage metacognition, creativity, and critical thinking. By designing carefully constructed assessments and forms of active engagement, students have the opportunity to explore different viewpoints, self-reflect, and engage in analysis and knowledge synthesis.
  • Address barriers to equity and accessibility. Generative AI tools can be leveraged to provide multiple options for motivating and engaging learners, for representing information, and for inviting learners to express and communicate. This can create learning experiences that are personalized to students’ diverse needs and abilities.

When integrating GenAI into your courses, there are several considerations to ensure equitable and responsible use, including:

  • Availability. While many generative AI tools are currently freely available, their availability could change at any time. For any third-party software that is not approved by the University or your Division, there are several considerations related to privacy, security, and student intellectual property that should be addressed before asking your students to use non-approved generative AI tools. 
  • Accuracy and bias. Text created by generative AI technology may be biased and may not be correct. 
  • Academic integrity. The University is discouraging the use of tools that claim to be able to detect AI-generated text. See Generative Artificial Intelligence in the Classroom: FAQs for more information. 
  • Privacy and security. A version of Microsoft Copilot is currently available to the public (and U of T students); however, the public version does not have full privacy and data protections in place. U of T has access to the enterprise edition of Microsoft Copilot, which, unlike the public version, conforms to the University’s privacy and data protections. Note that other publicly available generative AI chatbots like ChatGPT may not offer such privacy and data protections. For details on institutionally-approved tools, please visit the Generative AI Tools section of this website.  
  • Copyright and intellectual property. It is important to be mindful of what content is entered into generative AI platforms that do not have institutional data protections in place. Never input confidential information or intellectual property for which you do not have the rights or permissions. All content entered may become part of the tool’s dataset and may inadvertently resurface in response to other prompts. See the U of T Library’s Generative AI tools and Copyright Considerations for more information.  

Before the Course Starts

This section offers considerations on how to intentionally begin course planning with generative AI in mind. You may want to revise course-level learning outcomes and assessments, as well as develop a communication plan with students and teaching assistants (TAs).


Develop or Revise your Learning Outcomes

In light of how students could use generative AI, you may want to revise the learning outcomes from previous iterations of the course. Learning outcomes are statements that describe the knowledge or skills students should acquire by the end of a particular class, course, or program. Rather than be unchanging, learning outcomes may shift to address the larger context in your discipline: the requirements of follow-up courses, potential student career paths post-graduation, and emergent digital literacy skills needed for future readiness, where AI will be increasingly used as a partner and collaborator. 

It is good practice to specify learning outcomes that are meaningful to students’ post-educational goals and overall skill development. This may help to stimulate student interest and maintain motivation across the course. To guide your decision-making process on whether and how you will engage with generative AI, you may reflect on one or more of the following questions: 

  • What human-centered skills do I want my students to develop, and how may I articulate them in the form of a learning outcome? 
  • Can students’ use of generative AI tools align with my course learning outcomes and teaching philosophy, and if so, how?
  • How may I leverage generative AI tools to facilitate deeper thinking?
  • What digital literacy skills do I want my students to develop?

Modifying learning outcomes to foreground human-centred skills 

While learning outcomes vary across courses, there are some cross-disciplinary cognitive skills that you may consider relevant to your students’ development. According to Bloom’s Revised Taxonomy of Active Verbs, an educational framework for categorizing learning objectives, there are six levels of skill and complexity that instructors may measure in relation to student learning. “Bloom’s Taxonomy Revisited” (see Figure 1), created at Oregon State University, offers a framework that reconsiders how to define each of the six levels of learning in your disciplinary context, given the capabilities of AI tools. For instance, at the “analysis” level, students critically examine and break information into parts by identifying motives, causes, and relationships. Generative AI, however, is currently proficient in many analysis-related tasks, including comparing and contrasting and inferring themes. Given the analytic capabilities of generative AI, you may want to update how you frame learning outcomes at this level, so that you are pointing to human-centered skills that do not invite overreliance on AI tools.

To verify whether your course-level learning outcomes are measuring distinctive human skills, you may wish to reflect on one or more of the following questions:   

  • Does the learning outcome encourage students to interpret and relate to authentic problems, decisions, and choices?  
  • Does the learning outcome encourage students to engage higher-order thinking skills (critical analysis, synthesis, evaluation), which generative AI cannot fully engage in?  
  • Does the learning outcome encourage the development of robust conceptual knowledge, i.e., the “why” behind the “what”? 

Figure 1: Bloom’s Taxonomy Revisited Version 2.0, Oregon State University (licensed under Creative Commons Attribution 4.0 International, CC BY 4.0). The figure presents the six levels of Bloom’s Taxonomy (Remember, Understand, Apply, Analyze, Evaluate, Create), pairing each level with related AI capabilities and distinctive human skills, and recommends whether to “Review” or “Amend” course activities and assessments at that level. While all course activities and assessments should be reviewed in light of AI capabilities, those at the Remember and Analyze levels may need more significant amendments.

You may consider making slight changes to clarify and foreground what distinctive human skills you are assessing, as shown in the following examples of analysis-level learning outcomes:

Example: Modifying Learning Outcomes

Pre-modified Example:

Learning outcomes that do not address distinctive human skills in relation to generative AI capabilities

Philosophy:
Students will analyze the differences between various types of knowledge (empirical, rational, testimony, revelation).

Medicine:
Students will compare and contrast how promotional health information and resources effectively and accurately present information to patient care.

Modified Example:

Learning outcomes that address distinctive human skills in relation to generative AI capabilities

Philosophy:
Students will critically evaluate and compare types of knowledge (empirical, rational, testimony, revelation) within ethical and practical contexts.

Medicine:
Students will assess the accuracy, reliability, and relevance of promotional health information and resources, identifying biases and areas for improvement in patient care.

In comparing the pre-modified and modified learning outcomes, the differences can be subtle but powerful. For example, in the “Medicine” example above, there is a distinct call-out that students will identify biases and areas for improvement in patient care. In doing this, the instructor foregrounds the human-centered skills that are valued in the discipline, regardless of whether students will engage generative AI tools in assessments.

Including learning outcomes that encourage AI literacy 

Given that generative AI is an emerging tool, developing AI literacy may be meaningful for students and an important part of their future skillset and entry into the workforce. By incorporating space for AI literacy skill development, students may be better able to use and critically reflect on generative AI technologies, even if they may not be able to develop AI models themselves (Laupichler et al., 2023: 2). To incorporate AI literacy into the course, consider following Universal Design for Learning (UDL) principles by explicitly clarifying how course-level learning outcomes relate to the topic. This approach will help students generate meaning around the course, thereby sustaining their effort and persistence. Rather than approach AI literacy as something that one either has or lacks, it can be helpful to consider how it relates to different categories of skills. Following Bloom’s Taxonomy, AI competencies can be categorized into four cognitive domains, organized from lower- to higher-order thinking skills. As an instructor, you may consider starting small, asking what discrete AI skills you could meaningfully integrate into the course in the form of a learning outcome, in alignment with the other course-level learning outcomes (Almatrafi et al., 2024): 

  • Know & understand: Ability to explain the basic functions of AI and how to use AI applications. 
  • Use & apply: Ability to use and adapt AI tools to achieve an objective.  
  • Evaluate & create: Ability to analyze the outcomes of AI applications critically. 
  • Navigate ethically: Ability to understand and judge ethical issues related to generative AI, such as privacy, bias, misinformation, and ethical decision-making. 

Example: AI Literacy Learning Outcomes

Philosophy: Students will critically analyze and evaluate the ethical implications raised by generative AI technologies.
Relevant AI literacy skill: evaluate & create, navigate ethically 


Medicine: Students will assess the potential benefits and limitations of using generative AI systems for medical diagnosis and treatment planning.
Relevant AI literacy skill: evaluate & create 


Statistics: Students will interpret AI model outputs and performance metrics in real-world applications.
Relevant AI literacy skill: use & apply, evaluate & create

Design or Revise your Assessments

Once you have revisited your course-level learning outcomes, the next step would be to design or revise your assessments, in alignment with the learning outcomes. In designing your assessments, you may wish to discourage or encourage generative AI, depending on whether you find that students’ use of generative AI tools would disrupt or support their achievement of the course-level learning outcomes.

If you plan to discourage generative AI use, you may want to:
  • Require citations and check for accuracy. Requiring specific citations and page numbers may discourage students from relying on generative AI.  
  • Focus assessment materials and prompts on topics local to the course. This approach can help to promote student engagement, and it also makes it more difficult to generate a relevant answer using generative AI. Also make clear that tests and exams will require mastery of work completed for assignments. 
  • Add drafts or revise-and-resubmit components to assessments. By scaffolding assessments with iterative feedback, students will be encouraged to revise assignments in direct relation to peer or instructor feedback.  

Example: Health Promotion Assignment

Assignment learning outcome:  Students will assess the accuracy, reliability, and relevance of promotional health information and resources.

To discourage the use of generative AI, students will be asked to:

  • Specifically engage with copyrighted promotional health information from lectures 3 and 4 in order to analyze the material’s accuracy, reliability, and relevance to patient care.  
  • Provide citations in APA format with page numbers in order to demonstrate how they developed their analysis. 
  • Undertake two rounds of peer-reviewed feedback prior to the deadline. 
 
If you plan to encourage generative AI use, you may want to:
  • Authorize students to use generative AI as an educational tool in specific, constructive ways. Consider allowing students to use AI tools as a study aid and reviewer, to debug code and explain errors, to brainstorm ideas and arguments, or to edit writing.  
  • Include written reflections on how students used or engaged with the tool in assessments. For instance, you may ask “What was your purpose in using GenAI to complete this assignment?” (Dobrin 2023) and incorporate an evaluation of that reflection into the final grade. 
  • Incorporate citations when GenAI is used. Consider asking students to cite generative AI tools when used, which could include the AI prompt used. You may consider sharing the University of Toronto Libraries resource, “Citing Artificial Intelligence (AI) Generative Tools (including ChatGPT).”
 
Examples of using generative AI as an educational tool in assessments:
  • Critique and improve AI-generated outputs. Consider asking students to critically analyze, fact-check, and evaluate outputs generated by AI tools, including texts, equations, or code. You may provide a prompt and the output text, or you could show students how to prompt the tool to generate a response. Individually or with peers, students can then assess and improve the response for accuracy, depth, and nuance. For example, you may ask students to answer: What important knowledge did they learn from the class that the AI missed? What is a more nuanced or correct answer or explanation, compared to the GenAI output? What aspects of the writing are compelling, misleading, or redundant?
  • AI-assisted problem-solving. For problem-based assessments, you may consider asking students to leverage generative AI tools to generate potential solutions, brainstorm ideas, and receive clarification. You may wish to invite students to use generative AI to write or expand on code or synthesize data sets. From there, you could prompt students to demonstrate their own understanding by evaluating and refining AI-generated content. 
  • AI-assisted writing. For writing assignments, you may consider allowing students to use generative AI tools to help with ideation, outlining, or drafting. However, you may devote a portion of the evaluation to having them substantially revise and refine the AI-generated content, to reflect their critical thinking and writing skills.
  • Adding creative elements to assignments. You may consider inviting students to use generative AI tools to add more creative elements to their work, such as using generators for images that they add to slide presentations, or to include infographics. Students could also use AI text generators to create draft scripts for videos in which they demonstrate key learning outcomes. Harnessing generative AI in this way can provide opportunities for students to demonstrate their learning in different modalities. 

Example: Data Analysis Assignment

Assessment learning outcome: Students will visualize and interpret data using appropriate statistical methods. 

To encourage Generative AI use, students will be asked to:  

  • Draft initial code for data analysis and visualization, using Copilot for assistance (see the illustrative sketch after this list).
  • Identify potential errors or issues with the coding assistance received from Copilot, and suggest possible solutions, alternative approaches, or areas to investigate. 
  • Reflect on how they engaged with generative AI tools, what benefits or challenges came up, and how this use augmented their problem-solving skills.
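
As a purely illustrative aid, the following minimal Python sketch suggests what an initial, Copilot-assisted draft for a simple visualization task might look like, together with comments noting issues students could raise when critiquing it. The dataset, column names, and library choices (pandas and matplotlib) are assumptions made for illustration only and are not part of the assignment.

  import pandas as pd
  import matplotlib.pyplot as plt

  # Illustrative synthetic data standing in for a course dataset.
  df = pd.DataFrame({
      "study_hours": [2, 4, 6, 8, 10, 12],
      "exam_score": [55, 62, 70, 74, 81, 88],
  })

  # Draft visualization: a simple scatter plot of the two variables.
  df.plot.scatter(x="study_hours", y="exam_score", title="Exam score vs. study hours (draft)")
  plt.show()

  # Issues a student might flag when critiquing a draft like this one:
  # - The axes carry no units, and the title does not describe the sample.
  # - The relationship is only shown visually; it is never quantified
  #   (for example, with df["study_hours"].corr(df["exam_score"])).
  # - Missing values and outliers are not checked before plotting.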

Revising assessments to reduce learning barriers

Regardless of whether you choose to allow the use of generative AI in assessments, you may consider using Universal Design for Learning (UDL) principles to proactively enhance access and build student agency in their learning. This does not eliminate the need for specific academic accommodations for students with disabilities. Rather, this closes the gap between student needs and instructional design, and offers a ramp to better facilitate academic accommodations for students with disabilities. When designing or revising your assessments with generative AI in mind, consider engaging in one or more of the following UDL strategies to reduce learning barriers:  

  • Design options for various forms of assessments with generative AI. Given that there is no one means of action and expression that is optimal for all learners, you may wish to provide space for assessment choice, including options that involve generative AI usage and those that do not. For instance, students may be able to show what they know through written submissions or visual representations, while engaging with pertinent tools.
  • Scaffold assignments so that students are working towards a final product for submission. Whether or not generative AI is part of the assessment, this effective approach benefits the students’ writing and learning and also creates authentic conditions that are more likely to deter unauthorized use of generative AI. 
  • Align with real-world tasks. Encourage students to demonstrate their knowledge and skills in a way that is significant, meaningful, and aligned with real-world tasks. Ask real questions that stem from current debates in your discipline, and let students know that you expect engaged critical thinking that is appropriate for the level of your students and your discipline. Encourage speculation based on evidence and reasoning, not just compilation of existing information or expression of unsupported personal opinion.  
  • Be explicit about expectations. Clarify and decode instructions and expectations by providing clear cues and prompts. You may provide rubrics and grading criteria to ensure students know what is expected of them in the assessment, and whether and when generative AI tools can be used. This may also reduce inappropriate or misguided peer-to-peer communication. See the “week to week” section for further examples on ways to provide clarity and purpose to students about assessments as they are introduced. 

Develop a Communication Plan with Students and TAs

Once you have decided on your course-level learning outcomes and assessments, you can determine your course policies and plan how you will communicate them. Your students will have varying levels of knowledge of generative AI, and many will look to you for guidance on what they are and are not allowed to do and how these tools will impact their learning.  

The University of Toronto has created sample statements to include in course syllabi and course assignments to help inform students what AI technology is, or is not, allowed in the course. Visit the resource Generative Artificial Intelligence in the Classroom: FAQs to access the most up-to-date sample statements. In addition to providing an explicit course policy statement on generative AI in your syllabus, you may want to intentionally plan how you will share these expectations with students in class and on Quercus. Consider one or more of the following strategies:  

  • Engage students in discussion about the policy, explaining the rationale and how it relates to course-level learning outcomes. 
  • Identify best practices for how to use and cite the use of generative AI tools, if they are permitted. 
  • Identify relevant campus resources for students to support their learning skills development while using generative AI, thereby encouraging academic integrity. Consider adding these student-facing resources to your syllabus. 
  • Develop a plan for how you will communicate to your course teaching assistants their roles and responsibilities: How will they evaluate assessments and communicate course policies regarding generative AI use? 

Example: GenAI Policy Syllabus Statement

A goal in this course is to teach students how to express their knowledge and skills around [insert course topic], which takes place in in-class interactions, online discussion forums, and in written essays and exams. Generative AI can serve as a useful resource for students, by providing tutoring support for writing and analysis.  

In this course, we will use Microsoft Copilot to engage in critical thinking and writing activities and assessments. Students are expected to use Microsoft Copilot for specific aspects of writing assignments and must include with every assignment a short reflection on how they made use of the generative artificial intelligence tool in the development of their assignment. No other generative AI technologies are allowed to be used for assessments in this course.  

Students may not use artificial intelligence tools when taking tests in this course, but students may use generative AI tools for other assignments as indicated. If you have any questions about the use of AI applications for course work, please speak with the instructor.

During the First Week

After developing your course-level learning outcomes, assessment structure, syllabus, and communication plan, you are ready to begin preparing for your first class. In this first week, it can be especially useful to ensure that students are aware of course policies and expectations regarding generative AI.  


Communicate with Students about Expectations regarding GenAI Use

The first day is an important opportunity to model how you hope and expect that classes will proceed throughout the course. Building a sense of community through active participation around course policies will help set a tone that supports responsible use of generative AI. 

Following Universal Design for Learning (UDL), you may consider providing options for recruiting attention and engagement, thereby optimizing what is relevant, valuable, and meaningful for each learner. In consideration of this, you may want to explicitly communicate with your students about generative AI expectations, drawing on one or more of the following strategies:  

  • Create a community agreement. Ask students to collaboratively reach consensus about generative AI in course activities and assessments. See the CTSI resource for more suggestions on building these community agreements. 
  • Explain what your policy is and why. Rather than reiterate the language of the policy statement on your syllabus, you may wish to elaborate on why you chose this policy. How does that policy encourage students to effectively reach the learning outcomes of the course? 
  • Facilitate debates. Arrange a classroom debate about generative AI as it relates to your course topics and/or themes.  
  • Create space for discussion and reflection around generative AI policies and expectations. Prior to explicitly discussing the policies, you may wish to create space for a broader discussion on learning and generative AI. Active learning activities like low-stakes writing, reflective writing, think-pair-share, and jigsaws can be effective ways of generating and recording ideas.  

Example: GenAI Policy Discussion Prompts

To initiate collaborative reflection around generative AI policies and expectations, the first week of class may begin with an ungraded discussion exercise, centred around the following guiding questions: 

  1. Have you used generative AI? For what purposes?  
  2. What is your familiarity level with generative AI tools?  
  3. What is our course policy on generative AI use, and what do you think is the reasoning behind it, given the course learning outcomes?  
  4. What is an example of when you were impressed with/disappointed in output from generative AI?

Demonstrate How to Responsibly Use GenAI Tools

If you are choosing to encourage or allow students to use generative AI, you may wish to begin the course with a demonstration of how to use relevant institutionally approved tools. While the interactive, chat functions of generative AI can be engaging for students, making the most of these tools can require time and patience for both instructors and students. In line with Universal Design for Learning (UDL), students may benefit from having options for how they engage in information processing and visualization.

Given that the use of generative AI tools may be relatively new for some, and that all learners have diverse abilities in summarizing and categorizing information, you may wish to consider one or more of the following: 

  • Offer step-by-step demonstration of prompt writing. Rather than only offer a general introduction to generative AI, you may wish to model how students could responsibly use relevant tools for upcoming assessments in the course. You may want to spend focused time on showing what makes an effective prompt. A prompt is natural language text describing the task that an AI agent or chatbot should perform. Prompt writing (sometimes known as prompt engineering) is the process of structuring an effective prompt that can be interpreted and understood by the AI system. To learn more about prompt writing, see our Tool Guide under “How can I prompt with Copilot?” 
  • Create collaborative space for experimentation and feedback. Students have varying levels of experience with generative AI tools, and the first class can be a great space to gauge their level of familiarity. In addition to providing a demonstration, you may wish to encourage your students to share helpful tips and reflections as they independently experiment with the tools.  
  • Draw attention to learning resources that support the responsible use of generative AI. When students know what resources are available to them, they will be more likely to find ways to overcome academic challenges. Even if it is mentioned on the syllabus, you may wish to model to students how they may connect with relevant academic supports across the University of Toronto – including writing centres and learning strategists. By doing this, students may be more likely to responsibly use any permitted generative AI tools.  

Example: Demonstration of Microsoft Copilot 

Currently, Microsoft Copilot is the recommended generative AI tool to use at U of T. When a user signs in using University credentials, Microsoft Copilot conforms to U of T’s privacy and security standards (i.e., does not share any data with Microsoft or any other company). 

To encourage students to use Copilot responsibly, you may provide a step-by-step demonstration in the first class: 

  • Step 1: Demonstrate the full sign-in process in protected mode, and how to verify that you are in protected mode. 
  • Step 2: Demonstrate how to prompt in Copilot, using examples that are relevant to course assessments and activities. 
  • Step 3: Provide examples of how to critically analyze prompt output. 
  • Step 4: Create space for students to collaborate and experiment using the tool. 
  • Step 5: End the demonstration by specifying at what stage(s) in assignments and activities students will be permitted to use Copilot.  

For information on other approved tools, please visit CTSI’s Generative AI Tools. 

Communicate with TAs about your Expectations regarding their GenAI Use

When connecting with your course teaching assistants (TAs) during the initial course meeting, it is good practice to communicate whether and how your course will engage or limit the use of generative AI. To clarify TAs’ roles and responsibilities, as well as your expectations, you may wish to consider discussing one or more of the following topics: 

  • Grading and rubrics. Communicate with teaching assistants whether and how grading rubrics will be adjusted to account for generative AI capacities, so that human skills are being prioritized for evaluation. Hear from TAs about their prior experiences in grading assessments since generative AI tools have become more available; their insights may be a useful resource as you consider the grading protocol of your course. 
  • Student communication plan. Provide suggestions to teaching assistants on how they should communicate with students about course generative AI policies and recommended practices. 
  • Tools training. If generative AI tools are part of the course, provide training to teaching assistants about how to use the tools. Consider showing them how they may model the responsible use of generative AI tools, if their contract involves student interaction in tutorials, labs, or office hours.  
  • Check-in plan. Discuss a plan for course check-ins, so that there is an open line of communication. By doing this, teaching assistants will have a clear idea of how they may raise any questions or concerns that come up regarding generative AI expectations, protocol, and policies. 

Example: Instructor-TA Team Meeting Plan

During the initial instructor-TA team meeting, the instructor and teaching assistants will review and sign the Description of Duties and Allocation of Hours (DDAH) forms. This is a good opportunity to clarify and discuss the responsibilities and communication protocol for each teaching assistant. 

The instructor will organize the conversation around the following questions: 

  • How should TAs handle student questions related to use of generative AI in the course? 
  • How do assessments address the potential for generative AI use? 
  • What is the protocol if a TA is concerned about a student using an unauthorized generative AI tool in an assessment? 
  • How can TAs encourage student participation in discussions related to generative AI and course policies? 
  • What potential challenges might arise due to the course size and generative AI use? How can TAs and the instructor collaborate to address these challenges effectively?

Week to Week

As you move through the weeks of the course, it is essential to maintain open communication with students about generative AI. This section will outline how to introduce assessments and integrate discussions about AI on a weekly basis, helping students understand the purpose of assessments and whether and how they are permitted to use AI tools. By checking in and communicating regularly with students, you can reduce the likelihood that they will seek out generative AI for (mis)information.  


As Assessments are Introduced, Outline their Rationale and Procedures 

Since generative AI is a new technology and its allowed uses will vary across courses, students will require clear guidelines, reiterated throughout the course. The following strategies can help ensure that your communication about assessments is accessible to all learners, aligning with a Universal Design for Learning (UDL) approach. Consider doing one or more of the following: 

  • Communicate the value of the assessments and what students can gain from completing them in relation to course-level learning outcomes. If generative AI tools play a role in completing the assessments, you may show students again how to engage with them and why. You may alternatively encourage your teaching assistants to take on that role, if they are running tutorials. 
  • Review assignment instructions in class and provide an opportunity for questions. Clearly explain the extent of allowed AI use for the assignment, and engage students in a conversation about why you chose to encourage or discourage the use of generative AI, and how generative AI may or may not help them with the assignment. 
  • Include an integrity statement for students to complete when submitting their assignments, affirming their adherence to the course’s Generative AI policy.
  • Share a “ready to submit” summary assignment sheet or checklist to guide and motivate students through the learning process, so that they know what to submit and when they are done (Bowen and Watson, 2024). Given the choice between harder and easier work, students will be more likely to use generative AI responsibly if they understand the value of the added discomfort. Consider addressing the following questions on the assignment sheet or checklist: 
    • Purpose: What skills or knowledge will I gain? How will I be able to use this? 
    • Task: What needs to be submitted? Is there a recommended process? When is this due? Where can I do this work and with whom? 
    • Criteria: What are the parts? How will I know what’s expected? 
    • Process: Most assignments require processes that are more obvious to faculty than to students; specify when and which AI might be a useful tool and how it may enhance learning.

Example: Assignment Checklist (Problem Set)

This assignment will take roughly 75 minutes. In order to complete the assignment, undertake the following sequence of steps:

  • 10 minutes: Read the chapter quickly and take notes on paper. 
  • 20 minutes: Try the first 15 problems on your own. Skip any problems that get you stuck. 
  • 10 minutes: Go back and work through the problems in detail. If you get confused, use Microsoft Copilot to get help.  
  • 5 minutes: Take a problem you are confident in and ask Microsoft Copilot for a solution. Whose answer is better? 
  • 15 minutes: Check your work and finish. 
  • 5 minutes: Rewrite your notes about the chapter. What have you learned? Make a mind map connecting the key concepts.  


Adapted from:
Teaching with AI (Bowen and Watson 2024, 195) 

Use GenAI to Support Student Engagement and AI Literacy

With intentional planning on the part of the instructor, generative AI may offer novel opportunities for personalized practice, tailored to students’ needs. 

While students may have used generative AI, they may not be familiar with best practices for using AI tools to enhance knowledge retention and critical thinking. If your course has exam components or introduces students to new conceptual material, consider adapting and sharing sample prompts from CTSI’s “AI Virtual Tutor – Effective Prompting Strategies” resource.

Following Universal Design for Learning (UDL), you may wish to offer multiple means of engagement; by varying forms of involvement and interaction, students may more likely be motivated to apply their knowledge. To extend or transform in-class engagement, you may consider using generative AI for one or more of the following:  

  • Learn through tutoring. You may use generative AI tools to engage students in metacognitive reflection, whereby they identify gaps in their knowledge, consider alternative perspectives, and establish connections within complex bodies of information. This may encourage students to self-regulate, sustain effort, be goal-directed, and monitor their progress in learning. 
  • Learn through simulations. You may wish to create AI-based scenarios to serve as controlled spaces for applying knowledge in a low-stakes context. In role playing, the student may assume the identity of someone else; in goal playing, the student maintains their identity while applying their knowledge and skills. In these spaces, the AI may play the role of mentor while also creating the narrative set-up. 
  • Learn through critique. AI can provide students with multiple “peers”, prompting the student to help the “AI student” understand class material. For instance, students can critique whether an AI-generated scenario applied a course concept correctly, thereby giving space to demonstrate their knowledge.  
  • Provide multiple examples and explanations. Generative AI tools can be used to generate a wide variety of examples related to a given topic, to model and problematize a thought process, and to offer alternative explanations. In all these cases, you may consider creating discussion space to critique the AI-generated output, thereby supporting students’ analytic skill development. 
  • Gather formative feedback on generative AI use in the class. It can be useful to collect mid-term feedback on the course, to gauge and respond to student experiences with generative AI. These evaluations may be used to make adjustments to the course that will affect the rest of the semester. For instance, you may ask students to submit exit-ticket responses at the end of a class, or one-minute papers, where students provide responses to class activities or assignments (see Angelo & Cross, 1993). 

Adapted from: Instructors as Innovators: A future-focused approach to new AI learning opportunities, with prompts,
Mollick and Mollick, 2024.

Example: Think-Pair-Copilot-Pair-Share

Think-Pair-Share (TPS) is a cooperative structure in which partners privately think about a question (or issue, situation, idea, etc.), then discuss their responses with one another. By incorporating generative AI into the activity, students will be exposed to additional perspectives to critically engage with.  

  • Think: Introduce the topic and encourage students to brainstorm as many ideas as possible, without the use of generative AI 
  • Pair: Have students pair up with a partner to share their thoughts 
  • Copilot: Ask students to individually conduct a search on Copilot to find more information on the topic, evaluating its output 
  • Pair: Have students pair up again with their partner to evaluate the examples and facts they found 
  • Share: Students share what they found with the whole class

Adapted from: Dillard, 2022

At the End of the Term

The end of the term is a valuable time to learn from students’ experiences, using those insights to guide how you will move forward with future iterations of the course. You may consider connecting with your students and TAs to reflect on how the course’s engagement with generative AI went. 


Collect Student Reflections and Feedback

In addition to collecting mid-term course feedback, you may also consider creating space for informal feedback and discussion at the end of the term. By doing this, you can directly learn from students and teaching assistants about how you may adjust the course for future teaching. 

To collect feedback and encourage reflection on generative AI’s application to teaching, you may consider one or more of the following: 

  • Reflective writing activity about generative AI. Incorporate an assignment or optional end-of-term activity, where students share their experiences and challenges using generative AI to support their learning in the course. To guide students’ reflection, consider offering prompts, such as: 
    • How did you use generative AI tools in the course? What were the benefits and challenges? 
    • What strategies did you employ to effectively use the output of these tools? 
    • What suggestions would you give to a student taking the same course next semester, so that they get the most out of using these tools? 
  • Reflective activity using generative AI. Moving a step further than a conventional reflective writing activity, consider inviting students to use generative AI to dialogue with their thoughts. For instance, you may provide prompts to students, so that they have a “conversation” with Copilot, where it asks progressive questions and provides feedback. From there, you may invite students to critique and expand on the summaries from Copilot, strengthening their AI literacy. 
  • Gather feedback from teaching assistants. Since teaching assistants often work directly with students, it will be valuable to gather their feedback from the semester on how course activities and assessments—including those that engaged generative AI—impacted student learning. Consider using anonymous feedback methods, so that TAs may comfortably share their concerns. If having a conversation, consider preparing guiding questions for the meeting about their experiences, strategies, and suggestions for improvement. 

Example: Dialoguing with Copilot for Course Reflection 

Activity learning outcome: Students will evaluate and reflect on the use of Microsoft Copilot throughout the entire semester to support their course-level learning goals. 

5 minutes: Personalize and insert one or more versions of the following prompt into Microsoft Copilot, so that you can have a conversation about your experiences and challenges in the course: 

You are a helpful mentor, guiding me, an undergraduate student in [insert course name] to reflect on benefits, challenges, and lingering questions from a semester-long course, where generative AI tools were used to encourage critical thinking and idea generation. As a helpful mentor, you will dialogue with me (the student), asking thoughtful questions to help me identify the benefits and challenges that I faced using generative AI as a learning tool, as well as any suggestions I have for the instructor and any future students taking the course. You will ask me one question at a time, and you will not ask a further question until I provide a response. You will also ask follow-up questions when my answers are unclear.  

20 minutes: Interact with Microsoft Copilot through a back-and-forth dialogue. Save a copy of the dialogue, to be submitted to the instructor.  

15 minutes: End the activity by writing a short reflection on the benefits, challenges, and lingering questions that come up, based on your use of generative AI as a learning tool for [insert course topic]. 

Reflect and Plan your Next Steps in Teaching

Student evaluations and informal feedback can serve as very useful resources for your long-term development as an instructor. By drawing on these responses, as well as your own observations, you can develop a plan for your next steps. 

There are many ways that you can structure your plan for refining your teaching with generative AI for future courses. As you organize informal feedback, formal evaluations, and your own observations, you may consider following the Universal Design for Learning (UDL) Plus-One Approach: What moments in the course were “pinch points,” where your incorporation of generative AI didn’t go the way you anticipated? By addressing these moments first, you will move towards reducing the most significant barriers and finding the most important areas of your course where you can address learning variability. You may consider reflecting on one or more of the following topics:  

  • Assessing student learning. Have there been any notable changes in student engagement or learning since incorporating generative AI into my teaching? 
  • Refining assessments. Which assessments worked well, and which should I focus on redesigning, so that students’ skill-building is prioritized? 
  • Evaluating teaching strategies. Which strategies have been most effective in helping students to critically and responsibly engage with course material and the use of generative AI in my discipline? 
  • Policies and guidelines. What modifications can I make, to further address ethical and academic integrity considerations? 
  • Share Ideas. How may I engage with colleagues in my discipline and across U of T? How may I share approaches that integrate best practices in assignment and assessment design in the context of using or avoiding the use of generative AI? 
  • Professional development. By leveraging generative AI, what new skills have I acquired, and what new areas have I identified?

Resource: CTSI Consultations 

As you conclude a course, reflecting on your experiences and student feedback can guide your next steps as an instructor. In addition to teaching dossier reviews, CTSI offers consultations to instructors in areas related to teaching and generative AI: 

  • Teaching strategies in the context of generative AI 
  • Enhancing student engagement through generative AI tools 
  • Course design, development, and review 
  • Revising learning outcomes to address AI literacy 
  • Using generative AI educational technology such as Microsoft Copilot for in-person, online, and hybrid classrooms 
  • Interpreting course evaluation data 
  • Research on pedagogical topics (Scholarship of Teaching and Learning or SoTL) related to generative AI and learning 