Instructor Profile: Steve Easterbrook

How U of T instructors are incorporating generative AI into their teaching

Steve Easterbrook, Director, School of the Environment

Course details
Title and Code: ENV194: Topics in Climate Change: Confronting the Climate Crisis
Number of students: 24
Online/in-person/hybrid: in-person

For the Winter 2024 term, Steve Easterbrook, Director of the School of the Environment at the University of Toronto, wrote a policy for the use of generative AI in his Confronting the Climate Crisis course. Generative AI tools are not banned, but students must explain which tools they used and how they used them. They are also asked to reflect on their experience using the tools for the assignment. Professor Easterbrook has made his policy available (under a Creative Commons license) for other instructors to adapt and use.

Syllabus for Confronting the Climate Crisis (pdf)

Q: You haven’t banned the use of generative AI in your course, but rather ask your students to acknowledge which tool they used and how they used it to complete an assignment. You’ve also asked that they reflect on their experience using generative AI. Why include this final component?

I think the reality is that students will use these tools anyway. I’m very doubtful that AI-detection software will work with any accuracy, and I’m concerned about students being falsely accused, especially those who are ESL or from non-academic backgrounds, who have been taught to write in particular ways that might look more like what the AI tools produce. Instead of trying to police it, I want to encourage the students to be honest and reflective about the tools they use, and how these tools affect their learning experience.

In many ways, my policy reflects what we do around plagiarism: it’s perfectly okay to use direct quotes from sources (including quite lengthy quotes), as long as the student clearly marks the quoted text with quotation marks and cites the source. When we do that in academic writing, we’re assessing the student’s ability to use sources to build an argument; we’re looking at the thinking they have added in how they have marshalled the quoted texts. I think we need to develop and encourage similar academic practices around AI tools, but to do so we need to teach students to use them thoughtfully, and to show their work!

Q: How have your students reacted? Have you reconsidered any aspects of this policy as a result of feedback?

We haven’t discussed the policy explicitly in class, so I don’t have much direct feedback on it (although many colleagues have told me they really like the way I’ve set up the policy). However, the topic of AI tools came up in a class discussion on how to advocate for societal change and persuade people (the course is called “Confronting the Climate Crisis”). When I asked how many of the students use AI tools to generate text, around half raised their hands, and a handful of students were clearly very enthusiastic users of these tools and very knowledgeable about what the tools can do. This is clearly our new reality!

Q: Have you heard from other instructors interested in using this policy in their courses (you released it under a Creative Commons license to allow for use and modification)?

I’ve heard lots of enthusiasm for the policy from colleagues online (I posted it on a couple of social media channels), but I haven’t kept track of whether anyone else has adopted it. I expect I’m more likely to hear from people next academic year, when instructors start putting their new syllabi together.
