Instructor Profile: Dan Zingaro

How U of T instructors are incorporating generative AI into their teaching


Dan Zingaro, Associate Professor, Teaching Stream, and Associate Chair (CSC), Mathematics and Computational Sciences, UTM

Dan Zingaro has integrated generative AI into his teaching by encouraging students to use these tools to write computer programming code. This gives students the opportunity to collaborate and learn in different ways while also preparing them for work post-graduation. He is co-author (with Leo Porter, University of California San Diego) of Learn AI-Assisted Python Programming: With GitHub Copilot and ChatGPT.

Q: One of the basic elements of using a generative AI tool is prompt engineering. Why is it important to teach, and for instructors to also learn, effective prompt engineering habits?

A: Prompt engineering is all about writing a prompt in such a way as to elicit a helpful, accurate response from generative AI. Often, prompt engineering involves providing the generative AI with the steps it should carry out to solve the problem. Yes: we still have to direct the AI! And to do that, we still need to understand fundamental programming concepts, such as breaking large problems into smaller ones, understanding state, and writing our requests in ways that naturally map onto programming constructs.
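As a small, hypothetical sketch of what "providing the AI with steps" can look like in practice, the prompt below is written as a Python docstring, in the style one might use with GitHub Copilot; the function name, the steps, and the function body are invented for illustration, with the body standing in for the kind of code such a tool might generate.

def count_long_words(filename, min_length):
    """
    Count words longer than min_length in a text file.

    Steps for the AI to follow:
    1. Open the file and read its contents.
    2. Split the text into words on whitespace.
    3. Keep only the words whose length exceeds min_length.
    4. Return how many such words there are.
    """
    # A plausible completion a code-generation tool might produce:
    with open(filename) as f:
        text = f.read()
    words = text.split()
    long_words = [w for w in words if len(w) > min_length]
    return len(long_words)

Notice how the numbered steps in the docstring break the larger task into smaller pieces that each map naturally onto a programming construct (opening a file, splitting a string, filtering a list, counting).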

Prompt engineering is so important that some instructors are even adding “prompt problems” to their courses. In these problems, students are given pairs of inputs and outputs and are asked to write a prompt that yields the correct code from the generative AI. For example, if the input-output pairs are:

5, 10
20, 40
9, 18

then we’re likely trying to multiply by 2, and the student would need to write a prompt asking the AI to generate code that multiplies each input value by 2.
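A possible sketch of this prompt problem in Python: the student's prompt appears as a comment, and the function below it is one plausible completion. The wording and code are illustrative rather than output from any specific tool.

# Student's prompt (written as a comment or docstring for the AI):
# "Write a function that takes a number and returns that number multiplied by 2."

def double(value):
    """Return the input value multiplied by 2."""
    return value * 2

# Checking the generated code against the given input-output pairs:
assert double(5) == 10
assert double(20) == 40
assert double(9) == 18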

Q: Do you specify what generative AI tool students use?

A: I don’t, but I do recommend that students use a generative AI tool called GitHub Copilot. It’s like ChatGPT, but tuned specifically for working with and generating code. And it works as a plugin to software that our students already use, so it fits well with their programming workflow.

Q: Has the process of conducting code walkthroughs changed with generative AI used for programming? Has the conversation or critique in this activity changed in any way?

A: My current thinking is that our students are very unlikely these days to be writing code from scratch when they enter the workforce. Generative AI writes code. In my course CSC209H5 (Software Tools and Systems Programming), I allow and encourage students to use generative AI to help them write their code. In the past, I would have graded the code submitted by students. Students would have also worked individually. Now, I am less interested in the specific code and more interested in: do students understand the code that was generated? Do they understand how the code fits in with other code provided by me? Do they understand any performance considerations, security concerns, or biases in the code? Do they collaborate well? (Collaboration in programming has always been important. It is perhaps even more important now in the era of generative AI where the focus shifts from writing code toward orchestrating code.)

For that reason, this semester I am not directly grading the code. Rather, students work in groups to produce a video showcasing their understanding of their code. The video has student groups testing the code, walking through and explaining the most important portions of code, identifying any remaining errors in their code, critically examining their code and alternatives that they considered, and discussing how they collaborated to produce their work.

Q: Is there anything generative AI and student (or assignment) related that you would like to try next?

A: I would like to ask students to obtain multiple solutions from the generative AI and critique them. A lot of learning about programming happens by studying other people’s code (just like a lot of learning about writing happens by studying what others have written!). Now, for the first time, we have this tool that can give us as many code solutions as we want. Some will be correct, some will be incorrect, some will have good style, some will have bad style… this code diversity is likely a learning gold mine and I look forward to helping students benefit from this new learning opportunity.
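As a hypothetical illustration of this kind of code diversity, here are two made-up solutions to the same small task (computing an average). Both run, but they differ in style and in how they handle an empty list, which is exactly the kind of difference students could critique.

# Solution A: concise and idiomatic, but raises ZeroDivisionError on an empty list.
def average_a(numbers):
    return sum(numbers) / len(numbers)

# Solution B: more verbose, accumulates manually, and silently returns 0 for an
# empty list, a design choice students could question during a critique.
def average_b(numbers):
    total = 0
    count = 0
    for n in numbers:
        total = total + n
        count = count + 1
    if count == 0:
        return 0
    return total / count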

A growing number of generative AI tools are available, and their capabilities are evolving rapidly. Currently, Microsoft Copilot is the recommended generative AI tool to use at U of T. When a user signs in using University credentials, Microsoft Copilot conforms to U of T’s privacy and security standards (i.e., it does not share any data with Microsoft or any other company). In addition, Contact North AI Tutor Pro and Contact North AI Teacher’s Assistant Pro conform to U of T’s privacy and security standards. Please be aware that any other generative AI tool that has not been vetted for privacy or copyright concerns should be used with caution within a U of T course or organization. If you would like to learn more about the tools available in U of T’s academic toolbox, please visit ARC’s New Tools.
