Frequently Asked Questions: ChatGPT and generative AI in teaching and learning at the University of Waterloo

This FAQ will evolve as ChatGPT and other generative AI tools evolve (Last Updated: July 25, 2023)

In November 2022, OpenAI released a free preview of ChatGPT, a highly advanced chatbot that uses predictive text to string words together in a way that mimics human conversational patterns. While generative artificial intelligence (AI) tools are not new, the quality of responses generated by ChatGPT surpasses that of prior AI-based writing tools, sparking debate in the higher education community about its use in teaching and learning. Many questions centre on academic integrity and the effective use of such tools. At Waterloo, we know that innovations in technology often produce innovations in teaching and learning, and this new development is no exception.

For support with ChatGPT and other generative AI:

  • Course and assignment redesign: CTE Liaisons

  • Online course and assignment redesign: CEL’s Agile Development Team

  • Designing writing assignments in the context of ChatGPT and similar AI: Writing and Communication Centre; Centre for Teaching Excellence

  • Strategies to encourage students to work with integrity: Office of Academic Integrity

  • Citational practices for ChatGPT and similar technologies: Library

What are ChatGPT and generative AI?

ChatGPT and similar technologies are artificial intelligence-backed chatbots that can mimic human conversations and writing (O’Brien, 2023). ChatGPT is a Large Language Model that learns the statistical structure of language, such as patterns of word usage, to generate answers based on a probability distribution over word sequences (Ramponi, 2022).

As ChatGPT composes an answer, it determines the most likely next word or sequence of words, based on the training it has had to date. These tools can be used for a variety of tasks, including drafting emails or blog posts, composing essays, and even generating, debugging, and documenting code. This technology is particularly powerful because it can mimic writing or coding styles relatively effectively, making it flexible and widely applicable.
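
As a rough illustration of this next-word prediction idea, the short Python sketch below chooses the next word for a prompt by sampling from a small, hand-made probability table. The prompt, candidate words, and probabilities are invented for this example only; ChatGPT itself computes probabilities over tens of thousands of tokens using a neural network trained on vast amounts of text, not a lookup table like this.

```python
import random

# Toy next-word probabilities for an invented prompt.
# (Illustration only: a real model computes probabilities over tens of
# thousands of tokens with a neural network trained on large text corpora.)
next_word_probs = {
    "assignment": 0.55,
    "essays": 0.25,
    "code": 0.15,
    "report": 0.05,
}

def pick_next_word(probs):
    """Sample one candidate word, weighted by its probability."""
    words = list(probs.keys())
    weights = list(probs.values())
    return random.choices(words, weights=weights, k=1)[0]

prompt = "The students submitted their"
print(prompt, pick_next_word(next_word_probs))
```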

Some see this technology as the next generation of word processing tools, like predictive text, grammar checkers, or a kind of “supercharged Clippy” (the old MS Word “help” tool) (Bruff, 2022). Indeed, Microsoft is now building ChatGPT into MS Teams, Word, and PowerPoint. Rival companies will be releasing their own generative AI chatbots imminently, and tools for nearly any purpose are already available or in development. 

What are some limitations of this tool?

ChatGPT provides a response based on a prompt. The current public release of ChatGPT uses information found freely on the internet (up to 2021) to generate responses to prompts, but it is not a search engine. It “learns” through access to information curated by researchers and developers, and through ongoing input from its users, which may contain errors, misconceptions, or bias (Wu, 2022). Of course, this field is developing rapidly, and at the time of publishing these guidelines, similar tools are being adopted by search engines like Google. Verification and triangulation of information are important when using ChatGPT, as it is unable to distinguish between important and unimportant errors (Ramponi, 2022). While it can, if asked, provide references for the content it generates, these will often appear plausible but be incorrect (Montti, 2022).

What are ethical considerations with ChatGPT? Do ChatGPT responses contain bias? 

The responses generated by ChatGPT can be incorrect and may include bias (Wu, 2022). ChatGPT responses can reflect bias inherent in the large body of free internet text it was trained on, as well as the potential bias of the people who reviewed and selected the text included in that training data. One writing specialist tested ChatGPT’s response to a prompt and found that it reinforced rather than complicated a simplistic ideological stance, unlike Wikipedia or even Google searches (Mills, 2023). The specific phrasing of prompts affects how the AI responds, including the degree of bias in the response (Squires, 2022). Other AI chatbots, such as DeepMind’s Sparrow and Meta’s Galactica, were unsuccessful in their releases for this very reason (Wu, 2022).

What are the privacy and security concerns with ChatGPT? 

ChatGPT evolves through its interactions with humans in the form of prompts and chat conversations. Users must provide a telephone number and email address to access ChatGPT. If users delete their accounts, OpenAI retains all the prompts they have entered; as of February 2023, ChatGPT had hundreds of millions of users. As of February 2023, ChatGPT has not undergone an Information Risk Assessment at the University of Waterloo. OpenAI’s commonly asked questions page has more information on the company’s own policies.

Is using ChatGPT or other AI tools considered cheating? 

The University of Waterloo expects that students will do their own learning, and not fabricate, falsify, or otherwise pass off others’ work as their own. There are, however, ways to use many tools ethically for learning. Using ChatGPT (or similar tools that generate text, code, or visual images) for content generation and submitting it as one’s own original work is a violation of the University of Waterloo’s Student Discipline Policy, but other uses of these tools may be explicitly allowed by the assignment guidelines, course outline, or otherwise specifically permitted by the instructor. Instructors should clearly communicate their expectations for use (or not) of generative AI tools in the course syllabus and assignment instructions.

Are there syllabus statements on student use of AI-related tools?

At this time, the University does not have standard boilerplate text for use in syllabi. Course outlines must include the current standard statements on academic integrity and Turnitin. Faculty are asked to explain in course outlines and assignment guidelines what is and is not permitted, just as we do for collaboration or use of secondary sources. 

Is there a detection tool that can determine if ChatGPT is used by students to work on assignments?  

There are many tools emerging to detect ChatGPT use. However, controlling the use of AI writing through surveillance or detection technology is not recommended; AI will continue to learn and, if asked, will itself suggest ways to avoid the very features that detection tools rely on. The text-matching software we already use, Turnitin, added a new AI detection tool to its suite in early 2023. For the 2023 calendar year, beginning midway through the Winter term, we have turned on Turnitin’s new AI detection tool and are evaluating its accuracy. The usual Turnitin syllabus language is deemed to cover any originality reports generated this way as well.

How can I reduce the risk of student cheating with AI? Are there types of assessments that can ensure students complete their own work and maximize academic integrity? 

The Centre for Teaching Excellence has collaborated with other academic support units to develop resources for instructors. (link) 

Evidence-based assignments and assessments can help encourage students to complete their own work and make unapproved AI chatbot use easier to identify. When adopting any approach (e.g., visual mapping or oral presentation), keep accessibility and inclusion at the forefront of redesign. Staff at CTE, CEL (for online courses), Library, and WCC are available to consult on assessment redesign.  

Inspiration for assignment redesign: 

  • Research-based assignments that require scholarly resources cited accurately, especially resources found behind a paywall through Library subscriptions 

  • Assignments that show how a work evolved over time (e.g., requiring the submission of scaffolded assignment components, or drafts, or tracking changes) 

  • Auditory or audio-visual presentations (e.g., live or asynchronous)

  • Use of social annotation tools, like the university-supported Perusall, that require students to respond to content in concert with their classmates

  • Assignments or assessments that require content to be produced in multiple and accessible modes (e.g., create a visual or infographic and then write about it) 

  • Assignments where students are asked to include real world or personal examples, or demonstrate how the answer ties with course content or a case study discussed in class 

  • Assignments that require metacognitive skills like reflection on the content and process before, during, and after submission 

  • Assignments that are about very recent news or developments in a field

How can AI tools be used effectively for teaching and learning, and how do I start to talk about them with my students?

There may be opportunities to leverage AI to enhance teaching and learning, either by teaching about AI or teaching with AI tools. Introducing AI through discipline-specific classroom conversations can encourage students to think critically about AI and its potential societal implications, including issues such as accuracy, privacy and security, and bias. Activities or assessments that require learners to analyze, improve, or critically evaluate text or code generated by chatbots can help develop students’ higher-order skills.

Examples: 

  • Provide a prompt and the resulting text, and ask students to improve on it using track changes 

  • Ask students to critique and grade a properly-referenced prompt and text, and then write an improved text.  

  • Generate two different texts and ask students to explain the shortcomings of each or to combine them in some way using track changes

  • Have students use AI to document code and then test the documentation’s accuracy with a peer

  • Facilitate authentic learning: use AI to create personalized case studies for student groups, tailored to student interests. 

  • Use ChatGPT to provide a starting summary of a debate or issue as a springboard for research and discussion – identify what it is missing.  

For more ideas, review the following resources:

  • Mollick, E. R., & Mollick, L. (2022). New modes of learning enabled by AI Chatbots: Three methods and assignments. SSRN Electronic Journal. https://doi.org/10.2139/ssrn.4300783 

Can I require students to use ChatGPT for assessment? 

There are many possible uses of generative AI for assessment; however, ChatGPT is not fully accessible globally and now has paid subscription models. If it is used in assessments or activities, plan alternatives. One option is to require students to work with ChatGPT content provided by the instructor, instead of asking all students to sign up for this particular tool.

How should students cite AI tools if they are permitted for an assignment?  

AI such as ChatGPT generates text from other sources; it uses others’ ideas without proper attribution. If using permitted chatbots in teaching and learning, students should always check and verify information and cite the original sources instead of the AI. This kind of checking skill is, of course, a valuable outcome for university-level learning. How to cite generative AI text and/or the original prompts is an emerging question for commonly used style guides; while some academic journals and associations are banning references to AI as a “co-author,” the American Psychological Association (APA) and the Modern Language Association (MLA) are beginning to develop citation norms and guidelines. More information is available from the University of Waterloo Library, Writing and Communication Centre, and Centre for Teaching Excellence.

Can I use ChatGPT to create instructional materials (e.g., lesson plans, quiz questions) for my students, or to grade tests or assignments?

AI tools like ChatGPT have functionality that may create efficiencies for planning instruction and producing content. This will, in fact, become inevitable as generative AI is integrated into the MS Office suite. There are examples of ChatGPT being used to help ideate, plan, and draft instructional materials such as topic lists, outlines, lesson plans, content, and questions. The University of Sydney has prepared advice about how AI can be used meaningfully by teachers and students in 2023.
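
As one optional illustration, the Python sketch below shows how an instructor might draft quiz questions programmatically with OpenAI’s API, using the openai Python library interface available in mid-2023. The model name, prompt wording, and placeholder API key are assumptions for illustration; the same result can be obtained by typing the prompt into the ChatGPT web interface, and any generated questions should be reviewed, verified, and edited before use.

```python
# Illustrative sketch only: drafting quiz questions with OpenAI's API
# (openai Python library, pre-1.0 interface available in mid-2023).
# The model name, prompt, and API key below are placeholders/assumptions.
import openai

openai.api_key = "YOUR_API_KEY"  # replace with your own key

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",  # assumed model; substitute the model you have access to
    messages=[
        {"role": "system", "content": "You are a helpful teaching assistant."},
        {"role": "user", "content": (
            "Draft three multiple-choice questions, with answer keys, "
            "on the topic of academic integrity for a first-year course."
        )},
    ],
)

# Print the drafted questions; review and edit them before sharing with students.
print(response["choices"][0]["message"]["content"])
```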

What considerations are there for AI tools for graduate students working on their theses?  

The use of generative AI that is then submitted as though it were one’s own original work would violate academic integrity policy. In addition, please take note of the Waterloo policy on the editing of graduate theses: https://uwaterloo.ca/graduate-studies-postdoctoral-affairs/current-students/thesis/thesis-editing

How are workplaces using these tools, and how do we prepare our students to be technologically adept with them?

Over 70% of our students engage in work-integrated learning (including co-op). As our students partake in work-integrated learning, they aim to link the skills they are learning in the classroom to those they apply while working with their employer or community partner. To help prepare students for the future of work, and to develop human skills such as those in the Future Ready Talent Framework, Waterloo faculty, TAs, and students should incorporate the wise and ethical use of new technologies like AI and chatbots into our practices. As educators, we can demonstrate appropriate use in our teaching and assessment practices. We can model good digital citizenship and stewardship. In particular, we should build awareness of the potential for bias and misinformation when using these technologies. Ultimately, we should be working with this technology, teaching students how to ask questions and refine answers to leverage the power of these tools for better outcomes. In this way we foster students’ technological agility so they can grasp new technologies with ease, apply technology to better achieve results, and advocate for the use of innovative technologies.

Who on campus is using these tools in teaching and learning? 

CTE, Teaching Fellows, and others are gathering use cases and will host sessions to share information between instructors. 

How can I get help with ChatGPT, AI tools, or assessment redesign that considers AI? 

Instructors are encouraged to contact Academic Support Units for support with AI: 

  • For support with teaching and learning, contact the Centre for Teaching Excellence (via Faculty-based liaisons)

  • For support with online courses, contact the Centre for Extended Learning (via the Agile Development Team)

  • For support with citations for AI-generated content, contact the Library 

  • For support with best practice tips for using generative AI in written assignments, contact the Writing and Communication Centre

Where can I learn more about ChatGPT and generative AI tools in education?  

The University of Waterloo is working on building a community knowledge base that will evolve with AI advancement. In the meantime, follow recommended resources below to monitor evolving information about ChatGPT in Higher Education.  

References 

Bruff, D. (2022, December 20). Three things to know about AI tools and teaching. Agile Learning.

Mills, A. (2023). AI text generators and teaching writing: Starting points for inquiry. The WAC Clearinghouse.

O’Brien, M. (2023, February 1). Google has the next move as Microsoft embraces OpenAI buzz. Britannica.

Ramponi, M. (2022, December 23). How ChatGPT actually works. AssemblyAI.

Squires, A. (2023, January). Developing topics with Chat GPT [PowerPoint slides]. Avila University Writing Center.

Wu, G. (2022, December 22). 5 big problems with OpenAI’s ChatGPT. MakeUseOf.