This information will evolve as ChatGPT and other generative AI tools evolve (Last Update: Feb 10, 2023)
In November 2022, OpenAI released a free preview of ChatGPT, a highly advanced chatbot that uses predictive text to string words together in a way that emulates human conversational patterns. While generative Artificial Intelligence (AI) tools are not new, the quality of responses generated by ChatGPT surpasses prior AI-based writing tools, sparking debate in the higher education community about its use in teaching and learning. Many questions centre on academic integrity and effective uses of such tools. At Waterloo we know that innovations in technology often produce innovations in teaching and learning, and this new development is no exception.
For support with ChatGPT and other generative AI:
- Course and assignment redesign: Centre for Teaching Excellence
- Online course and assignment redesign: Centre for Extended Learning
- Designing writing assignments in the context of ChatGPT and similar AI: Writing and Communication Centre
- Strategies to encourage students to work with integrity
- Citational practices for ChatGPT and similar technologies: Library
What is ChatGPT and generative AI?
ChatGPT and similar technologies are artificial intelligence-backed chatbots that can mimic human conversations and writing (O’Brien, 2023). ChatGPT is a Large Language Model that learns the statistical structure of language, such as patterns of word usage, to generate answers based on a probability distribution over word sequences (Ramponi, 2022). As ChatGPT composes an answer, it determines the most likely word or sequence that should come next, based on the training it has had to date. These tools can be used for a variety of tasks including drafting emails or blog posts, composing essays, and even generating, debugging, and documenting code. This technology is particularly powerful because it can mimic writing or coding styles relatively effectively, making it flexible and widely applicable. Some see this technology as the next generation of word processing tools, like predictive text, grammar checkers, or a kind of “supercharged Clippy” (the old MS Word “help” tool) (Bruff, 2022). Indeed, Microsoft is now building ChatGPT into MS Teams, Word, and PowerPoint. Rival companies will be releasing their own generative AI chatbots imminently, and tools for nearly any purpose are already available or in development.
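The next-word mechanism described above can be sketched with a toy bigram model. This is a deliberate simplification with an invented corpus: ChatGPT uses a large neural network over tokens rather than word-pair counts, but the core idea of choosing the most likely continuation from a probability distribution is the same.

```python
from collections import Counter, defaultdict

# A tiny, made-up training corpus (hypothetical example).
corpus = "the cat sat on the mat and the cat slept on the mat".split()

# Count how often each word follows each other word (a bigram model).
following = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    following[current][nxt] += 1

def next_word_distribution(word):
    """Probability distribution over the words that may come next."""
    counts = following[word]
    total = sum(counts.values())
    return {w: c / total for w, c in counts.items()}

def generate(start, length=5):
    """Repeatedly append the most likely next word, as a language model does."""
    words = [start]
    for _ in range(length):
        dist = next_word_distribution(words[-1])
        if not dist:
            break  # no observed continuation for this word
        words.append(max(dist, key=dist.get))
    return " ".join(words)

print(next_word_distribution("the"))  # each candidate word with its probability
print(generate("the"))
```

A real model works over billions of parameters and samples from the distribution rather than always taking the maximum, which is why the same prompt can yield different responses.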
What are some limitations of ChatGPT and generative AI?
ChatGPT provides a response based on a prompt. The current public release of ChatGPT uses information found freely on the internet (up to 2021) to generate responses to prompts, but it is not a search engine. It “learns” through access to information curated by researchers and developers, and through ongoing input from its users, all of which may contain errors, misconceptions, or bias (Wu, 2022). Of course, this field is developing rapidly, and at the time of publishing these guidelines, similar tools are being adopted by search engines like Google. Verification and triangulation of information is important when using ChatGPT, as it is unable to distinguish between important and unimportant errors (Ramponi, 2022). While it can, if asked, provide references for content, these will often appear plausible but be incorrect (Montti, 2022).
What are ethical considerations with ChatGPT? Do ChatGPT responses contain bias?
The responses generated by ChatGPT can be incorrect and may include bias (Wu, 2022). Its responses can reflect bias inherent in the large corpus of freely available internet text it was trained on, as well as the potential bias of those who reviewed and selected that text. One writing specialist tested ChatGPT’s response to a prompt and found that it reinforced rather than complicated a simplistic ideological stance – unlike Wikipedia or even Google searches (Mills, 2023). The specific phrasing of prompts impacts how the AI responds, including the degree of bias in the response (Squires, 2023). Other similar AI chatbots, such as DeepMind’s Sparrow and Meta’s Galactica, had unsuccessful releases for this very reason (Wu, 2022).
What are the privacy and security concerns with ChatGPT?
ChatGPT evolves through its interactions with humans, in the form of prompts and chat interactions. Users (numbering in the hundreds of millions as of February 2023) must provide a telephone number and email address to access ChatGPT, and even if they delete their accounts, OpenAI retains the prompts they have entered. As of February 2023, ChatGPT has not undergone an Information Risk Assessment at the University of Waterloo. OpenAI’s commonly asked questions page has more information on the company’s own policies.
Is using ChatGPT or other AI tools considered cheating?
The University of Waterloo expects that students will do their own learning, and not fabricate, falsify, or otherwise pass off others’ work as their own. There are, however, ways to use many tools ethically for learning. Using ChatGPT (or similar tools that generate text, code, or visual images) for content generation and submitting it as one’s own original work is a violation of the University of Waterloo’s Student Discipline policy, but other uses of these tools may be explicitly allowed by the assignment guidelines, course outline, or otherwise specifically permitted by the instructor. Instructors should clearly communicate their expectations for use (or not) of generative AI tools in the course syllabus and assignment instructions.
Is there a syllabus statement(s) on student use of AI-related tools?
At this time, the University does not have standard boilerplate text for use in syllabi. Course outlines must include the current standard statements on academic integrity and Turnitin. Faculty are asked to explain in course outlines and assignment guidelines what is and is not permitted, just as we do for collaboration or use of secondary sources.
Is there a detection tool that can determine if ChatGPT is used by students to work on assignments?
There are many tools emerging to detect ChatGPT use. However, controlling the use of AI writing through surveillance or detection technology is not recommended; AI tools continue to learn and, if asked, can help users evade the very features detectors rely on. The plagiarism detection software we use, Turnitin, has indicated plans to integrate AI detection into its software. AI detection software is not currently used or recommended at the University of Waterloo, nor is there currently a statement for using AI detection software in a course to monitor student AI use.
How can I reduce the risk of student cheating with AI? Are there types of assessments that can ensure students complete their own work and maximize academic integrity?
The Centre for Teaching Excellence has collaborated with other academic support units to develop resources for instructors.
How can AI tools be used effectively for teaching and learning, and how do I start to talk about them with my students?
The Centre for Teaching Excellence is in the process of developing resources to support teaching and learning.
Can I require students to use ChatGPT for assessment?
There are many possible uses of generative AI in assessment, but ChatGPT is not fully accessible globally and now has paid subscription tiers, so if you use it in assessments or activities, plan alternatives. One option is to have students work with ChatGPT content provided by the instructor, instead of asking all students to sign up for this particular tool.
How should students cite AI tools if they are permitted for an assignment?
AI, such as ChatGPT, generates text from other sources; it uses others’ ideas without proper attribution. If using permitted chatbots in teaching and learning, students should always check and verify information and cite the original sources instead of the AI. This kind of checking skill is, of course, a valuable outcome for university-level learning. How to cite generative AI text and/or the original prompts is an emerging question for commonly used style guides; while some academic journals and associations are banning references to AI as “co-author,” the American Psychological Association (APA) and Modern Language Association (MLA) are beginning to develop citation norms and guidelines. More information is available from the University of Waterloo Library, Writing and Communication Centre, and Centre for Teaching Excellence.
Can I use ChatGPT to create my instructional materials (e.g., lesson plans, quiz questions) for my students or grade tests or assignments?
AI tools like ChatGPT have functionality that may create efficiencies for planning instruction and producing content, and such use will only grow as these tools are integrated into the MS Office suite. There are examples of ChatGPT being used to help ideate, plan, and draft instructional materials such as topics, outlines, lesson plans, content, and questions. The University of Sydney has prepared advice about how AI can be used meaningfully by teachers and students in 2023.
What considerations are there for AI tools for graduate students working on their theses?
The use of generative AI that is then submitted as though it is one’s own original work would violate academic integrity policy. In addition, please take note of the Waterloo policy on editing of graduate theses.
How are workplaces using these tools and how do we prepare our students to be technologically adept with them?
Over 70% of our students engage in work-integrated learning (including co-op), and they aim to link the skills they are learning in the classroom to those they apply while working with their employer or community partner. To help prepare for the future of work, and for the human skills described in the future ready talent framework, Waterloo faculty, TAs, and students should incorporate the wise and ethical use of new technologies like AI and chatbots into our practices. As educators, we can demonstrate appropriate use in our teaching and assessment practices. We can model good digital citizenship and stewardship. In particular, we should build awareness of the potential for bias and misinformation when using these technologies. Ultimately, we should be working WITH this technology, teaching students how to ask questions and refine answers to leverage the power of these tools for better outcomes. In this way we foster students’ technological agility so they can grasp new technologies with ease, apply technology to better achieve results, and advocate for the use of innovative technologies.
Who on campus is using these tools in teaching and learning?
CTE, Teaching Fellows, and others are gathering use cases and will host sessions to share information between instructors.
How can I get help with ChatGPT, AI tools, or assessment redesign that considers AI?
Instructors are encouraged to contact Academic Support Units for support with AI:
For support with teaching and learning, contact the Centre for Teaching Excellence (contact via Faculty-based liaisons)
For support with online courses, contact the Centre for Extended Learning (contact via the Agile Development Team)
Where can I learn more about ChatGPT and generative AI tools in education?
The University of Waterloo is working on building a community knowledge base that will evolve with AI advancement. In the meantime, check the following recommended resources to monitor evolving information about ChatGPT in Higher Education:
Bruff, D. (2022, December 20). Three things to know about AI tools and teaching. Agile Learning.
Mills, A. (2023). AI Text Generators and Teaching Writing: Starting Points for Inquiry. The WAC Clearinghouse.
O’Brien, M. (2023, February 1). Google has the next move as Microsoft embraces OpenAI buzz. Britannica.
Ramponi, M. (2022, December 23). How ChatGPT actually works. AssemblyAI.
Squires, A. (2023, January). Developing Topics with Chat GPT. Avila University Writing Center.
Wu, G. (2022, December 22). 5 Big Problems With OpenAI's ChatGPT. MakeUseOf.