Around 30% of Canadians rely on AI for personal and work use, from proofreading work emails to planning trips. But do we truly understand how AI works?
In today’s world, AI may shape our everyday outcomes and choices more often than decisions made by humans. Even the most essential sectors are adopting AI. Self-driving cars, a game-changer in transportation, use AI to sense their surroundings and control their movements. Some clinics are employing object detection and recognition models, a form of AI, to detect cancerous tumours in X-ray scans, often faster and more accurately than human doctors.
Yet most of the public is unaware of how AI makes decisions, which can lead to misuse or mistrust. One promising solution is Explainable AI (XAI) visualization, which illustrates the inner workings and performance of AI models visually. Unfortunately, XAI visualizations are typically geared towards experienced AI users, such as machine learning engineers or model developers.
What if we could explain AI through video games? That is the vision of Yuzhe You (MMath '23), a second-year PhD student at the University of Waterloo’s David R. Cheriton School of Computer Science. Inspired by research on the positive learning outcomes of gamification, Yuzhe is leveraging interactive visualizations to make XAI more meaningful and accessible to non-technical users.
Read the full story from Computer Science to learn more.