Shaping AI: Why the humanities matter in tech innovation
Doctoral candidate considers how the humanities' rich tradition of storytelling and character exploration can be harnessed to shape the future of AI
By Kem-Laurin Lubin, MA ’99, PhD candidate, Faculty of Arts

Kem-Laurin Lubin is a PhD candidate in English at the University of Waterloo where she focuses on artificial intelligence (AI) biases and their ongoing influences on the lived experience of people, particularly the marginalized. She completed a BA Hons at the University of Ottawa and an MA in Rhetoric and Professional Writing at Waterloo. Kem is co-founder of the AI Global South Summit 2024. Learn more about her work in sustainable tech futures.
As a doctoral student teaching courses at Waterloo, I often encounter undergraduates with pressing questions about their future, many driven by deep anxieties about their career prospects. A common concern I hear is, 'What will I do if AI takes my job?'
While this worry is undeniably valid, it prompts me to consider an even deeper question: How can we, in the humanities, with our rich tradition of storytelling and character exploration, harness our knowledge to shape the future of artificial intelligence (AI)?
As the saying goes, humans are wired for stories, and we must embrace the idea that today's technology serves as the modern fireplace around which we gather to share them. Those of us passionate about literature and literary history understand the profound impact of character development. This connection naturally leads me to reflect on my work at the university and its broader implications. What might not be immediately apparent is that AI, part of today’s techno-culture, also endeavours to characterize humans within ideological frameworks, often with harmful consequences.
In my research, for example, I explore an ancient rhetorical concept called ethopoeia, known more colloquially as characterization. This literary and rhetorical device is the embryonic foundation of the judicial system we know today, serving as a technique for characterizing people – think of guilt and innocence as the outcomes of this seemingly insignificant device in practice. While we often think of it simply as characterization, ethopoeia is a powerful tool that has been shaping our world for centuries.
As an AI Rhetoric scholar in the Faculty of Arts, I propose that ethopoeia has evolved into what I term 'algorithmic ethopoeia.' With the outsourcing of many human activities to technology, we have also handed over the task of characterizing people to machines. Today, AI powers this process, enabling the characterization of humans through what I define as 'the mathematizing of human data for digital representation and characterization.' This is achieved through data collection, sorting, and targeting, all subjected to algorithmic procedures and decision-making protocols.
Simply put, this modern twist turns human data into digital profiles, created and controlled by algorithms. This ancient concept, now adapted for contemporary technology, can be examined through the field of Computational Rhetoric – a humanities field in conversation with mathematics and computer science. The field is well regarded and is led at the University of Waterloo by scholars like Prof. Randy A. Harris, who are training up-and-coming researchers. With its deep foundation in existing fields like language and literature, Computational Rhetoric is also vital for the humanities to engage in the ongoing and often complex discussions about technological disruptions. Unfortunately, the humanities are frequently sidelined in tech conversations, but we cannot remain silent. AI-powered systems often create fictional narratives about real people, and Computational Rhetoric seeks to unravel these digital fictions.
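To make that process a little more concrete, here is a minimal, hypothetical sketch in Python of how a handful of personal data points might be collapsed into a single score and then into a character label. Every field name, weight, and threshold below is invented for illustration; real systems are vastly more elaborate, but the basic move, turning a person into a number and a number into a judgment, is the same.

```python
# A purely illustrative sketch of "algorithmic ethopoeia": human data is
# collected, reduced to numbers, scored, and turned into a character label.
# Every field name, weight, and threshold here is hypothetical.
from dataclasses import dataclass


@dataclass
class Person:
    age: int
    postal_code: str
    purchase_count: int
    late_payments: int


# Hypothetical weights: someone decided which traits "count" and by how much.
WEIGHTS = {"purchase_count": 0.4, "late_payments": -1.5}
THRESHOLD = 2.0  # the cut-off encodes a value judgment, not a natural fact


def characterize(person: Person) -> str:
    """Collapse a person's data into a single score, then into a label."""
    score = (WEIGHTS["purchase_count"] * person.purchase_count
             + WEIGHTS["late_payments"] * person.late_payments)
    return "preferred customer" if score >= THRESHOLD else "high risk"


if __name__ == "__main__":
    # Fields like age and postal code are collected even when unused: data
    # gathering routinely outruns the decision actually being made.
    alex = Person(age=29, postal_code="N2L", purchase_count=8, late_payments=1)
    print(characterize(alex))  # prints the "digital character" assigned to alex
```

The point of the sketch is that the 'character' the system assigns follows entirely from design choices: which data to collect, how to weight it, and where to draw the line.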
In her famous book, Algorithms of Oppression, Safiya Noble highlights the biases built into how these systems characterize people. Voices like hers, sadly, are often dismissed as disruptive activism. I argue we need more such scholarly activism to help shape the world we value. Shoshana Zuboff’s book The Age of Surveillance Capitalism likewise uncovers the extensive reach of data exploitation. Intersecting with my own work, these authors underscore the importance of examining how AI characterizes and then profiles individuals. Understanding these processes allows us to advocate for more ethical AI systems that respect human dignity and promote equitable treatment.
Parallels between fiction and reality are growing more evident. With events such as the reversal of Roe v. Wade, we see real-life scenarios that echo Margaret Atwood’s The Handmaid's Tale. When it comes to surveillance, for example, Aldous Huxley's Brave New World and George Orwell's 1984 offer dystopian views of constant observation, highlighting the impact on freedom and individuality. These works serve as a warning about the journey from fiction to reality through modern surveillance systems, now powered by AI. They teach us about the complexities and consequences of digital characterizations.
It’s up to us, scholars and supporters, to advocate for perspectives from the humanities and social sciences to ensure technology serves humanity ethically and equitably. Whether it's philosophy, sociology, literature, or linguistics, our disciplines are essential. By integrating our deep understanding of human nature and character, we can help develop AI systems that are just and reflective of diverse human experiences.
Banner illustration generated by Midjourney