Dean Wells calls out the GenAI risks to women

Tuesday, October 1, 2024

This opinion piece by the University of Waterloo’s Dr. Mary Wells, dean of Engineering; Dr. Lai-Tze Fan, Canada Research Chair in Technology and Social Change; and Dr. Ashley Mehlenbacher, Canada Research Chair in Science, Health, and Technology Communication, appeared in the Toronto Star.

It’s been more than a month since U.S. Vice President Kamala Harris launched her bid for the presidency, and already AI-generated disinformation is working to smear her name. She’s not alone.

Earlier this year, Italian Prime Minister Giorgia Meloni was also targeted with AI disinformation. Even Taylor Swift hasn’t been immune: her likeness has been used to create and share non-consensual deepfake pornography on X.

Certainly, generative AI (GenAI) technologies — which can produce text, images and video — promise new benefits to society, such as automating repetitive and time-consuming tasks. But it’s becoming clear that GenAI products are being released without adequately anticipating their potential misuse, such as impersonation, deepfake pornography and the compromising of privacy and security.

Such downsides will have an impact on everyone but are poised to disproportionately affect one group in particular: women. It’s a problem that must be addressed before we move forward.

Women’s fears about technology are real and empirically supported. Consider deepfakes, where women’s faces are transposed onto others’ bodies, often for use in pornographic material.

It’s now far easier to create such images with GenAI, and as a result the phenomenon has only become more prevalent, increasing by a whopping 3,000 per cent in 2023. While it is often associated with images of celebrities, most of the women targeted in pornographic deepfakes aren’t public figures. The traumatic impact on those whose images are spread is hard to overstate.

Our legal and social systems have yet to catch up. An investigation by Wired magazine into deepfake nude generators showed that images of everyday women and girls appeared to have been taken from a range of sources and uploaded to websites, despite many of those images falling under laws already on the books to protect children.

It underscores the urgency in addressing GenAI risks and demonstrates that regulation, though important, will not be enough on its own.

Regulations already on the books, for example, didn’t help Scarlett Johansson. In May, her voice became the subject of copyright discussion after she was approached by OpenAI to use her vocal likeness. She turned down the opportunity, but the company’s chatbot was released with a strikingly similar-sounding voice. Its use was later paused.

The company has reported that another actress was hired. But laws in the U.S. protect the likeness of individuals through the “right of publicity,” meaning that even if another actor’s voice was used, Johansson may still have grounds to sue. The ease and low cost of creating likenesses highlight the ongoing risk that GenAI will upend our ideas of ownership over our own likeness, particularly for women.

Another area of serious concern is the impact of GenAI on specific jobs. Automated content generation has led to declines in demand for writing, translation and customer service jobs – positions that are more often held by women. Those who do find employment find that clients now have greater expectations of their labour.

Unfortunately, consideration of how digital technology can be misused usually comes much later in the development process.

AI developers instead need to integrate risk assessments that draw on women’s real-world experiences into GenAI design practices, helping to make these products trustworthy by design.

We must ensure that the voices shaping ideation and design from the earliest stages include not only technical experts, but also a diverse range of perspectives on how these technologies affect the lives of Canadians.

To make that happen, it is vital that women’s voices and experiences are included. The result will be more innovative, more responsive and more trustworthy technology.