Tech adoption ‘moves at the speed of trust,’ Dean Wells says

Tuesday, October 28, 2025

This story by reporter Terry Pender first appeared in the Waterloo Region Record.

The federal minister of artificial intelligence and digital innovation likes what the University of Waterloo’s dean of engineering says about building trust and transparency in how Canada adopts AI.

So the minister, Evan Solomon, has appointed Dr. Mary Wells, dean of engineering at UW, to the federal AI task force that will write a federal policy on how Canadian industry deploys the transformative technology.

Wells is focused on building safe AI systems and public trust.

“The minister said something I really like. He said: ‘Tech moves at the speed of innovation, adoption moves at the speed of trust,’” said Wells. “I really like the way he phrased that, and I thought it was very true. So, adoption and trust will go hand-in-hand.”

Wells was appointed to the 26-member task force, in part, because of UW’s TRuST network, created in response to the post-pandemic rise of anti-vaxxers and disinformation about medical science. TRuST, which stands for Trust in Research Undertaken in Science and Technology, includes public lectures on subjects like the safety of genetically modified crops and disinformation on social media.

The task force members and interested individuals have until Oct. 31 to send in written submissions.

“They have already received more than 5,000 submissions,” said Wells. “So, it is clearly something the public is engaged in.”

The minister provided task force members with five pages of guiding questions to help focus their work. The questions cover research and talent; accelerating adoption in industry and government; commercialization; scaling champions and attracting investments; building safe AI systems and public trust in AI; building enabling infrastructure; education and skills; and security.

One idea is to have a closed-off place where new AI technologies can be safely tested before they are made publicly available — call it an innovation sandbox, says Wells. 

“I think that is a very concrete thing we can do that would lead to a lot of innovation in this country,” said Wells.

How AI gets integrated in the workflows of the construction sector will be different from the ways AI is deployed in finance and health care.

“So we can’t just do a broad-based approach, we have to do a very specific approach, sector by sector,” said Wells.

If the people using the technology are included in developing how it is used, that will increase transparency and trust. That is the approach taken by Edith Law, the first Google Research Chair on the Future of Work and Learning and the executive director at UW’s Future of Work Institute.

“Everybody cares about this, everybody wants Canada to be a leader, everybody wants Canada to do the right thing for the world, and we are a leader in trust and technology in many areas,” said Wells.

“So, I think we can be a leader in the world, in this area.”

The 26-member AI Strategy Task Force comes at a time when the technology makes international news and generates controversy almost every day. Meta, the owner of Facebook, Instagram and WhatsApp, announced a $27-billion campus of data centres that will cover 1,700 acres of land. The electricity and water consumption needed for AI data centres will increase greenhouse gas emissions and worsen the climate crisis, say critics.

University of Toronto computer scientist Geoffrey Hinton, who is called the “Godfather of AI,” won the Nobel Prize in Physics in 2024. Hinton used the spotlight to call for more research and caution, saying AIs could become smarter than people and turn against humanity. And nobody knows how to prevent that, he said. Hinton had earlier resigned from Google so he could speak publicly about the dangers of AI.

Marcel O’Gorman, who heads the Critical Media Lab at the University of Waterloo, is concerned the AI task force is too focused on adoption and economic growth. There is not enough attention on the environmental and ethical issues, he says.

The task force was appointed in September, submissions close at the end of October and the report is due at year’s end. That’s far too fast, says O’Gorman. He created a post for students, instructors, researchers and administrators that describes the negative impacts of generative AI on teaching and learning, especially in the humanities.

They can use generative AI all they want in microbiology, computer science, chemistry and engineering, he says, but it’s not needed, and is harmful, in English, history and political science.

“We didn’t ask for this, this is a serious disruption of our educational values, and we want to mark this moment, let it be known this is problematic for us, and we are not going to blindly adopt generative AI,” said O’Gorman.