Seven professors from the University of Waterloo’s Faculty of Math have received the John R. Evans Leaders Fund (JELF), one of Canada’s top research grants.
In 1997, the federal government launched the Canada Foundation for Innovation (CFI) to spur world-class research and technology development in Canada. One of CFI’s core programs is JELF, which recognizes researchers who have demonstrated excellence in their fields and whose proposed projects are innovative, of high quality and meet international standards.
This year, over 400 researchers across Canada received the JELF grant, including Faculty of Math professors Dr. Wenhu Chen, Dr. Freda Shi, Dr. Yuntian Deng, Dr. Mohamed Hibat-Allah, Dr. Barbara Zemskova, Dr. Hong Zhang and Dr. Victor Zhong. These researchers’ grants totalled $859,073, allowing them to reach new heights in fields from cybersecurity to oceanography.
L-R: Wenhu Chen, Yuntian Deng, Mohamed Hibat-Allah, Freda Shi, Barbara Zemskova, Hong Zhang, Victor Zhong
“GPU accelerated numerical simulations for oceanographic applications” ($160,000)
Barbara Zemskova, assistant professor of Applied Mathematics
Zemskova’s research project focuses on improving climate model predictions, both by deepening our understanding of the physics of small-scale ocean processes and by exploiting machine learning techniques for faster, more efficient monitoring of changes in the ocean.
Zemskova would like to thank the Office of Research and Math Innovation Office for their “tremendous support” in navigating the application process.
“The success of my research largely hinges on access to a large amount of dedicated computational resources,” explains Zemskova. “Such resources are needed both for testing model architecture to create more accurate machine learning models and for running many numerical experiments as many parameters will be varied to have a more complete understanding of ocean processes. Winning this JELF award provides me with such resources to accomplish these ambitious research goals.”
“Exploring many-body physics with language models” ($160,000)
Mohamed Hibat-Allah, assistant professor of Applied Mathematics
Hibat-Allah's project explores the intersection of language models, quantum simulation, and combinatorial optimization. “Think of a quantum system or a combinatorial optimization problem as a ‘sentence,’” he explains, “but instead of being composed of words, it consists of spins, atoms, electrons, or graph nodes.”
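As an illustration only, the minimal PyTorch sketch below shows one way the “sentence of spins” analogy can be made concrete: an autoregressive model that treats a configuration of binary spins as a sequence of tokens and assigns it a probability one spin at a time, the way a language model scores a sentence. The class name, network size and usage are hypothetical choices for this sketch, not Hibat-Allah’s actual architecture.

```python
# Illustrative sketch: an autoregressive "spin language model".
# Each binary spin (0 = down, 1 = up) is treated like a token in a sentence,
# and the model predicts each spin conditioned on the spins before it.
# All names and hyperparameters here are hypothetical.
import torch
import torch.nn as nn


class AutoregressiveSpinModel(nn.Module):
    def __init__(self, num_spins: int, hidden_size: int = 32):
        super().__init__()
        self.num_spins = num_spins
        self.embed = nn.Embedding(2, hidden_size)      # embed spin values like tokens
        self.rnn = nn.GRU(hidden_size, hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, 2)          # logits for the next spin

    def log_prob(self, spins: torch.Tensor) -> torch.Tensor:
        """Log-probability of a batch of spin configurations, shape (B, N)."""
        batch = spins.shape[0]
        # Prepend a fixed "start" spin so the first real spin is also predicted.
        start = torch.zeros(batch, 1, dtype=torch.long)
        inputs = torch.cat([start, spins[:, :-1]], dim=1)
        hidden, _ = self.rnn(self.embed(inputs))
        logits = self.head(hidden)                     # (B, N, 2)
        log_p = torch.log_softmax(logits, dim=-1)
        chosen = log_p.gather(-1, spins.unsqueeze(-1)).squeeze(-1)
        return chosen.sum(dim=-1)                      # one log-probability per configuration


if __name__ == "__main__":
    model = AutoregressiveSpinModel(num_spins=8)
    sample = torch.randint(0, 2, (4, 8))               # four random 8-spin configurations
    print(model.log_prob(sample))
```

In research settings, models of this kind are typically trained so that the learned distribution over configurations approximates a quantum state or concentrates on low-cost solutions of an optimization problem; the sketch above only defines the scoring step.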
He expresses his gratitude to Janine Ouilmet and Alexander Kraushaar for their support, as well as Hiya Chhiber, Yibei Zhao, Jen Marshall, Kumudinie Kariyapperuma, and Steven Holland in the Math Innovation Office and Office of Research. He is also grateful to Applied Mathematics chair Dr. Hans De Sterck for his support. “Honestly, I wouldn’t be able to make it without the supportive environment at UWaterloo,” he says. “This success wouldn’t be possible without the support of many amazing people.”
Hibat-Allah notes that the recognition “is truly humbling and a motivation for my group to lead more explorations. This award will play a key role in my lab as it will allow my group to get advanced computing infrastructures which will speed up our research program by providing enough compute to carry out large scale simulations with the goal of uncovering the power of language models in quantum simulation and combinatorial optimization.”
"Enriching the linguistic diversity of open language models" ($168,004)
Wenhu Chen and Freda Shi, assistant professors of Computer Science
"Progressive internalization of skills for advanced language model reasoning" ($96,000)
Yuntian Deng, assistant professor of Computer Science
Despite their widespread use, some LLMs struggle with tasks that require high-level reasoning, a topic Deng has discussed. Even a leading AI tool like ChatGPT may make mistakes on complex arithmetic, such as multiplying large numbers.
The underlying issue is the limited reasoning capability of LLMs, which is the focus of Deng’s research project. His project could advance LLMs’ ability to solve complex tasks by focusing on “progressive internalization.”
“Just as children first learn to walk before running, we progressively teach large language models to internalize basic reasoning skills, such as arithmetic, before advancing to more sophisticated tasks like calculus. In this way, the model can gradually shift its focus toward higher-level reasoning once foundational skills have been internalized,” says Deng.
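As an illustration only, the toy Python sketch below shows what a progressive-internalization-style curriculum could look like for the multiplication example: each training stage hides more of the explicit intermediate steps, nudging the model to absorb the omitted reasoning. The `Example` class, the staging scheme and the formatting are hypothetical and do not reproduce Deng’s actual method.

```python
# Hypothetical sketch of a staged curriculum: start from fully worked-out
# reasoning, then drop explicit steps stage by stage until only the answer remains.
from dataclasses import dataclass


@dataclass
class Example:
    question: str
    reasoning_steps: list[str]  # explicit intermediate steps (e.g. partial products)
    answer: str


def curriculum_stage(example: Example, steps_to_drop: int) -> str:
    """Format one training string, hiding the first `steps_to_drop` reasoning steps."""
    kept = example.reasoning_steps[steps_to_drop:]
    reasoning = " ".join(kept)
    return f"Q: {example.question}\nReasoning: {reasoning}\nA: {example.answer}"


if __name__ == "__main__":
    ex = Example(
        question="What is 123 * 45?",
        reasoning_steps=["123 * 40 = 4920", "123 * 5 = 615", "4920 + 615 = 5535"],
        answer="5535",
    )
    # Stage 0 keeps every step; later stages drop more until only the answer remains.
    for stage in range(len(ex.reasoning_steps) + 1):
        print(f"--- stage {stage} ---")
        print(curriculum_stage(ex, steps_to_drop=stage))
```

The intent of such a curriculum is that, by the final stage, the model produces correct answers without writing out the intermediate steps, having internalized them during the earlier stages.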
"Serving systems for large language models with low latency, high utilization, good scalability, and low carbon emissions" ($195,069)
Hong Zhang, assistant professor of Computer Science
"Privacy-aware language agents via post-deployment learning and adaptation" ($80,000)
Victor Zhong, assistant professor of Computer Science
Professor Victor Zhong is a highly productive early-career researcher who was named a CIFAR AI Chair and a faculty member at the Vector Institute in 2024. His research sits at the intersection of natural language processing and machine learning, with an emphasis on teaching machines to read natural language specifications in order to generalize to new problems.
With the JELF grant, he will focus on creating language agents (a type of AI agent that uses natural language processing to understand its environment and make decisions) that can safeguard user privacy.
“I am honoured to receive this award and look forward to using it to build state-of-the-art AI systems to benefit all Canadians,” he says.