New training technique aims to ‘democratize’ powerful AI

Tuesday, December 9, 2025

Researchers at Waterloo Engineering have developed a much faster, more efficient training method to help put powerful artificial intelligence (AI) tools in the hands of many more people.

Their approach to training large language models (LLMs) – advanced AI systems designed to understand and generate human language by learning patterns in how words and ideas are connected – reduces both the cost and environmental impact of building them.

The researchers spent over a year making the technology cheaper, greener and therefore more accessible – a goal they refer to as ‘democratization’ – by combining and building on previous efforts to improve training.

The result is SubTrack++, a technique that speeds up the pre-training of LLMs – the first and most costly, resource-intensive step in a multi-step process – by up to 50 per cent while still exceeding state-of-the-art accuracy.

“These are extremely large models which consume a lot of energy, so an improvement of even five per cent translates into big gains,” said Dr. Sirisha Rambhatla, a professor of management science and engineering. “Advances like these will help us all build our own LLMs in the long run.”

Go to Making powerful AI more accessible to everyone for the full story.