Meet Graham, Waterloo's new supercomputer

The latest technology making an impact

If Wes Graham, the professor, could meet Graham the supercomputer, he’d be filled with awe — and a pretty serious case of déjà vu.

Graham, known as the University of Waterloo’s “father of computing,” would be amazed to discover that Graham the computer has 50 billion times more power than the “super” computers of his day.

The déjà vu would have been seeded 50 years ago, when Graham, the first director of Waterloo’s computing centre, stood in the newly built Mathematics and Computer Building’s “Red Room.” The room was itself a bold statement with its bright red tiles and walls. But what lay inside was even more impressive: an IBM 360 Model 75 computer.

Hailed at the time as the biggest computer in Canada, it was an impressive acquisition for a university that was only 10 years old.

But as decades passed, the evolution of microchips led to computers that packed ever-more powerful punches in ever-smaller boxes. In 1999, the same year that Graham died, the Red Room was officially closed. That year, the first BlackBerry email pager also made its debut. The era had arrived when you could carry a computer that was as powerful as the IBM 360/75 in your back pocket.

New supercomputer housed near historic Red Room 


The Red Room, which opened in 1967 in the centre of the new Mathematics and Computer Building, was home to the IBM 360 Model 75 — the largest computer in Canada at the time. The room, which was designed with large windows along the upper perimeter, quickly became an attraction, drawing countless visitors who wanted to view the facilities from above. PHOTO: University of Waterloo Archives, January 1984.

Fast forward to today, and history has repeated itself: in the part of the building where the Red Room once stood now stands a modern $17-million supercomputer that bears Graham’s name.

Graham the supercomputer is sleek and black. It comprises about 16,000 kilograms of computing equipment packed into roughly 60 refrigerator-sized units, whose rows of racks hold the equivalent of 33,000 modern-day processor cores.

At its unveiling earlier this year, it, too, was hailed as the biggest computer in Canada.

“It is amazing to see history repeating itself and rhyming a little bit,” says Scott Campbell, director of the Centre for Society, Technology and Values, and curator of the University of Waterloo Computer Museum.

Any comparison between today’s Graham and the IBM 360/75 quickly turns into a mind-boggling visualization of how the power of computing has grown over 50 years. Upon installation, the IBM 360/75 had a total memory capacity of about one megabyte (MB).

To put that in context, an average smartphone has at least two gigabytes (GB), which is the equivalent of 2,000 megabytes. A high-end desktop computer today might have 24 GB, or 24,000 times the capacity of a computer that once filled an entire room.

Graham has 50 billion times more power than 1967 supercomputer

But even the modern desktop computer is nothing compared to Graham. It has 50 petabytes of memory capacity. That’s the equivalent of 50 million gigabytes, or 50 billion times more than the country’s biggest computer in 1967.
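That figure is easy to sanity-check. Here is a quick sketch in Python, assuming decimal units (1 GB = 1,000 MB and 1 PB = 1,000,000 GB):

```python
# Rough sanity check of the comparison, in decimal units:
# 1 PB = 1,000,000 GB and 1 GB = 1,000 MB, so 1 PB = 10**9 MB.
ibm_360_75_mb = 1                 # the IBM 360/75's ~1 MB of memory (1967)
graham_mb = 50 * 10**9            # Graham's 50 PB expressed in MB (2017)

print(graham_mb // ibm_360_75_mb) # prints 50000000000, i.e. 50 billion
```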

“It is a hefty beast,” says Scott Hopkins, a University of Waterloo chemistry professor who will be using the new supercomputer. “If you had a calculation that might take a year to run on your desktop computer, with Graham, you might have it done by lunchtime.”


Scott Hopkins, a chemistry professor who will be using the new supercomputer, in front of Graham, the University of Waterloo’s $17-million supercomputer.

Over the course of 50 years, computers have spawned an information age with massive data sets growing at exponential rates. The modern world swims in a sea of data, and researchers want to exploit that data to push the cutting edge of discovery. In every field of study, from drug discovery to automotive safety and from astrophysics to climate change, researchers want to run huge computational jobs to analyze data and create simulations or models they would like to test.

Six kilometres of cable connect supercomputer units

From his office, Hopkins can see the Mathematics and Computer Building that houses Graham the supercomputer. The refrigerator-sized units are interconnected with more than six kilometres of cabling. Each of those units uses enough energy to power four houses, Hopkins says. The whole system is cooled by water pumped from a local Grand River reservoir through a series of tubes.

Graham operates on a network, with researchers from different universities accessing the resources over the internet. It can be shared by more than 11,000 Canadian researchers at multiple universities, and its resources are managed through Compute Canada, Compute Ontario and SHARCNET, a multi-university consortium in the province. 

Between Graham and the supercomputing resources at three other universities, “Canada has a new gem for worldwide high-performance computing,” Hopkins says.

Supercomputer will aid research into prescription drugs

His own work involves understanding how clusters of nanoparticles and molecules behave, and that has applications in several areas, including drug discovery. “If I put a chlorine atom here, how does that influence how quickly the drug diffuses through the cell membrane to reach its target?” That sort of question involves running huge quantum mechanical operations to model and predict the physical properties of a drug and to make an educated guess as to whether it might be a viable candidate to bring to trial.
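The quantum mechanical calculations Hopkins describes are what demand a supercomputer, but the flavour of his question can be shown with a far cheaper empirical stand-in. Here is a toy sketch using the open-source RDKit library (an assumption for illustration, not his group’s actual toolchain) that compares a molecule with and without a chlorine atom via the logP descriptor, a rough proxy for how readily a molecule crosses cell membranes:

```python
# Toy illustration only: the real work involves huge quantum mechanical
# calculations, which is what demands a supercomputer. Here, RDKit's
# empirical logP descriptor stands in as a crude measure of lipophilicity.
from rdkit import Chem
from rdkit.Chem import Descriptors

benzene = Chem.MolFromSmiles("c1ccccc1")          # plain benzene ring
chlorobenzene = Chem.MolFromSmiles("Clc1ccccc1")  # same ring plus one chlorine

for name, mol in [("benzene", benzene), ("chlorobenzene", chlorobenzene)]:
    # Higher logP generally means greater lipophilicity
    print(name, round(Descriptors.MolLogP(mol), 2))
```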

“The things that we can do on Graham, we simply could not have done five years ago. This is just pushing the boundaries in terms of brute force calculations,” Hopkins says. “Much of chemistry is becoming an exercise in applied math and computation.”

For Marek Stastna, a professor in applied mathematics and a computational scientist, that means his team will be able to generate much more detailed pictures of “internal waves” within a big body of water like Lake Huron. These internal waves are important because they can either kick up pollutants such as phosphorus that cause algal blooms, or stir up nutrients that small animals and fish depend on.

“One of the things that we would like to do is build models of the Great Lakes to do a better job of predicting when authorities need to close beaches because of algal blooms. Right now, the models we have are very coarse,” Stastna says.

Francis Poulin, who models ocean currents on Graham.

Jillian Anderson, a sociological researcher examining big data.

Marek Stastna, who researches internal waves in the Great Lakes.

Graham will accelerate climate research

Building those models is like being an artist who wants to zoom in and capture the tiniest detail on the millimetre scale in order to get an incredibly accurate large-scale picture. This requires solving huge numbers of equations using massive amounts of computing power.

Graham not only provides the power needed for much bigger computations, but also makes the modelling faster and more efficient. Prior to Graham, it could take 20 minutes for one picture to come up, and if the team wanted to tweak some number, like the amount of phosphorus in a lake, they would have to run the model again, Stastna says.
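To see why finer resolution is so costly, consider a minimal sketch, far simpler than the team’s actual lake models: a one-dimensional wave equation solved by finite differences (Python with NumPy is assumed here; the real models are three-dimensional and vastly larger).

```python
# A minimal sketch, not the group's actual model: a 1-D wave equation
# solved with finite differences. Doubling the resolution doubles the
# number of grid points and, for numerical stability, halves the time
# step, so total work roughly quadruples; in 3-D it grows far faster.
import numpy as np

def simulate(n_points, n_steps, c=1.0):
    dx = 1.0 / n_points
    dt = 0.5 * dx / c                    # CFL-limited time step
    x = np.linspace(0.0, 1.0, n_points)
    u = np.exp(-100.0 * (x - 0.5) ** 2)  # an initial "bump" in the water
    u_prev = u.copy()                    # zero initial velocity
    r2 = (c * dt / dx) ** 2
    for _ in range(n_steps):
        u_next = np.empty_like(u)
        u_next[1:-1] = (2 * u[1:-1] - u_prev[1:-1]
                        + r2 * (u[2:] - 2 * u[1:-1] + u[:-2]))
        u_next[0] = u_next[-1] = 0.0     # fixed boundaries
        u_prev, u = u, u_next
    return u

coarse = simulate(n_points=200, n_steps=400)
fine = simulate(n_points=400, n_steps=800)  # 2x the detail, ~4x the work
```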

Another researcher in applied mathematics, Francis Poulin, is trying to better understand ocean currents and the physical processes that drive those currents, which have big impacts on everything from the fisheries to the climate.

Even a small change in some aspect of the environment, such as wind direction or the warmth from the sun, can evolve in chaotic and complex ways to change the circulation of currents. But with Graham, models that might previously have taken months to generate can be done much faster. “That allows us to ask more questions,” Poulin says.

Hopkins says supercomputers will allow for the development of new theories, new algorithms and ways of approaching all kinds of science and engineering problems. “Whether the research is with jet engines or heart valves, or economics, or population dynamics, people can now ask much more difficult questions and receive an answer in a reasonable amount of time.”

Graham will also impact sociological research

Graham will also be useful in fields such as sociology. John McLevey, a sociologist at Waterloo, is developing computational models of the evolution of collaborations in biomedical research and development. Here, too, there are massive data sets, drawn from a wide variety of sources such as scientific journals, granting agencies and private sector funders from around the world.

Jillian Anderson, who recently graduated from Waterloo and spent the summer working on this data linkage project in McLevey’s lab, says the Graham supercomputer will be a huge asset to both researchers and students working on these types of problems.

“Currently, we are spending months preprocessing and refining our data, just to get it to the point where analysis can happen,” says Anderson. “A supercomputer will expedite this process, allowing the researchers more time to focus on analysis, which is the important part.”
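To give a flavour of that preprocessing, here is a hypothetical mini-example in Python with pandas; the names and fields are invented for illustration and are not the lab’s actual data. The task is typical of data linkage: records about the same person arrive formatted differently from different sources.

```python
# Hypothetical mini-example of data linkage preprocessing: records about
# the same person arrive formatted differently from different sources.
# All names and fields here are invented for illustration.
import pandas as pd

papers = pd.DataFrame({"author": ["  A. Smith", "a. smith "],
                       "journal": ["Social Networks", "Scientometrics"]})
grants = pd.DataFrame({"author": ["A. SMITH"],
                       "agency": ["Example Agency"]})

# Normalize the names so the two sources can be joined on a common key
for df in (papers, grants):
    df["key"] = df["author"].str.strip().str.lower()

linked = papers.merge(grants, on="key", suffixes=("_paper", "_grant"))
print(linked[["author_paper", "journal", "agency"]])
```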

For the students who might be working in a lab for a summer, it might mean being able to see a project through from start to finish, she adds.

AI algorithms and the supercomputer

Hopkins says one of the exciting possibilities is to run artificial intelligence or machine learning algorithms on Graham, which would enable the supercomputer to infer connections among large data sets. “So, then the supercomputer could essentially tease out the trends for you.”
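As a rough illustration of what “teasing out trends” can mean, here is a minimal sketch using scikit-learn (an assumption for illustration; the article does not say which software Graham’s users run): an unsupervised clustering algorithm finds groups in data without being told what to look for.

```python
# Minimal sketch of the idea: an unsupervised learning algorithm
# (k-means clustering here) finds structure in data without being told
# what to look for. At supercomputer scale, the same approach runs
# across millions of records instead of a few hundred synthetic points.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Synthetic "measurements" drawn from three hidden groups
data = np.vstack([rng.normal(loc, 0.5, size=(100, 2))
                  for loc in ([0, 0], [5, 5], [0, 5])])

model = KMeans(n_clusters=3, n_init=10, random_state=0).fit(data)
print(model.cluster_centers_)  # recovers the three hidden group centres
```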

Graham may also open the door to yet faster and more powerful computers in the future, as researchers use it to model new techniques and methodologies in computing itself, he adds.

The world of big data means that supercomputers will play an increasingly large role in shaping the future of society and technology. Anderson, who recently moved to Simon Fraser University for graduate studies in big data, is interested in applying big-data analysis to agriculture. She knows, from the experience of her family’s farm in Saskatchewan, that the technology in modern-day farm machinery is producing incredible amounts of data. “A combine can be sampling a data point every second, and that’s just the combine,” she says.

That’s true of almost anything today, from home appliances to cars and medical equipment. “Our ability to collect data on any topic is growing and that data can be linked together and analyzed,” Anderson says. That means supercomputers and the analysis of data “will affect every aspect of our lives,” she adds.

Looking back: A history of computing at Waterloo


1962: Wes Graham is appointed director of Waterloo’s new Computing Centre.

1965: Undergraduates Gus German, Bob Zarnke, Richard Shirley and Jim Mitchell write the first WATFOR compiler.

1967: Faculty of Mathematics is founded and the IBM 360/75, the most powerful computer in Canada at the time, is acquired. WATFOR 360, under the leadership of Paul Cress and Paul Dirksen, is released the next year.

1980: A Waterloo research team led by computer science professors Keith Geddes and Gaston Gonnet creates the Maple programming language, which allows computers to deal with advanced algebra.

1984: Oxford University Press announces that the University of Waterloo will play a major role in computerizing the Oxford English Dictionary, work that later leads to the founding of OpenText.

2002: University of Waterloo establishes the Institute for Quantum Computing (IQC).

2005: School of Computer Science is renamed after alumnus David R. Cheriton, the billionaire computer scientist, businessman and philanthropist who received his master’s degree and doctorate from Waterloo in the 1970s.

2017: New supercomputer Graham is officially launched at the University as part of a new $17-million data centre.


Wes Graham with students, 1963. PHOTO: University of Waterloo Library, Special Collections & Archives, Kitchener-Waterloo Record photographic negative collection.