Behind many modern blockbusters’ visual effects, complex equations keep computer-generated elements moving and interacting according to the laws of physics.
This article by Erica K. Brockmeier was published originally in American Physical Society News on March 7, 2025. It is reproduced here with permission. The original article can be accessed at https://www.aps.org/apsnews/2025/03/lights-camera-physics-simulations.
With its sweeping desert landscapes, nuclear explosions, and 1,300-foot sandworms, Dune: Part Two is replete with visuals that blend filmed footage with computer-generated imagery. The movie was recently honored with two Oscars, Best Sound and Best Visual Effects, the latter of which is given to teams that successfully address complex visual effects challenges while elevating the film’s narrative in an artistic and original way.
Behind the visual effects of many Hollywood blockbusters, including this year’s visual effects Oscar winners and nominees and those of recent years, is an undercurrent of complex equations that help ensure a film’s computer-generated elements follow the laws of physics. And while those equations may not always be apparent to moviegoers, they bring to life some of modern cinema’s most breathtaking and iconic visuals.
Whether you call them physics engines or “sims,” these simulations use principles from classical mechanics — the physical theory that describes how forces like gravity, friction, and elasticity affect an object — to recreate the motion of objects in 3D space. The objects can include rigid bodies, meaning any object that doesn’t deform, as well as soft materials like cloth and hair or particles like smoke, dust, and sand. Computer-generated images are created through a process called rendering, in which simulations and object geometries are combined with data on how light travels through a scene and bounces off surfaces to generate an image.
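To give a flavor of what a physics engine does each frame, here is a minimal, hypothetical Python sketch — all names and constants are invented for illustration. A point mass falls under gravity, is advanced with explicit Euler integration, and bounces off the floor with some energy loss. Production engines handle full rigid bodies, with orientation, contacts, and constraints, but the frame-by-frame structure is broadly similar.

```python
# Illustrative sketch only: one physics-engine "time step" per frame,
# using explicit Euler integration and a crude ground collision.

GRAVITY = -9.81  # m/s^2

def step(pos, vel, dt):
    """Advance one frame: apply gravity, integrate, bounce off the floor."""
    vel = vel + GRAVITY * dt   # accumulate acceleration (here, only gravity)
    pos = pos + vel * dt       # move the object by its velocity
    if pos < 0.0:              # crude collision with the ground plane
        pos = 0.0
        vel = -0.5 * vel       # lose half the speed on each bounce
    return pos, vel

pos, vel = 10.0, 0.0                     # drop from 10 m, initially at rest
for _ in range(240):                     # four seconds at 60 frames per second
    pos, vel = step(pos, vel, 1.0 / 60.0)
```

Each rendered frame of a shot corresponds to one or more such steps, with the renderer then turning the resulting positions into pixels.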
So why do films, especially ones with computer-generated visuals, need physics simulations? For Christopher Batty, an associate professor at the University of Waterloo whose group develops physics simulation tools for visual effects software, part of the goal is to “offload the physics onto the computer so the artists can really focus on their vision,” he said.
“A lot of the types or scales of phenomena that we want to try and capture on film are too much for an artist to do by hand,” explained Batty. “Before simulations, if a character was wearing a cape, for example, an artist would have to go in by hand and control each individual piece of geometry. Now, we can have an artist model the initial shape of the cloth, hit play on the simulation, then the cloth does whatever it should do according to the laws of physics.”
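Batty’s cape example can be caricatured with the mass-spring model that underlies many cloth solvers. In the sketch below — with made-up constants, and in one dimension for brevity — the artist-supplied “initial shape” is a chain of particles hanging from a fixed point; hitting play lets gravity and damped springs settle it into a physically plausible rest state. Real cloth solvers work in 3D with shear and bend springs, collision handling, and implicit integration.

```python
# Illustrative mass-spring "cloth": a hanging chain of point masses
# connected by springs, stepped with damped explicit Euler integration.

N, REST, K, MASS, DAMP, G, DT = 10, 0.1, 500.0, 0.1, 0.98, -9.81, 0.001

# y[i] is the vertical position of particle i; particle 0 is pinned in place.
y = [-REST * i for i in range(N)]   # the artist-modeled initial shape
v = [0.0] * N

for _ in range(5000):               # "hit play" and let physics take over
    forces = [MASS * G] * N         # gravity acts on every particle
    for i in range(N - 1):          # Hooke's law on each spring
        stretch = (y[i] - y[i + 1]) - REST
        f = K * stretch
        forces[i] -= f              # stretched spring pulls i downward
        forces[i + 1] += f          # ... and pulls i+1 upward
    for i in range(1, N):           # particle 0 stays fixed
        v[i] = DAMP * (v[i] + forces[i] / MASS * DT)
        y[i] += v[i] * DT
```

After the loop, the chain hangs slightly stretched under its own weight — no one animated each segment by hand.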
The scales seen in today’s blockbusters are incredibly large and complex — imagine animating every hair follicle in an army of chimpanzees in Kingdom of the Planet of the Apes or every piece of rock and dust that makes up a fictional planet’s ring in Alien: Romulus. That scale also demands a lot of computing power: The visual effects for the second Avatar film, for example, took up 18.5 petabytes of storage, including a single shot that took 13.6 million hours to render on concurrently running “render farms.”
Along with the challenges of scale faced by visual effects teams, certain natural phenomena, like water, have historically been difficult to simulate and realistically render. While the motion of viscous fluids can be described using the Navier-Stokes equations, creating believable, computer-generated water, from mist and droplets to vortexes and tsunamis, required decades of work by specialized teams working across multiple films, explained Mike Seymour, a senior lecturer at the University of Sydney and one of the founding members of fxguide.com.
“When you really got through to solving Navier-Stokes equations was when you were at the point that you were getting visuals that could be incredibly accurate,” said Seymour. “There's a lot of physics in visual effects, and to make it look realistic, we have to be able to understand it, manipulate it — and we have to model it to excruciating levels of detail to give it enough perceptual aspects that make you believe it.”
“Humans are good at identifying when things are not behaving in a physical manner, with fluids being one example — if suddenly its volume changes, or it flows in a strange direction, you know it looks off,” added Batty, whose group works on liquid and gas simulations. “Having good ways to simulate fluids is a matter of necessity to capture visuals that are convincing.”
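As a toy illustration of one ingredient of the grid-based solvers Batty describes, the hypothetical sketch below applies just the diffusion term of the Navier-Stokes equations to a one-dimensional smoke-density field; the constants are invented, and film-quality solvers also advect the field and project velocities to enforce incompressibility, in 3D and at vastly higher resolution. Note the property viewers implicitly check: the total amount of smoke stays essentially constant as it spreads.

```python
# Illustrative diffusion step for a 1D "smoke" density field: each
# interior cell exchanges density with its neighbors (a discrete Laplacian).

N, VISC, DT = 32, 0.1, 0.1

density = [0.0] * N
density[N // 2] = 1.0                # a puff of smoke in the middle cell

for _ in range(100):
    new = density[:]
    for i in range(1, N - 1):
        new[i] = density[i] + VISC * DT * (
            density[i - 1] - 2.0 * density[i] + density[i + 1]
        )
    density = new
```

The puff spreads out and its peak drops, but summing the field shows the total is conserved — the kind of physical consistency that keeps a simulated fluid from looking “off.”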
Water was likely the last thing on the minds of the visual effects team for Dune: Part Two, who instead faced the task of seamlessly blending real dust and sand captured during principal photography with computer-generated particles to create natural-looking landscapes. The fictional desert planet of Arrakis also had to be populated with computer-generated spaceships, spice harvesters, and a final battle with exploding atomic weapons.
And, of course, there’s Shai-Hulud, the colossal sandworms that are longer than an aircraft carrier — whose motions don’t quite follow the laws of physics but instead, Seymour explained, strike a fine balance between artistry and audience believability. “If you think about how to move the sandworm through a bunch of sand, there would be so much resistance, but you also need it to move at lightning-fast speeds. Paul Lambert [the visual effects supervisor] and his team knew when to stick with the physics versus when there’s a different visual that works,” he said.
Simulating and rendering hair and fur for digital creatures posed physics challenges for three of this year’s visual effects nominees — Better Man, Kingdom of the Planet of the Apes, and Wicked. The other nominee set in outer space, Alien: Romulus, features a space station crashing into the ring of a planet, with billions of simulated rocks and debris particles launched in the wake of the station’s destruction.
Seymour said he was especially impressed with the integration of Better Man’s entirely digital chimpanzee main character into complex live-action sequences. “The visual effects community understands what it is to do invisible effects, but if you're not a professional, it’s hard to understand that a baby ape sitting in the bath with his grandmother washing his hair is an astonishingly hard shot — and ticks about every simulation box possible,” he added.
And while this year’s visual effects Oscar winners might not have thanked physics in their acceptance speeches, simulations often get their kudos during the Academy’s Scientific and Technical Awards, which recognize software, tools, and innovations related to filmmaking.
One of this year’s awards will go to the developers of Ziva VFX, a program for constructing and simulating the muscles, skin, and subcutaneous fat that make up a digital character’s frame so that they react realistically to the character’s skeleton. Other awards given to physics engine-related advances in recent years include the Fizt2 elastic simulation system for skin and muscle animation, the Taz Hair Simulation System for generating hairstyles, and Houdini, the 3D animation software used across the industry.
And while awards season focuses on specific films, Seymour explained that it’s the iterative process of building up tools, knowledge, and experience film by film that has produced today’s successes in visual effects and will power the field in the future. “The Oscars recognize the crew that did a specific film, but the crew would be the first to acknowledge that they're building on a history,” he added. “You don't put a major visual effects house together for one project, disband it, and expect to come up with new innovations in physics.”
To that end, Batty and his group will continue studying ways to capture more of the diverse behavior of liquids to make computer-generated imagery feel even more real. “In the early days of CGI, we knew how to do a couple of simple types of simulations. Now, they've gotten more advanced, in part by trying to incorporate more accurate physics, using what an engineer or applied mathematician might consider to be the correct equations,” said Batty. “The sky is really the limit as far as the level of detail and accuracy.”