Games Institute postdoctoral fellow Dr. John Muñoz and NASA Distinguished Research Associate Dr. Alan Pope are presenting their research on biofeedback and games at the Game Developers Conference in San Francisco on March 21st. GI researchers were able to attend a test run of their presentation “How NASA Has Translated Aerospace Research into Biofeedback Game Experiences.” The talk covers 30 years of research on how NASA scientists have used biofeedback technologies to create physiologically adaptive gaming experiences.
Dr. Pope started this research by tracking the physiological state of pilots during flight. Later, he realized the similarities between flight simulations and games and started adapting games to measure similar responses. Dr. Pope and his team produced a biofeedback index that indicates stress and attention, and used games to visualize a person’s physiological and psychological state. Wii games like Link’s Crossbow Training were modded to help players differentiate between mental states. For example, the cursor or crosshairs are disrupted when a player’s brain waves show they are inattentive and only stabilize when the brain waves show focus. This research can determine what kinds of conditions make someone alert and able to do their job safely, and what conditions create stress and distraction.
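The core of the mechanic described above is simple: the less attentive the player, the noisier the crosshair. A minimal sketch of that idea, assuming a normalized attention index already derived from the biosignal (the function name, linear mapping, and `max_jitter` parameter are illustrative, not NASA's actual index or code):

```python
import random

def crosshair_jitter(attention_index, max_jitter=20.0):
    """Map a normalized attention index (0.0 = fully inattentive,
    1.0 = fully focused) to a random crosshair displacement in pixels.

    The linear inverse mapping here is a simplification for illustration;
    the real index is computed from EEG-derived engagement measures.
    """
    jitter = max_jitter * (1.0 - attention_index)
    dx = random.uniform(-jitter, jitter)
    dy = random.uniform(-jitter, jitter)
    return dx, dy
```

Applied every frame, this offset makes the crosshair drift when attention drops and settle when the brain waves show focus, which is the feedback loop the Wii mods rely on.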
Dr. Muñoz first worked with Dr. Pope during an internship with the National Institute of Aerospace in 2018. He assisted the team with applying the biofeedback VR technology they had developed at NASA to the modern VR context. The researchers wanted to “bring the physiological into the virtual reality world.” Therefore, Dr. Muñoz developed ways that this technology could be used to create more personalized and adaptive training, such as enhanced combat training for the military or conflict de-escalation for the police.
Dr. Muñoz also discussed the games he has been prototyping using this technology at the Games Institute. One game generates a dreamlike natural environment of random animals and trees based on meditative brain waves. Another is a “face your fears” game in which a spider attempts to attack the player, who must stay physically calm to defeat it.
Game developers can use these biofeedback mechanics in numerous ways in their games, including controlling input (as in the Wii mods), adjusting difficulty (such as how enemies react), or modifying the environment (weather or music, for example). Dr. Muñoz plans to fully prototype and package these different biofeedback tools and make them available to the game design and development community. He has recently finished the “excite-o-meter,” a Unity plugin for integrating heart activity and movement analysis into XR games.
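A difficulty-adjustment mechanic like the one mentioned above can be sketched in a few lines: compare the player's heart rate to their resting baseline and ease off the challenge as arousal rises, as in the “face your fears” spider game. Everything here is a hypothetical illustration (the function name, the 15% arousal band, and the linear mapping are assumptions, not the excite-o-meter's API):

```python
def adapt_difficulty(heart_rate, baseline, band=0.15):
    """Return a difficulty scalar in [0.0, 1.0] from a heart-rate reading.

    `baseline` is the player's resting heart rate; `band` is the relative
    deviation (here 15%, an illustrative threshold) treated as full arousal.
    A stressed player gets an easier game, a calm player a harder one.
    """
    arousal = (heart_rate - baseline) / baseline  # relative deviation
    clamped = max(0.0, min(arousal, band))        # keep within [0, band]
    return 1.0 - clamped / band                   # invert: calm -> hard
```

The returned scalar could then drive enemy aggression, weather intensity, or music tempo, the three environment levers the talk describes.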
Drs. Pope and Muñoz are also considering how to use this technology in space to reduce the impact of social isolation on astronauts. Possibilities include VR “earth immersion,” interactive VR games for training, and multiplayer games for entertainment. Drs. Muñoz and Pope will be delivering their final presentation at GDC in San Francisco, CA, on Monday, March 21, 2022, during the Future Realities Summit.