Translating music into light and motion with robots
Robots the size of a soccer ball create new visual art by trailing light that represents the “emotional essence” of music
By Media Relations
University of Waterloo researchers are developing robots that can translate music into synchronized light and motion, creating a rich, multi-sensory way to experience sound. Using AI to interpret elements such as rhythm, tempo, and emotional tone, the system enables robots to respond with expressive movements and coordinated lighting effects in real time. This interdisciplinary work bridges robotics, artificial intelligence, and the arts, showcasing how technology can enhance creative expression. It also opens up more inclusive ways to engage with music, particularly for individuals who are deaf or hard of hearing, by transforming sound into visual and physical experiences.
The full article is available on the University of Waterloo website.