We often think of NASA as outward-focused, but much of the data it collects actually looks back at Earth from space. The NASA 2018 International Space Apps Challenge asked participants to create art using Earth imagery sourced from the Global Land Cover Facility's MODIS data set.
Team Salinity, a group of students and recent graduates from the Faculty of Environment, developed SongSAT, a tool that expresses the beauty of satellite imagery through sound. Beyond the remarkable listening experience it creates, the software also allows satellite imagery to reach audiences with visual impairments, giving them an opportunity to appreciate the wonders of the world from above.
The team produced an algorithm that converts imagery of four distinct geographical areas (grassland, forest, coastal/water, and mountainous areas) into songs with distinct musical patterns. These patterns are converted into playable sheet music, which is then imported into the MuseScore notation software for playback.
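The core idea of this kind of sonification can be sketched in a few lines: sample pixel values from the imagery and map them onto a musical scale. The sketch below is a minimal illustration only, not Team Salinity's actual code; the pentatonic scale, function name, and brightness-to-pitch mapping are all assumptions.

```python
# Illustrative sketch of pixel-to-note sonification (not the team's
# actual algorithm): map 0-255 brightness values onto a musical scale.

# C-major pentatonic scale as MIDI note numbers (C4, D4, E4, G4, A4)
PENTATONIC = [60, 62, 64, 67, 69]

def pixels_to_notes(pixel_row, scale=PENTATONIC):
    """Map each 0-255 brightness value to a note in the scale."""
    notes = []
    for value in pixel_row:
        # Bucket the brightness into one of len(scale) pitch steps,
        # so brighter pixels produce higher notes.
        index = value * len(scale) // 256
        notes.append(scale[index])
    return notes

row = [0, 64, 128, 200, 255]
print(pixels_to_notes(row))  # ascending brightness -> ascending pitch
```

A real pipeline would then write these notes out as sheet music (for example, as a MusicXML file that MuseScore can open), and could vary rhythm or instrumentation by terrain type to give each of the four geographical areas its own musical character.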
“Millions of pixels are created every day, and we believe that new pieces of music can be generated from this to allow a wider audience the opportunity to appreciate the vast treasure trove of satellite data we have available today.” – Alex McVittie
You can hear more sample SongSAT outputs from different geographic locations at SongSAT. Team Salinity's algorithm is open source and available for everyone to use and experiment with on GitHub.
ENV students (Team “Salinity”):
Alex McVittie (recent grad), Janet Hu, Corina Kwong, Colin Tuen Muk