Team Salinity, a group of students and recent graduates from the Faculty of Environment, developed SongSAT as a tool to express the beauty of satellite imagery through sound. Beyond creating a remarkable listening experience, the software lets satellite imagery reach an audience with visual impairments, giving them a way to appreciate the wonders of the world from above.
The team produced an algorithm that converts imagery from four distinct geographical areas (grassland, forest, coastal/water, and mountainous areas) into songs, each with its own musical pattern. These patterns are converted into playable sheet music, which is then imported into MuseScore notation software so the music can be played back.
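To give a flavour of how an image-to-music mapping like this can work, here is a minimal, hypothetical sketch rather than Team Salinity's actual code: it maps pixel brightness values from a single image row onto a pentatonic scale and writes the result as MusicXML with the music21 library, producing a file that MuseScore can open and play back. The scale, note durations, and sample pixel values are all illustrative assumptions.

```python
# Hypothetical sketch: map pixel brightness to notes and export sheet music.
# Assumptions (not from the SongSAT source): a C-major pentatonic scale,
# quarter-note durations, and a hard-coded row of 8-bit brightness values.
from music21 import note, stream, tempo

# A sample "row" of pixel brightness values (0-255), e.g. from grassland imagery.
pixel_row = [34, 90, 150, 200, 180, 120, 60, 240, 210, 95]

# C-major pentatonic scale across two octaves.
PENTATONIC = ["C4", "D4", "E4", "G4", "A4", "C5", "D5", "E5", "G5", "A5"]

def brightness_to_note(value: int) -> note.Note:
    """Map a 0-255 brightness value onto the pentatonic scale."""
    index = min(value * len(PENTATONIC) // 256, len(PENTATONIC) - 1)
    return note.Note(PENTATONIC[index], quarterLength=1.0)

song = stream.Stream()
song.append(tempo.MetronomeMark(number=100))
for value in pixel_row:
    song.append(brightness_to_note(value))

# Write MusicXML that notation software such as MuseScore can open and play.
song.write("musicxml", fp="songsat_sketch.musicxml")
```

A terrain-aware mapping along the lines the team describes would go further, varying the scale, rhythm, or instrumentation by land-cover class so that each of the four area types produces its own recognisable musical pattern.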
Millions of pixels of satellite imagery are captured every day, and we believe new pieces of music can be generated from them, giving a wider audience the opportunity to appreciate the vast treasure trove of satellite data available today.
You can hear more sample SongSAT outputs from different geographic locations at SongSAT. Team Salinity's algorithm is open source and available for everyone to use and experiment with through GitHub.
More information:
ENV students (Team “Salinity”):
Alex McVittie (recent grad) | Janet Hu | Corina Kwong | Colin Tuen Muk