A room-scale experimental project to visualize sound using lights.
Pitch is mapped to color, and volume to the number of lights turned on. As a
result, the room decorated with the lights changes color in accordance with the
sound of activities inside it (e.g. playing music or conversation).
My Contribution
I led a team of five people on this project and wrote most of the harder
parts of the code myself, including the sound processing and the socket server
that streams the processed sound from a laptop to the Raspberry Pi.
Main Features
Smoothing
One problem with this project is that audio readings tend to be very
jarring: discontinuous bursts of sound are followed by silence in rapid
succession. Because of this, the pitch and volume changed far too quickly,
which would have led to drastic, jarring changes in the lighting. To account
for this, we used moving averages to smooth out the transitions.
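A sliding-window moving average is a simple way to do this. The sketch below is illustrative (the class and window size are my own choices, not the project's actual code): each new reading is averaged with the last few, so a sudden drop to silence decays gradually instead of cutting out.

```python
from collections import deque

class MovingAverage:
    """Smooth a noisy stream of readings with a fixed-size sliding window."""

    def __init__(self, window_size=4):
        # deque with maxlen automatically discards the oldest reading
        self.window = deque(maxlen=window_size)

    def update(self, value):
        self.window.append(value)
        return sum(self.window) / len(self.window)

# Example: a burst of loud readings followed by abrupt silence
smoother = MovingAverage(window_size=4)
raw = [0.0, 0.9, 0.8, 0.0, 0.0, 0.0]
smoothed = [smoother.update(v) for v in raw]
# The smoothed values taper off instead of snapping straight back to zero
```

The same smoother can be applied independently to the pitch and volume streams before they are mapped to lights.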
Streaming Audio Values to Raspberry Pi
Due to a limitation of the Raspberry Pi (it can't use its sound card and
control the lights at the same time), we temporarily used our laptop
microphones to process the audio. Because of this, I designed all the
sound-processing code to be completely independent of the light controls,
so that the two components (laptop and Raspberry Pi) knew as little about
each other as possible. The laptop streamed only pre-smoothed pitch and
volume values to the Raspberry Pi, using the WebSocket protocol to keep
latency low.
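The laptop side of that design can be sketched roughly as follows. The message format and function names here are hypothetical (the project's actual wire format isn't shown), and the streaming helper assumes the third-party `websockets` package:

```python
import asyncio
import json

def encode_reading(pitch, volume):
    # Hypothetical wire format: the laptop sends only the smoothed
    # (pitch, volume) pair, so the Pi needs no audio knowledge at all.
    return json.dumps({"pitch": round(pitch, 3), "volume": round(volume, 3)})

async def stream_readings(uri, readings):
    # Assumes the third-party `websockets` client library; the project
    # may well use a different WebSocket implementation.
    import websockets
    async with websockets.connect(uri) as ws:
        for pitch, volume in readings:
            await ws.send(encode_reading(pitch, volume))
```

Keeping the payload down to two smoothed numbers is what lets the Pi stay a dumb light controller: it only decodes messages and sets lights, with no idea where the audio came from.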
Future Work
I plan to keep improving this after I finish school, simply because it's a cool thing to have in my room. Here are my planned changes:
Processing the sound on the Raspberry Pi itself with an external sound card, eliminating the need for the laptop
Fade transitions for the light changes
A continuous spectrum of colors based on pitch (it's currently a discrete set of
10 colors)
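For the continuous color spectrum, one possible approach is to normalize the pitch into a fixed range and sweep a hue across it. This is only a sketch of the idea, not the planned implementation; the pitch bounds and hue range below are assumptions:

```python
import colorsys

def pitch_to_rgb(pitch_hz, low=80.0, high=1000.0):
    """Map a pitch to a continuous hue instead of 10 discrete colors.

    `low` and `high` are assumed bounds for the expected pitch range.
    """
    # Clamp the pitch, then normalize it into [0, 1]
    t = (min(max(pitch_hz, low), high) - low) / (high - low)
    # Sweep the hue from red (low pitch) toward blue (high pitch)
    r, g, b = colorsys.hsv_to_rgb(t * 0.66, 1.0, 1.0)
    return int(r * 255), int(g * 255), int(b * 255)
```

Because the hue varies smoothly with pitch, nearby pitches produce nearby colors, which should also make the fade transitions easier to implement.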