Making Sound through Synthesizers

Project 1 is finally complete, but it wasn't completed the way I initially planned. I think it's better to be ambitious than to do something that's too easily reachable.

Initially, I wanted to create a real-time garage-band interaction between multiple users on their mobile devices, but it didn't stop there. I also wanted to add a visual aspect to the entire experience, one based on the way you played your instrument and how it connected to others, not just through sound but through art.

But I soon found that this project, although achievable, was not practical for the time I had, so I adapted the experience. Instead of multiple instruments playing different notes affected by the alpha, beta, and gamma orientation of a phone, I would build a single synthesizer affected by those values. The outcome was a somewhat decent sound (slightly screechy; so close, but so far) which would act as the controller of the visual aspect of my design.

However, I was still trying to figure out that visual display now that I didn't have different instruments. Then I started looking into the p5.sound library, which held the key to my project: the FFT. It analyzes a sound, assigns values to its different frequencies and amplitudes, puts them into an array, and, with a few extra lines of code, gives a visual representation of the sound.
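To make that concrete, here's a rough sketch of the idea, not my actual project code (the sawtooth wave and the 100-1000 Hz pitch range are arbitrary placeholder choices). p5 exposes the phone's beta, gamma, and alpha angles as rotationX, rotationY, and rotationZ; an oscillator maps two of them to pitch and volume, and p5.FFT's analyze() returns the amplitude array the visuals are drawn from:

```javascript
// A p5.js + p5.sound sketch of the idea (placeholder ranges).
let osc, fft;

function setup() {
  createCanvas(windowWidth, windowHeight);
  osc = new p5.Oscillator();
  osc.setType('sawtooth');
  osc.amp(0);         // start silent; tilting brings the volume up
  fft = new p5.FFT(); // analyzes whatever is currently playing
}

function touchStarted() {
  userStartAudio();   // browsers want a gesture before audio starts
  osc.start();
}

function draw() {
  background(0);

  // Map the phone's tilt onto the synth: front-to-back tilt (beta,
  // -180..180) picks the pitch, side-to-side tilt (gamma, -90..90)
  // sets the volume.
  osc.freq(map(rotationX, -180, 180, 100, 1000));
  osc.amp(map(rotationY, -90, 90, 0, 1), 0.1); // short ramp, no clicks

  // analyze() returns an array of 1024 amplitude values (0..255),
  // one per frequency band -- the array the visuals are drawn from.
  const spectrum = fft.analyze();
  noStroke();
  fill(0, 255, 150);
  for (let i = 0; i < spectrum.length; i++) {
    const x = map(i, 0, spectrum.length, 0, width);
    const h = map(spectrum[i], 0, 255, 0, height);
    rect(x, height - h, width / spectrum.length, h);
  }
}
```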

However, I was not using it to analyze a song but a real-time garage band of noises created by the individual movements of phones, an aspect that would need socket.io and communication through a server.
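Something like the following is the shape that relay would take (a rough sketch, not my actual code; 'orientation' is just an event name I'm using here). The server's only job is to take each phone's angles and rebroadcast them to everyone else:

```javascript
// server.js -- a minimal Node + socket.io relay.
const express = require('express');
const app = express();
app.use(express.static('public')); // serve the p5 sketch from /public

const server = require('http').createServer(app);
const io = require('socket.io')(server);

io.on('connection', (socket) => {
  socket.on('orientation', (data) => {
    // Tag the sender and rebroadcast the angles to every other phone.
    socket.broadcast.emit('orientation', {
      id: socket.id,
      alpha: data.alpha,
      beta: data.beta,
      gamma: data.gamma,
    });
  });
});

server.listen(3000, () => console.log('listening on :3000'));
```

And on the client side, each sketch would emit its own angles and collect everyone else's:

```javascript
// Client side, in the p5 sketch (socket.io's client script assumed loaded).
const socket = io();
const others = {}; // latest angles from each remote phone, keyed by id

socket.on('orientation', (data) => {
  // In the full idea, each remote phone's angles drive their own voice.
  others[data.id] = data;
});

function sendOrientation() {
  socket.emit('orientation', {
    alpha: rotationZ, // compass-style rotation, 0..360
    beta: rotationX,  // front-to-back tilt, -180..180
    gamma: rotationY, // side-to-side tilt, -90..90
  });
}
```

Calling sendOrientation() from draw() would work for a sketch, though throttling it would be kinder to the server.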
