Have you ever wondered if mayonnaise could be an instrument? How about a bottle of ketchup? Or that empty coffee cup sitting on your desk? Well, we aim to answer that question with a definitive YES!
As novices and experts alike know, getting started with music production can be tough, and it can be even harder if you don't already play an instrument! There are tons of concepts to learn and understand, and the most common tools for applying them (synthesizers, digital audio workstations, etc.) can be intimidating. Even once you get your head around the often unintuitive user interfaces, actually using the knobs, buttons, and faders to experiment with sound can be a real technical challenge. Oftentimes, unless you really know what you're doing, there is little room for play and creativity in digital music production.
Our vision is to create a new type of MIDI controller that is both intuitive and advanced enough to let anyone experiment with music production. With our project, we hope to broaden the definition of "musical instrument" to include ordinary household objects. Using augmented reality, computer vision, and physical sensors, our controller will let you generate and send MIDI signals to your computer by interacting with your environment in real time. Picking up an apple might generate one sound, while a jar of mayo would produce another. Moving these objects around in space would then change the sound you're generating by modulating the parameters of the MIDI signal.
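To make the idea concrete, here is a minimal sketch of the kinds of MIDI messages such a controller could emit. The byte layout (Note On = status `0x9n`, Control Change = status `0xBn`) comes from the MIDI 1.0 specification; the function names and the 0.0–1.0 position convention are our own illustration, not part of any particular library or of our final design.

```python
def note_on(channel: int, note: int, velocity: int) -> bytes:
    """Note On message: status 0x9n, then note number and velocity (0-127 each)."""
    return bytes([0x90 | (channel & 0x0F), note & 0x7F, velocity & 0x7F])

def control_change(channel: int, controller: int, value: int) -> bytes:
    """Control Change message: status 0xBn, then controller number and value."""
    return bytes([0xB0 | (channel & 0x0F), controller & 0x7F, value & 0x7F])

def position_to_cc(x: float, controller: int = 1, channel: int = 0) -> bytes:
    """Map a normalized object position (0.0-1.0) onto a 0-127 CC value."""
    value = max(0, min(127, int(x * 127)))
    return control_change(channel, controller, value)

# Picking up an object might trigger a note...
print(note_on(0, 60, 100).hex())   # '903c64'
# ...and sliding it across the desk sweeps a controller value.
print(position_to_cc(0.5).hex())   # 'b0013f'
```

In practice these raw bytes would be handed to a virtual MIDI port so that any DAW on the computer sees the controller as an ordinary instrument.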
So come along with us on our journey of creating a fun, new, and intuitive way to experiment with music production!
Gantt chart: here