Jeffrey’s Status Report 9

Fleshed out the frontend and cleaned up the backend to handle more menus in a more scalable, general way. Added two more menus (key selection and filter selection); for key selection, there are 8 keys and 8 modes the user can pick from in edit mode. I also debugged some of the problems we were having with edit/play mode interactions. Users can now swipe up/down/left/right in edit mode to pick what they want to edit. We looked into some of the latency problems and addressed a few; there is still a base latency built into the sockets, but we have reduced it to about 15 ms in each direction. We also realized that the rate at which the controller transmits data may influence the overall latency, but that is something we can't control.
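As a rough illustration of how the edit-mode swipe handling could look (a minimal sketch; `classifySwipe`, the coordinate convention, and the threshold are hypothetical, not our exact code):

```typescript
type Swipe = "up" | "down" | "left" | "right" | null;

// Classify a touchpad gesture by its dominant axis of motion.
// Coordinates are assumed normalized to [-1, 1].
function classifySwipe(startX: number, startY: number,
                       endX: number, endY: number): Swipe {
  const dx = endX - startX;
  const dy = endY - startY;
  const threshold = 0.5; // ignore small drags that aren't real swipes
  if (Math.max(Math.abs(dx), Math.abs(dy)) < threshold) return null;
  if (Math.abs(dy) >= Math.abs(dx)) return dy > 0 ? "up" : "down";
  return dx > 0 ? "right" : "left";
}
```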

Jeffrey’s Status Report 8

Created a working phone application with the existing backend. Worked out exactly how we would send data from the backend to the frontend to keep everything in sync. Decided on the 4 menus we would allow the user to edit while in edit mode. Developed several of these menus, including instrument changing and parameter changing for certain effects. To keep the frontend and backend in the same state for these editing menus, we decided to have all changes occur in the backend; any update from the controller then has to be reflected in the frontend. To do this, a socket was set up for each menu, and any change (e.g., the currently selected instrument) was reflected by editing CSS classes/ids in response.
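A minimal sketch of what one such per-menu socket listener could look like on the frontend (the event name, payload shape, element ids, and server address are hypothetical):

```typescript
import { io } from "socket.io-client";

const socket = io("http://localhost:5000"); // assumed server address

// The backend owns the menu state; the frontend just mirrors it
// by swapping CSS classes whenever the controller changes something.
socket.on("instrument_changed", (data: { instrument: string }) => {
  document.querySelectorAll(".instrument-option.selected")
          .forEach(el => el.classList.remove("selected"));
  document.getElementById(`instrument-${data.instrument}`)
          ?.classList.add("selected");
});
```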

Michael’s Status Report 9

This week our backend underwent a refactoring, both because of limitations surrounding certain frontend elements we wanted to use and because of latency concerns. Beyond that, I spent the week finishing up the filter-select menu (backend and frontend) and testing the code for latency.
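For the latency testing, a simple round-trip probe over the socket is one way to measure it (a sketch under assumptions: the `latency_ping`/`latency_pong` event names are made up, and the server is assumed to echo the ping straight back):

```typescript
import { io, Socket } from "socket.io-client";

function measureRoundTrip(socket: Socket): void {
  const sent = performance.now();
  socket.emit("latency_ping");
  socket.once("latency_pong", () => {
    const rtt = performance.now() - sent;
    console.log(`round trip: ${rtt.toFixed(1)} ms ` +
                `(~${(rtt / 2).toFixed(1)} ms each way)`);
  });
}

const socket = io("http://localhost:5000"); // assumed server address
socket.on("connect", () => measureRoundTrip(socket));
```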

Jason’s Status Report 8

This week, I finished calibrating all of our motion-controlled effects. Then, I worked with Jeffrey to integrate our frontend with real-time effect updates from gyroscope values. Next week, I will work further with Jeff to refine our frontend, and with Michael to implement instrument selection with controller values.

Michael’s Status Report 8

This week, I helped Jason implement real-time gyroscope-controlled sound effects and continued my work on integrating the code. During integration, I started with the backend part of the controller portion of our MVC in Python, but because the code was expanding and getting more complicated so quickly, we decided to refactor it into a clean design pattern. The main flow of our project's operation moves between two main modes: play and edit. In play mode, which is the default, the VR inputs are processed and used for sound output and effects toggling. In edit mode, VR controller inputs are used to navigate a series of menus for selecting instruments, changing filters, adjusting filter parameters, etc., and Jeffrey is working on the frontend so that the user can directly interface with the website using their remotes. At the moment, I am continuing work on the menu for selecting and changing the filter sets.
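The split could look something like the sketch below (hypothetical names; the `ControllerInput` shape and handler bodies are illustrative, not our actual classes):

```typescript
type Mode = "play" | "edit";

interface ControllerInput {
  button: string;                  // e.g. "trigger", "touchpad"
  gyro: [number, number, number];  // orientation data for effects
}

class InputRouter {
  private mode: Mode = "play";     // play mode is the default

  toggleMode(): void {
    this.mode = this.mode === "play" ? "edit" : "play";
  }

  handle(input: ControllerInput): void {
    if (this.mode === "play") {
      this.handlePlay(input);      // notes and effect toggling
    } else {
      this.handleEdit(input);      // menu navigation
    }
  }

  private handlePlay(input: ControllerInput): void { /* sound output */ }
  private handleEdit(input: ControllerInput): void { /* menu selection */ }
}
```

Routing every input through a single dispatch point is one way to keep the two modes from leaking into each other.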

Jeffrey’s Status Report 7

This week, I helped finish combining our separate work into several functions on the Python server. I got touch/press detection working on the touchpad so that the user can move the pitch up or down a half step by pressing up or down on the touchpad; we chose presses over swipes because they were more responsive. I also got octave changing to work, where the user touches left or right on the touchpad to go up/down half an octave. We also looked into the latency problem and found that it was still an issue, but isolated it to the internet connection the phone is on. Finally, I looked into ways to improve the UI/UX of the application on both the mobile and laptop sides.
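A minimal sketch of how those touchpad regions could map to pitch changes (the coordinate convention and the running `semitoneOffset` are assumptions for illustration):

```typescript
// Touchpad coordinates assumed normalized to [-1, 1], centered at (0, 0).
const state = { semitoneOffset: 0 };

function handleTouchpadPress(x: number, y: number): void {
  if (Math.abs(y) > Math.abs(x)) {
    state.semitoneOffset += y > 0 ? 1 : -1;  // up/down press: half step
  } else {
    state.semitoneOffset += x > 0 ? 6 : -6;  // left/right press: half an octave
  }
}
```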

Michael’s Status Report 7

This week our team decided to pivot away from using peak detection for note separation, and to instead map different sensor data to certain effects. As a result, I spent this week setting up the gyroscope data to be processed by the Tone library. Beyond that, I have been working with Jason on implementing effects with tunable parameters that we can map to motion data.
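As one sketch of the idea (assuming a lowpass filter as the tunable effect and a gyro pitch reading in degrees; the ranges and smoothing time are illustrative, not our calibrated values):

```typescript
import * as Tone from "tone";

const filter = new Tone.Filter(800, "lowpass").toDestination();
const synth = new Tone.Synth().connect(filter); // notes play through the filter

// Map a gyro pitch angle (assumed in [-180, 180] degrees) onto 200-5000 Hz.
function onGyro(pitchDeg: number): void {
  const t = (pitchDeg + 180) / 360;                       // normalize to [0, 1]
  filter.frequency.rampTo(200 + t * (5000 - 200), 0.05);  // smooth the jumps
}
```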

Jason’s Status Report 7

This week, I focused on improving existing functionality with Tone.js. I successfully incorporated samples into the playing, so there is now a much greater variety of sounds Caprice can produce (piano, strings, etc., as long as a sample pack is provided). Additionally, we began to brainstorm ways to control certain effects with the motion controller; a very promising one so far is the 3D panner, which simulates the sound output in a 3D setting (i.e., the source can move right/left, closer/farther away, change orientation, etc.). This effect is very interactive and couples well with the gyroscopic data that we can get from the VR controller.
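Putting the two together might look like this (a sketch: the sample URLs, base path, and positions are placeholders, not our actual asset paths):

```typescript
import * as Tone from "tone";

// Route sampled playback through Tone's 3D panner.
const panner = new Tone.Panner3D(0, 0, -1).toDestination();
const piano = new Tone.Sampler({
  urls: { C4: "C4.mp3", A4: "A4.mp3" },  // Tone.Sampler repitches between samples
  baseUrl: "/samples/piano/",            // hypothetical sample-pack location
}).connect(panner);

// Move the virtual source (e.g., driven by controller orientation), then play.
panner.setPosition(1, 0, -2);            // to the listener's right, farther away
Tone.loaded().then(() => piano.triggerAttackRelease("C4", "4n"));
```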