Abhishek’s Status Report 2/18

Personal Accomplishments

Much of this week went toward pinning down the specifics of how all of the systems will work together. As of last week we knew in broad strokes that we needed genre classification, signal processing, lighting, and UI units; now we have a much clearer picture of what each requires. I personally worked on the Expressive Lighting Engine, the unit closest to the actual lights, which is broken up into a few different parts. I determined the details of how this engine will function and built out the specification of all the functions we will allow, including: color change, strobe, fade, rotate, and blackout. We also determined how we will select which function to activate and with what parameters. We will treat the entire set of lights, functions, and parameters as an output space and narrow it down based on the current state of the lights, the auditory features, and the global musical parameters. From these three sources of information we will eliminate all trivial or unappealing light changes and then randomly select from the remaining “appropriate” changes. These changes will be triggered by the tempo/beat detection feature of the signal processing unit.
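The filter-then-pick selection described above could be sketched roughly as follows. The function names, feature fields, and the specific filtering rules here are illustrative assumptions, not the final engine design:

```python
import random

# Assumed set of lighting functions from the specification above.
FUNCTIONS = ["color_change", "strobe", "fade", "rotate", "blackout"]

def candidate_changes(light_states, audio_features, global_params):
    """Enumerate the output space and drop trivial/unappealing changes.
    The filtering rules below are placeholder assumptions for illustration."""
    candidates = []
    for light, state in light_states.items():
        for func in FUNCTIONS:
            # Trivial: re-applying the function the light is already running.
            if func == state["active"]:
                continue
            # Assumed rule: no strobe during low-energy passages.
            if func == "strobe" and audio_features["energy"] < 0.5:
                continue
            # Assumed rule: no blackout on fast songs.
            if func == "blackout" and global_params["tempo_bpm"] > 140:
                continue
            candidates.append((light, func))
    return candidates

def on_beat(light_states, audio_features, global_params, rng=random):
    """Called by the beat-detection trigger: pick one appropriate change."""
    candidates = candidate_changes(light_states, audio_features, global_params)
    return rng.choice(candidates) if candidates else None
```

The three information sources map onto the three arguments: current light state eliminates trivial repeats, while audio features and global musical parameters prune changes that would clash with the music.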

On Track?

We are still on track for what we need to accomplish, although we have redistributed the work significantly. Originally I was to be in charge of the entire Expressive Lighting Engine. However, we realized that this unit, along with the audio decomposition and processing, will be a much more significant component than originally expected, so Rachana will now be helping me with the Expressive Lighting Engine. This works out because we also determined that the Genre Classification component is not that integral: almost all of its required benefits can be obtained by combining the Shazam API and the Spotify API to recognize songs and extract global parameters. This upcoming week will therefore consist of much more coding and development and less planning than last week.

Goals for Next Week

Based on the new work distribution, my goal for the upcoming week is to begin coding the Expressive Lighting Engine logic and queue. Specifically, I will work on the queue, which will handle function call requests and route them to specific lights. This will also be good to test once the Gigbar 2 arrives, since we will be able to use all of its different lights together. I will also code the LightSet class, which will create a virtual representation of the light scene available for use. This will let the program run on different types of lights, since each performer/DJ using our software may have different equipment.
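A rough sketch of how the LightSet class and the request queue might fit together is shown below. The class and field names, and the per-fixture `supported` set, are assumptions for illustration; the real classes are still being designed:

```python
from collections import deque

class Light:
    """Minimal stand-in for one fixture (e.g. one Gigbar 2 light)."""
    def __init__(self, name, supported):
        self.name = name
        self.supported = set(supported)  # functions this fixture can run
        self.log = []                    # calls actually routed here

    def apply(self, func, **params):
        self.log.append((func, params))

class LightSet:
    """Virtual representation of whatever fixtures a performer/DJ owns."""
    def __init__(self, lights):
        self.lights = {light.name: light for light in lights}
        self.queue = deque()  # pending (light_name, func, params) requests

    def request(self, light_name, func, **params):
        self.queue.append((light_name, func, params))

    def dispatch(self):
        """Drain the queue, routing each request to its target light."""
        while self.queue:
            name, func, params = self.queue.popleft()
            light = self.lights.get(name)
            # Drop requests for unknown lights or unsupported functions,
            # which is what keeps the engine flexible across equipment.
            if light is not None and func in light.supported:
                light.apply(func, **params)
```

For example, a rig with a par light and a derby could be built as `LightSet([Light("par", ["color_change", "strobe", "fade"]), Light("derby", ["rotate", "blackout"])])`; a `strobe` request to the derby would simply be dropped at dispatch time rather than crashing the engine.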
