This week I started by writing and testing the different lighting schemes, and I set up the lighting Maven program to accept the Acousticness, Danceability, Valence, Energy, and Tempo data from the Spotify Web API (relayed through the main RPi core) and modulate the behavior and rhythm of the lights accordingly. I integrated the new set of lighting fixtures and signal transmission hardware, and verified that they work as intended and exactly like the set of lights we were using before. I also tested daisy-chaining multiple lighting fixtures in a row, all controlled by the same RPi program. Next I worked on the communication between the web app and the lighting program, so that input on the web app makes the lights rotate their colors, and I wrote a simpler, easier-to-understand lighting script for this purpose. Finally, I helped work on the final presentation slides and created content for Matt to discuss in the presentation.
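As a rough illustration of the feature-to-lighting mapping (not the exact production constants), the sketch below assumes a hypothetical LightController wrapper around our Java DMX layer:

```java
// Minimal sketch of the feature-to-lighting mapping. The LightController
// interface and the scaling constants are illustrative, not the production
// values used in the actual Maven lighting program.
public final class MoodMapper {

    /** Spotify audio features, each normalized to [0, 1] except tempo (BPM). */
    public record AudioFeatures(double acousticness, double danceability,
                                double valence, double energy, double tempo) {}

    /** Stand-in for the DMX-facing controller used by the lighting program. */
    public interface LightController {
        void setHue(int degrees);        // 0-360
        void setBrightness(int percent); // 0-100
        void setPulseBpm(double bpm);    // rhythm of the color pulses
    }

    public static void apply(AudioFeatures f, LightController lights) {
        // Happier (high-valence) tracks get warmer hues, sadder ones cooler.
        lights.setHue((int) Math.round(60 + (1.0 - f.valence()) * 180));
        // Energy drives overall brightness; acoustic tracks are dimmed slightly.
        lights.setBrightness((int) Math.round(
                f.energy() * 100 * (1.0 - 0.3 * f.acousticness())));
        // Pulse the colors on the beat, scaled by danceability.
        lights.setPulseBpm(f.tempo() * Math.max(0.25, f.danceability()));
    }
}
```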
Testing:
Queue Latency: 102 ms (20 trials)
Recommendation Generation Latency: 6.2 s (20 trials)
Queue Capacity 100 Songs: passed (memory usage & latency acceptable)
User Stress Test: 200 Users passed (memory usage & latency acceptable)
Concurrent User Stress Test: ran out of memory when trying 20+ users all adding songs. This may be due to inefficient usage of data structures and redundant memory copying. We plan on investigating this further in the lab.
UPDATE: RESOLVED. We went into the lab and fixed the problem. Five simultaneous requests from different users each second, over 30 iterations, now consume less than 2% of available memory (a sketch of the test harness follows below).
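For reference, the harness used for the resolved check is shaped roughly like the sketch below; the endpoint URL and JSON payload are placeholders for our actual song-request route:

```java
import java.net.URI;
import java.net.http.*;
import java.util.concurrent.*;

// Simplified sketch of the concurrent-request harness: 5 simulated users
// each sending one request per second, over 30 iterations.
public final class StressTest {
    public static void main(String[] args) throws InterruptedException {
        HttpClient client = HttpClient.newHttpClient();
        ExecutorService pool = Executors.newFixedThreadPool(5); // 5 simulated users
        for (int iter = 0; iter < 30; iter++) {                 // 30 iterations
            for (int user = 0; user < 5; user++) {
                int u = user;
                pool.submit(() -> {
                    HttpRequest req = HttpRequest.newBuilder()
                            .uri(URI.create("http://localhost:8080/queue/add"))
                            .header("Content-Type", "application/json")
                            .POST(HttpRequest.BodyPublishers.ofString(
                                    "{\"user\":\"user" + u + "\",\"song\":\"test\"}"))
                            .build();
                    try {
                        client.send(req, HttpResponse.BodyHandlers.ofString());
                    } catch (Exception e) {
                        System.err.println("request failed: " + e);
                    }
                });
            }
            Thread.sleep(1000); // one batch of requests per second
        }
        pool.shutdown();
        pool.awaitTermination(30, TimeUnit.SECONDS);
    }
}
```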
We are approximately on schedule.
Next week I hope to continue working on integration, help make the posters, and test the final demo.
One of the most significant risks that could jeopardize this project is not having access to the lighting fixtures and accompanying devices during these last few weeks for experimentation, testing, and demonstrations. This risk is being mitigated by ordering the same parts for ourselves so that we will have full control over a proper set of lighting equipment.
A secondary risk that could jeopardize this project is having trouble integrating all the modules of the system together on the Raspberry Pis. While these processors are indeed capable of multithreading, running the core queue manager or the recommendation service simultaneously with continuous lighting control signal generation may slow down the subsystems perceptibly. This risk is being mitigated by keeping a third Raspberry Pi in reserve, in case it becomes necessary to dedicate an entire Raspberry Pi to each major subsystem.
To mitigate the risk of not having enough time to run tests as numerous and as comprehensive as necessary, we started building our testing systems early last week.
No changes have been made to the system diagram or the project schedule.
I spent this week getting to understand the new lighting fixtures and recreating the different lighting schemes for these IDeATe Lab lights instead of our original Furious Five RG lights. Besides locating and procuring the lights, the ENTTEC DMX to USB converter, and the proper cables and DMX terminator plug, I ordered a set of all these components through the Purchase Request form to ensure that we will have an operating set of our own in the coming weeks. I also deciphered the cryptic channel & intensity mappings for the new SlimPAR PRO Q USB lighting fixtures, as the channel IDs recognized by the Java port of DmxPy did not align exactly with the values provided in the lights' User Manual. Using this mapping I was able to create the first two lighting schemes we will be using, and I successfully tested them by operating the lights in real time with the Java processes administering the patterns. The video below shows one lighting test sample and demonstrates that we are finally generating reliable DMX signals and controlling the lighting at a fine-grained resolution, using a lighting microservice operating right out of the recommender module (which was previously demonstrated to work correctly on the second RPi):
https://drive.google.com/file/d/1pNcK16PDUi-hqoaK9zqlAS_2xBKROgXs/view?usp=sharing
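To give a concrete sense of the result, the sketch below shows how a lighting scheme addresses the fixture once the channel mapping is known. The channel offsets are illustrative rather than the exact values we reverse-engineered, and the serial write to the ENTTEC widget (handled by our Java DmxPy port) is elided:

```java
// Sketch of how a lighting scheme builds a DMX frame for the fixture.
// The channel offsets below are illustrative; our real values came from the
// trial-and-error mapping described above.
public final class SlimParFrame {
    // Hypothetical channel offsets relative to the fixture's DMX start address.
    private static final int CH_DIMMER = 0;
    private static final int CH_RED    = 1;
    private static final int CH_GREEN  = 2;
    private static final int CH_BLUE   = 3;
    private static final int CH_AMBER  = 4;

    private final byte[] universe = new byte[512]; // one full DMX universe
    private final int startAddress;                // fixture's base address (0-indexed)

    public SlimParFrame(int startAddress) { this.startAddress = startAddress; }

    public void setColor(int r, int g, int b, int amber, int dimmer) {
        universe[startAddress + CH_DIMMER] = (byte) dimmer;
        universe[startAddress + CH_RED]    = (byte) r;
        universe[startAddress + CH_GREEN]  = (byte) g;
        universe[startAddress + CH_BLUE]   = (byte) b;
        universe[startAddress + CH_AMBER]  = (byte) amber;
    }

    /** Raw 512-byte frame, ready to hand to the DMX transmit routine. */
    public byte[] frame() { return universe.clone(); }
}
```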
My progress is on schedule; however, I would have liked to help start module testing a little earlier.
In the next week I hope to finish the rest of the lighting schemes and integrate the lighting system, collect more survey & testing data, and polish the whole system while checking complete end-to-end functionality.
==========================================================
As I have worked on this project, there are a few new tools and pieces of engineering knowledge that I have had to acquire for different parts of the project. For example, I learned how to use a Raspberry Pi, how to use web sockets under a Spring Boot framework for a live-service web app, and how to control hardware lighting fixtures without on-board processors from a computer program by generating & transmitting DMX signals. To learn these skills, I used a range of different learning strategies, the primary ones being online research, following public tutorials, and reading documentation from hardware suppliers' websites. I also consulted previous ECE Capstone groups for their expertise with certain technologies (such as controlling a DMX lighting fixture from a computer program) and asked my teammates for help (for example, to learn more advanced version control tactics on GitHub), learning how to do different things directly from a real person. I also applied the programming strategies I learned from classes at Carnegie Mellon, such as Web Apps & Distributed Systems.
I started this week by working on system integration and verifying that our whole project operates smoothly end to end, in preparation for the interim demo we had this past week. This involved some code refactoring and submodule implementation, as well as manual testing in the lab to make sure we could give a polished demo. I also acquired a lighting kit from the IDeATe lab: two new DMX-capable lights, more of the necessary cables, and the ENTTEC DMX to USB converter, which a previous group had told us was required to transmit signals from our laptops/Raspberry Pis. I set up the new lights and compared their functionality to our previous lighting fixture by testing a few different DMX signal generation libraries. I also wrote a few different configurations of Java test scripts to see whether the new lights would perform better and more reliably than the older lighting fixture. Below is a video of setting different colors at regular time intervals on the new lights, which are simpler and may have a lower barrier to entry:
https://drive.google.com/file/d/1ZJ4hMDnmAVAk5n2S4THXyMkTPESUGbx9/view?usp=sharing
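For reference, the interval test in the video is roughly shaped like the sketch below; the DmxSender interface is a stand-in for the DMX library plus the ENTTEC widget, and the channel layout and timing are illustrative:

```java
import java.util.List;

// Rough sketch of the interval color-cycling test shown in the video.
public final class ColorCycleTest {
    /** Stand-in for the DMX library driving the ENTTEC USB widget. */
    public interface DmxSender {
        void setChannel(int channel, int value); // channel 1-512, value 0-255
        void send();                             // push the frame out the widget
    }

    // (red, green, blue) triples cycled at a fixed period
    private static final List<int[]> COLORS = List.of(
            new int[]{255, 0, 0}, new int[]{0, 255, 0}, new int[]{0, 0, 255});

    public static void run(DmxSender dmx, int baseChannel) throws InterruptedException {
        while (true) {
            for (int[] rgb : COLORS) {
                dmx.setChannel(baseChannel, rgb[0]);     // red channel
                dmx.setChannel(baseChannel + 1, rgb[1]); // green channel
                dmx.setChannel(baseChannel + 2, rgb[2]); // blue channel
                dmx.send();
                Thread.sleep(2000); // hold each color for two seconds
            }
        }
    }
}
```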
Unfortunately, we still do not have a solid understanding of how the different channels and intensities work for the DMX-controlled lighting fixtures, especially since these new lights did not come with a user manual. This will definitely require some more systematic testing next week to reverse-engineer the lights' behavioral specifications.
Verification: I will use timestamped requests to determine the user-input-to-internal-system round-trip time. This will help us determine the effective latency of our web app, backend, and main RPi core against our use-case requirement of a responsive & tactile system for users. We will also create a test script that simulates multiple users, to check that the system can maintain websockets for, and accept requests from, 50-150 concurrently connected users. This will verify our system's effective capacity and robustness. As for testing we have already done, we have only run basic functionality and behavioral correctness tests for the queue, the recommendation & Spotify API request systems, and the web app.
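A minimal version of the timestamped round-trip probe might look like the following; the endpoint and trial count are placeholders for whichever route is under test:

```java
import java.net.URI;
import java.net.http.*;

// Sketch of the timestamped round-trip measurement for one request route.
public final class LatencyProbe {
    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();
        int trials = 20;
        long totalNanos = 0;
        for (int i = 0; i < trials; i++) {
            long start = System.nanoTime(); // timestamp just before the request
            HttpRequest req = HttpRequest.newBuilder()
                    .uri(URI.create("http://localhost:8080/queue"))
                    .GET().build();
            client.send(req, HttpResponse.BodyHandlers.ofString());
            totalNanos += System.nanoTime() - start; // round-trip time
        }
        System.out.printf("mean round-trip: %.1f ms%n",
                totalNanos / trials / 1_000_000.0);
    }
}
```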
My progress on the web app and queue manipulation core modules is on schedule, but the lighting system is behind schedule. The only way to catch up on the lighting system is to focus on it and spend more time on the lighting modules, which I intend to do next week.
Next week I will put much more focus on the lighting system, as there are still some issues with fine-grained control of the lights. I will work on learning to use them better, and aim to have some skeleton lighting schemes functioning at a basic level, ready for integration with the other subsystems.
The most significant risk that could jeopardize this project is still the lighting system. Since our lighting fixture worked only briefly before starting to fail consistently, we need to test some other configurations or consider obtaining a second set of lighting fixtures. We plan to mitigate this risk by borrowing equipment from the IDeATe lab: first, we will borrow an ENTTEC DMX to USB converter, which a previous ECE Capstone team (Group D4) told us they used. If this does not work, we will borrow their DMX lights as well, to test whether the issue lies with the fixture. Our second concern is managing real-world song timings, as the Spotify Web API does not provide a callback when a song finishes playing, although it does return the exact duration of any track. We intend to mitigate this risk by experimenting with different clocks and internal song tracking (see the timing sketch below). The final potential risk is keeping track of which Users have recently engaged with the app, in order to determine the majority count of active users. We plan to tackle this issue with keep-alive signals between our web app clients and the server, to check whether Users have engaged with our service within the last timeout epoch.
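As a sketch of the song-timing mitigation: since the API gives us each track's exact duration but no end-of-song callback, we can arm an internal timer when playback starts. The names and safety margin below are illustrative:

```java
import java.util.concurrent.*;

// Sketch of the internal song-timing idea: arm a timer for the track's
// reported duration when playback starts, then advance the queue when it
// fires. The 250 ms margin is an illustrative safety buffer.
public final class SongTimer {
    private final ScheduledExecutorService clock =
            Executors.newSingleThreadScheduledExecutor();
    private ScheduledFuture<?> pending;

    /** Call when a track starts playing; fires onSongFinished after durationMs. */
    public synchronized void armFor(long durationMs, Runnable onSongFinished) {
        if (pending != null) pending.cancel(false); // replace any earlier timer
        pending = clock.schedule(onSongFinished,
                durationMs + 250, TimeUnit.MILLISECONDS);
    }
}
```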
No changes have been made to the system diagram or the project schedule.
Much of this week's work was done together as a team in the Hamerschlag labs, testing our project ahead of the Interim Demo next week. We spent many hours recompiling, retesting, and refactoring our code as we debugged and made sure our modules worked in the lab demo environment, ensuring that our end-to-end functionality behaved as intended. Additionally, I changed our code so that User requests run as an async function, so the queue displayed on the Web App updates instantaneously without waiting for the Spotify query round trip. This improved the 'snappiness' of our app, making it appear more responsive. Furthermore, I worked with Matt to revamp our Like/Dislike system to serve each User the specific Like button configuration unique to them: if a User has queued a song themselves, that song initially appears as already Liked, while the Like buttons on queued songs they did not request initially appear blank (no Like or Dislike clicked yet), with each song's Like counts updated accordingly. Previously the states of the buttons would flip whenever different Request buttons were pressed; now the correct button state and behavior is shown in each User's own web app client. I also tracked down the ENTTEC DMX to USB translation unit, as well as the DMX-to-XLR/DMX cable we will need to integrate it with our Raspberry Pi and lighting fixture, in the IDeATe lab. Hopefully we will have that early next week.
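The async change is conceptually simple; the sketch below shows its shape, with the queue, broadcaster, and Spotify interfaces as illustrative stand-ins for our real modules (the method names are for illustration only):

```java
import java.util.concurrent.CompletableFuture;

// Sketch of the async request change: the queue updates and is broadcast
// immediately, while the Spotify lookup completes in the background.
public final class AsyncRequestHandler {
    public interface SongQueue {
        Object addPending(String user, String query); // placeholder entry
        void fillDetails(Object entry, Object track); // attach URI, duration, art
        Object snapshot();
    }
    public interface Broadcaster { void pushQueueToClients(Object queueState); }
    public interface SpotifyClient { Object search(String query); }

    private final SongQueue queue;
    private final Broadcaster broadcaster;
    private final SpotifyClient spotify;

    public AsyncRequestHandler(SongQueue q, Broadcaster b, SpotifyClient s) {
        this.queue = q; this.broadcaster = b; this.spotify = s;
    }

    public void handleSongRequest(String user, String songQuery) {
        // Clients see the new (placeholder) entry instantly...
        Object placeholder = queue.addPending(user, songQuery);
        broadcaster.pushQueueToClients(queue.snapshot());
        // ...while the Spotify round trip happens off the request thread.
        CompletableFuture.supplyAsync(() -> spotify.search(songQuery))
                .thenAccept(track -> {
                    queue.fillDetails(placeholder, track);
                    broadcaster.pushQueueToClients(queue.snapshot());
                });
    }
}
```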
Progress is on schedule, synchronized to the timing of the Interim Demo.
Next week, I hope to run the lighting test script and produce some controllable lighting fixture behavior, and to begin integrating it with the rest of our codebase so that the Raspberry Pi controls it automatically. I will also improve our app backend and queue controller modules, specifically by changing our song veto system to only count veto votes (Dislikes) from active/recently online Users.
This week, in close collaboration with Matt, I updated the web app backend as well as the queue manager, began the lighting control application, and worked on general system integration to make sure our modules were cooperating properly. On Sunday I spent a couple of hours with my team in the Hamerschlag lab integrating our systems, and we achieved end-to-end functionality: a song request (or song recommendation request) travels from a User through the web app onto the main system, then to the Spotify Web API for a 'Play Song' query, and finally to the Bluetooth speaker audio output. This was demoed in our meeting this week. During the week I took the lead on adding the Like & Dislike user inputs to the web app and tying them to the song data stored by the backend in the queue manager. Likes & Dislikes are a critical feature, as they drive both the recommendation system and the song veto capability: songs with more Likes are given more consideration by Luke's recommendation service, while songs downvoted by the majority are vetoed and removed from the song queue. I updated the frontend so that users can interface with this functionality and the backend data models, through which I access the song voting data on the queue manager. Attached below are some examples of the updated UI and the song veto/remove-from-queue functionality:
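On the backend side, the per-song vote bookkeeping looks roughly like the sketch below; the majority threshold and method names are illustrative, and the real module also feeds these counts into the recommendation service:

```java
import java.util.*;

// Sketch of the per-song vote bookkeeping behind the Like/Dislike feature.
public final class SongVotes {
    public enum Vote { LIKE, DISLIKE }

    private final Map<String, Vote> votesByUser = new HashMap<>();

    public SongVotes(String requestedBy) {
        votesByUser.put(requestedBy, Vote.LIKE); // requester starts as a Like
    }

    public synchronized void cast(String user, Vote vote) {
        votesByUser.put(user, vote); // one vote per user; the latest wins
    }

    /** Button state served to this user's client: LIKE, DISLIKE, or null (blank). */
    public synchronized Vote stateFor(String user) {
        return votesByUser.get(user);
    }

    /** Veto when a majority of currently active users has disliked the song. */
    public synchronized boolean isVetoed(Collection<String> activeUsers) {
        long dislikes = activeUsers.stream()
                .filter(u -> votesByUser.get(u) == Vote.DISLIKE)
                .count();
        return dislikes * 2 > activeUsers.size();
    }
}
```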
I also created and worked on the lighting fixture control module, as we received our DMX-controlled lighting unit and DMX to USB cable this week. I started a new Maven project and wrote a test script utilizing the DmxPy library (ported over to Java for coherence) to toggle the different channels of the lights at regular intervals. This was done both to check that we could, indeed, control the lighting fixture via DMX signals generated by our own software, and to begin the steps toward a persistent controller microservice that continually operates the lights based on which songs are playing/on the queue. We were able to start and run a simple light show from this lighting application: [ video here ]
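The longer-term shape of that persistent controller is sketched below, with stand-in Pattern and DmxLink types; DMX fixtures expect a continuous stream of frames, hence the fixed-rate refresh (the ~30 Hz rate is an assumption, not a measured requirement):

```java
import java.util.concurrent.*;

// Skeleton of the persistent lighting microservice: a single loop keeps
// re-sending DMX frames, and the active pattern is swapped when the playing
// song changes.
public final class LightingService implements Runnable {
    public interface Pattern { byte[] frameAt(long millisIntoSong); }
    public interface DmxLink { void transmit(byte[] frame); }

    private final DmxLink link;
    private volatile Pattern current;       // swapped by the queue manager
    private volatile long songStartMillis;

    public LightingService(DmxLink link, Pattern initial) {
        this.link = link;
        this.current = initial;
        this.songStartMillis = System.currentTimeMillis();
    }

    /** Called when the playing song changes; restarts the pattern clock. */
    public void onSongChange(Pattern next) {
        current = next;
        songStartMillis = System.currentTimeMillis();
    }

    @Override public void run() {
        ScheduledExecutorService ticker =
                Executors.newSingleThreadScheduledExecutor();
        // ~30 Hz refresh keeps the fixture fed with frames between changes.
        ticker.scheduleAtFixedRate(() ->
                link.transmit(current.frameAt(
                        System.currentTimeMillis() - songStartMillis)),
                0, 33, TimeUnit.MILLISECONDS);
    }
}
```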
Our progress is on schedule. Our project now has the core functionality in place, and the peripheral features are coming together.
Next week we hope to make more progress getting the lights to work properly and in sync with the queue and the data received from the Spotify Web API. We are also looking into song start/stop timing mechanisms so that the internal system knows which song the Spotify Web API is currently playing.
This week, in collaboration with Matt, I rebuilt the Web App and Queue Manager backend module for processing User requests and storing the collaborative song queue. Now the songs and the song queue are stored in local memory by the backend process, and the queue state is forwarded to each User's app client via their websocket connection. When a new User logs onto the app, they can enter a Username to be displayed next to their song requests, and they are immediately sent a JSON payload containing the current queue. Users can send a new song request to the server through a form, and it will appear immediately in the next slot of the queue on all Users' app views simultaneously. I also polished the frontend side of the app for better readability and queue consistency among distributed instances of the web app.
This new Java backend (and the tweaked frontend for a cleaner queue display) will cooperate nicely with our Spotify semantic match and Bluetooth 'Play' functionality, as each successive song can be popped off the queue and its details used to find a match and play it from the Spotify song database. The Song data structure managed by the queue hosted on the backend will contain all the information necessary to generate a query to the Spotify Web API.
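For reference, the queue-state forwarding looks roughly like the sketch below, assuming a standard Spring Boot STOMP/WebSocket setup; the destination paths and the Song record are illustrative stand-ins for our real types:

```java
import java.util.List;
import org.springframework.messaging.simp.SimpMessagingTemplate;
import org.springframework.stereotype.Service;

// Sketch of the queue-state forwarding over websockets.
@Service
public class QueueBroadcaster {
    /** Illustrative stand-in for the real Song data structure. */
    public record Song(String title, String artist, String requestedBy) {}

    private final SimpMessagingTemplate messaging;

    public QueueBroadcaster(SimpMessagingTemplate messaging) {
        this.messaging = messaging;
    }

    /** Push the full queue state to every connected client as JSON. */
    public void broadcast(List<Song> queue) {
        messaging.convertAndSend("/topic/queue", queue);
    }

    /** Send the current queue to one newly connected User. */
    public void sendInitialState(String username, List<Song> queue) {
        messaging.convertAndSendToUser(username, "/queue/state", queue);
    }
}
```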
Our group is making very good progress and is on schedule. This week we were able to implement much of the fundamental functionality of the project, and now every major component (besides the lighting, as the lighting fixtures we ordered only arrived at the end of this week) is well fleshed out.
In the upcoming week we hope to finish or make significant progress in integrating our modules together. More specifically, we want our Web App backend to be able to communicate which songs are on the queue to our Spotify Web API facing Player modules to actually play them on the speaker. Additionally, we will do a quick sanity check on the lighting fixture to make sure the DmxPy library is capable of controlling the lighting off the Raspberry Pi, and perhaps begin writing a few of the color scheme scripts we will be running on it.
The week preceding spring break, I looked further into the lighting fixture control program leveraging DmxPy. I conversed with a previous ECE Capstone group that also used Spotify Web API data to categorize songs by genre/mood/type and transmit different control-scheme DMX signals to operate the lighting devices. I also spent a considerable amount of time starting the skeleton of the design review report, writing content for it, and organizing the team to finish it efficiently.
Progress is on schedule.
Next week I will finalize the lighting fixture choice and begin writing a basic control script as a proof of concept, and I will help build the main RPi backend microservices for accepting user requests and issuing play API calls to the Spotify Web API.
The most significant risks that could jeopardize the success of the project have not changed much so far. The first and biggest risk is not being able to properly compile and run code for controlling our light fixture automatically through our control program, which would use Flask, Python, and the Open Lighting Architecture framework to transmit DMX signals to the lighting system. Our concern stems from comments on other projects attempting to control lights over the DMX protocol, which suggest that the OLA framework is finicky and difficult to bootstrap, even though progress after initial setup should be smooth and predictable. To mitigate this risk we will test our setup before committing completely to OLA. Secondly, another major risk would be failing to maintain and reason about the many websockets through which Users connect to our DJ system, as maintaining this live User network is a big part of our use case. We are mitigating this risk by building these modules early. Finally, the last major concern is building persistent programs that thread well on the RPis without crashing: we want our DJ to have near-100% uptime, since we consider even a brief stop in the music a fatal error. We can mitigate this risk by researching robust microservice programming further.
As we just recently gave the Design Presentation, that presentation contained our most up-to-date system design, and no changes have been made to the system design since.