
Michael’s Status Report 5

This week I successfully implemented a working algorithm for peak detection. I first implemented the dispersion-based algorithm on its own, but it was too sensitive and would detect peaks even when I held the controller still. After tuning the parameters and adding some extra logic on top of the algorithm, I was able to detect peaks fairly consistently with realistic controller motion. I still have further tuning and testing to do in this regard.
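As a rough illustration of the idea (not our exact implementation), here is a minimal dispersion-based detector in Python: a sample is flagged as a peak when it deviates from the moving mean by more than a set number of standard deviations, and a short cooldown, one example of the kind of extra logic mentioned above, suppresses immediate re-triggers. All parameter values are placeholders, not our tuned ones.

```python
import numpy as np
from collections import deque

class DispersionPeakDetector:
    """Flags a peak when a sample deviates from the moving mean by more
    than `threshold` standard deviations. All defaults are placeholders,
    not our tuned values."""

    def __init__(self, lag=30, threshold=3.5, influence=0.2, cooldown=10):
        self.window = deque(maxlen=lag)  # recent (damped) samples
        self.threshold = threshold
        self.influence = influence       # how much peak samples affect the stats
        self.cooldown = cooldown         # samples to ignore after each peak
        self.remaining = 0               # cooldown counter

    def update(self, x):
        if len(self.window) < self.window.maxlen:
            self.window.append(x)        # still warming up
            return False
        mean, std = np.mean(self.window), np.std(self.window)
        is_peak = self.remaining == 0 and abs(x - mean) > self.threshold * std
        if is_peak:
            # Damp the peak sample so it doesn't inflate the moving statistics.
            x = self.influence * x + (1 - self.influence) * self.window[-1]
            self.remaining = self.cooldown
        elif self.remaining:
            self.remaining -= 1
        self.window.append(x)
        return is_peak
```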

I also started integrating the gesture-detection code with Jason's MIDI output work; that integration is still in progress.

Jason's Status Report 4

This past week, I kept working on the Tone.js audio module and developed an interface between Tone.js and Flask, allowing the team to develop simultaneously. Audio functionality now includes polyphony and configurable effects chains, giving the user control over both the sequence of effects and the parameters of each effect. The switch to online classes did not affect my roadmap for this project in any significant way.

Michael’s Status Report 4

In light of the recent events of COVID-19, we have spent this week setting up our project to work from three remote settings. The only setback from social distancing is that we no longer have access to the campus labs and facilities, which we usually used to work together in person. We had also planned to fabricate a custom grip for our smartphone by 3D-printing the parts, but the printing facilities are now closed. Instead, we are using a more makeshift approach: velcro straps to secure the smartphone to the hand. However, most of our project is software-based and we all have the parts required to get the platform working on our own setups, so no further refocusing was needed for the rest of the project.

During this week and the following one, I plan to explore the different peak detection algorithms found in the paper from my last post. So far, I have adapted the logging tool used for my data collection so that its output can be fed to the different peak detection algorithms on the Python side. Jeffrey will be implementing the dispersion-based peak detection algorithm concurrently alongside me.
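As a sketch of how the logged data drives the detectors, assuming a hypothetical one-value-per-line log format (our real format differs), replaying a capture looks roughly like this:

```python
# Hypothetical replay harness: feeds a logged capture, one accelerometer
# value per line, into any detector exposing update(sample) -> bool.
def replay(log_path, detector):
    peaks = []
    with open(log_path) as f:
        for i, line in enumerate(f):
            if detector.update(float(line)):
                peaks.append(i)  # record the sample index of each detected peak
    return peaks
```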

Jeffrey’s Status Report 3

This week, I initially focused on restructuring the code in our project. On the front end, every package had been installed manually by downloading the JavaScript files for that library and moving them into our project folder. We knew this wasn't scalable if we wanted to install more packages, so I set up NPM within the project so that packages are declared in a package.json file and installed with a single command-line instruction. This way we don't have to manage package versions by hand and can get the project set up on any new computer almost instantly. From this initial setup, I was able to install Socket.IO, as well as Tone.js, the package we decided to use for sending MIDI instructions.

I also experimented with buttons, sending specific inputs from the React Native phone application to the computer, and looked into how to give the user feedback when specific buttons are pressed. Currently, several packages offer a function that sends a vibrate command to the phone.

Besides this, I spent the rest of my time preparing for the presentation and making slides. In particular, making the block diagram gave us all a better understanding of how each component works and fits into the overall system.

Jeffrey’s Status Report 2

This week, I focused on the problem of not being able to establish a stable connection to the remote from my phone. I tried sending the notification-enabling commands in different orders and with different timings, but was unsuccessful. I also tried automatically reconnecting to the remote after a disconnect by attaching a listener to the disconnect event. This was not a viable option either, because the gap between disconnect and reconnect was about 400 ms, and the disconnect would happen anywhere from 10 to 15 seconds after connecting. Connecting to the remote from our phone no longer seemed like a workable solution.

Because of this, we went back to the idea of using the laptop for central processing and using Web Bluetooth, since that was the only approach that had worked. To use this data from a front-end application, we needed one central place where a single program could interpret both phone input from the user and sensor data from the Gear VR. We therefore restructured the architecture around a Flask server with a socket interface. The remote's data is still streamed to a front-end web application, but as soon as it arrives it is forwarded to Flask over Socket.IO. On the phone side, I installed Socket.IO in React Native and was able to use the phone as a client sending messages to the server. One caveat is that the phone and laptop must be connected to the same network.
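For reference, the relay server can be sketched with Flask-SocketIO roughly as follows. The event names are made up for illustration; ours differ.

```python
from flask import Flask
from flask_socketio import SocketIO, emit

app = Flask(__name__)
socketio = SocketIO(app)

# Sensor frames socketed over from the Web Bluetooth front end.
@socketio.on("controller_data")
def handle_controller(data):
    emit("controller_data", data, broadcast=True)  # fan out to listeners

# Button presses and other user input from the React Native phone client.
@socketio.on("phone_input")
def handle_phone(data):
    emit("phone_input", data, broadcast=True)

if __name__ == "__main__":
    # Bind to all interfaces so the phone can reach us over the local network.
    socketio.run(app, host="0.0.0.0", port=5000)
```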

Michael’s Status Report 3

This week I continued working on the sensor data visualization tool. I was able to extend it to the gyroscope and touchpad sensors. Drawing a circle on the touchpad (without pressing down on the buttons) gave this output:

Some further calibration would be needed if we wanted to fine-tune spatial detection on the touchpad, but for the purpose of detecting swipes it worked rather well. I began working on finding gestures in these datasets and decided to keep extending the tool as needed. When performing gestures, I could see distinct peaks and troughs in the acceleration values whenever I made a sharp movement indicative of a note change. The main issue was that the data arrives one point at a time, so we need a way of analyzing real-time time-series data.

I found a paper on this and will try implementing a similar sliding-window algorithm; a rough sketch follows the link below.

https://pdfs.semanticscholar.org/1d60/4572ec6ed77bd07fbb4e9fc32ab5271adedb.pdf
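The core sliding-window idea, independent of whichever detection criterion we end up using, looks roughly like this in Python. The window size and the toy peak criterion are placeholders for illustration.

```python
from collections import deque

window = deque(maxlen=64)  # placeholder window size; to be tuned

def on_sample(x):
    """Called once per incoming sensor value. The deque drops the oldest
    sample automatically, so analysis always sees a sliding window."""
    window.append(x)
    if len(window) == window.maxlen:
        center = len(window) // 2
        # Toy criterion: the center sample is the maximum of the current window.
        if window[center] == max(window):
            print("candidate peak at window center")
```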

Jeffrey’s Status Report 1

For this past week, I worked on understanding how we would connect to the Gear VR remote controller. We had several options to consider, each with specific tradeoffs: the controller could be connected to the laptop, with sensor data streamed there, or it could stream data directly to the phone, eliminating the middle man (the laptop). From our initial research, we found someone who was able to successfully connect the controller to a laptop. However, this was through the Web Bluetooth API, which at the time seemed restrictive since it would force our application onto the web. For this reason, Michael decided to tackle the task of connecting the VR remote to a Python program so that we would have access to the data from within Python.

I focused on a different approach: figuring out how to connect the remote to the phone directly, without a laptop involved. In this case, the remote would act as the peripheral and the phone, rather than the laptop, would act as the central. We considered this approach because a setup with two components would be a much more user-friendly experience than one with three.

I started by looking into ways to develop for the phone and found React Native to be the most compatible and easiest way to target both iPhone and Android. I first looked into platforms for React Native development and found Expo.io; however, after trying to install the Bluetooth packages, I realized they were not compatible with it, so I decided to build a React Native application from scratch instead. After setting up the project, I tried two different Bluetooth packages: react-native-ble-manager and react-native-ble-plx. With react-native-ble-plx, I was able to connect to the Gear VR but unable to enable notifications. With the other package, I was able to connect and receive sensor data but unable to maintain a stable connection.

Michael’s Status Report 2

This week, the team decided to pivot to the Web Bluetooth approach backed by a Flask server. Using this method, we were able to easily connect to the Gear VR controller, enable notifications, and write commands to characteristics. With this working, I started building a JavaScript tool for visualizing the Gear VR sensor values received in the notification buffer.

I found a JavaScript charting package named Chart.js, which can make different types of plots from JSON data. Since our project should only register motion and play notes while the trigger is held on the VR controller, I decided to key off the trigger-pressed state in the stream of real-time notification buffers. I started with the acceleration values: once the trigger was pressed, my code would begin logging the sensor values, and it would plot the x, y, and z values on a line chart once the trigger was released.

In this chart, the y-axis is in m/s^2 and the x-axis is in "ticks" of approximately 0.015 seconds each (the controller averaged 68.6 notifications per second). Orange is x, blue is y, and green is z.

Michael’s Status Report 1

This week I worked on connecting to the Gear VR controller via Python, using the laptop as the central device in the Bluetooth Low Energy (BLE) connection with the controller, the peripheral device.

I tested out several different Python packages against the following requirements:

- Scan for nearby BLE devices and view peripheral services and characteristics
- Connect to a device
- Read and write to characteristics
- Enable and receive notifications from a device

I ended up choosing a package named Bleak (https://bleak.readthedocs.io/en/latest/). With it, I was able to scan for the Gear VR and view the available services. The usual services found in most Bluetooth devices were listed, such as battery life and device info, but there was one service identified only by a raw hex UUID rather than a readable name. I assumed that was where the sensor-related characteristics were located.
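For reference, the general shape of the Bleak code looked roughly like this. The characteristic UUID is a placeholder (the real one lives under that unnamed service), and the name matching is illustrative.

```python
import asyncio
from bleak import BleakScanner, BleakClient

SENSOR_CHAR_UUID = "00000000-0000-0000-0000-000000000000"  # placeholder UUID

def on_notify(sender, data: bytearray):
    print(sender, data.hex())  # raw notification buffer from the controller

async def main():
    # Scan for nearby BLE peripherals and pick out the controller by name.
    devices = await BleakScanner.discover()
    gear = next(d for d in devices if d.name and "Gear VR" in d.name)
    async with BleakClient(gear.address) as client:
        await client.start_notify(SENSOR_CHAR_UUID, on_notify)
        await asyncio.sleep(20.0)  # listen while the connection lasts
        await client.stop_notify(SENSOR_CHAR_UUID)

asyncio.run(main())
```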

The next step was to connect to the controller. I was only able to hold a connection from my laptop for approximately 20 to 30 seconds before it was dropped by the controller. I also wrote code to enable notifications and write commands to characteristics during the window when the connection was stable. The final blow came when enabling notifications: I would get a BLE error code that turned out to be a system error from my operating system. After some research online, we found that the controller is specifically designed to connect to Samsung products through their application, so my Windows system's attempt to enable BLE notifications was being denied.

I tried another package, PyBluez, which failed with the same error. A direct Bluetooth connection from the laptop simply wasn't an option for this particular controller.

Jason’s Status Report 2

This week, I worked on playing sound through Python by trying to implement a wavetable. This proved to be a tedious and unreliable way to generate sound: smooth transitions between notes are not guaranteed, and it introduced noticeable latency compared to last week's GarageBand output. We decided to pivot away from generating our own audio, as it adds too much complexity to the project. We have started looking at alternatives to audio generation and hope to settle on a good one next week. By using an established audio library, we can free up a lot of development time for the features we think matter more to this project (user experience, polyphony, motion classification).
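For context on why this was tedious, here is a bare-bones sketch of naive wavetable synthesis in NumPy, purely illustrative rather than our actual code: even this omits the interpolation and click-free note stitching that made the real thing painful.

```python
import numpy as np

SAMPLE_RATE = 44100
TABLE_SIZE = 2048

# One cycle of a sine stored as the wavetable; real tables hold richer timbres.
table = np.sin(2 * np.pi * np.arange(TABLE_SIZE) / TABLE_SIZE)

def render(freq_hz, seconds):
    """Read through the table at a rate proportional to the desired pitch,
    using naive nearest-sample lookup (no interpolation, no envelopes)."""
    n = int(SAMPLE_RATE * seconds)
    phase = (np.arange(n) * freq_hz * TABLE_SIZE / SAMPLE_RATE) % TABLE_SIZE
    return table[phase.astype(int)]

samples = render(440.0, 0.5)  # half a second of A4, ready to hand to an audio API
```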