
Jeffrey’s Status Report 2

For this week, I focused on the problem of not being able to establish a stable connection between my phone and the remote. I tried sending the notification-enabling commands in different orders and with different timings, but was unsuccessful. I also tried to automatically reconnect to the remote after a disconnect by registering a disconnect listener. This was not a viable option either, because the gap between disconnect and reconnect was about 400 ms. The disconnect would happen anywhere from 10 to 15 seconds after connecting, so connecting to the remote directly from our phone no longer seemed like a workable solution.
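
For reference, this is roughly what the reconnect-listener attempt looked like with react-native-ble-plx (one of the BLE packages we tried on the phone); the error handling here is illustrative:

```javascript
// Hedged sketch of the auto-reconnect attempt with react-native-ble-plx.
// `device` is an already-connected Device; onDisconnected fires when the
// controller drops the link, and we immediately try to reconnect.
const subscription = device.onDisconnected((error, disconnectedDevice) => {
  disconnectedDevice
    .connect()
    .then((d) => d.discoverAllServicesAndCharacteristics())
    .catch((err) => console.warn('reconnect failed', err));
});
```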

Because of this, we had the idea of using the laptop for central processing again, with Web Bluetooth, since that was the only approach we had seen work. To use this data from a front-end application, we needed a central place where one program could interpret both phone input from the user and sensor data from the Gear VR. To solve this problem, we restructured the architecture around a Flask server with a Socket.IO layer to receive data. The remote's data is still streamed to a front-end web application, but as soon as it is received it is immediately relayed to Flask over Socket.IO. For the phone, I installed the Socket.IO client in React and was able to use the phone as a client that sends messages to the server. One constraint is that the phone and laptop must be connected to the same network.
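
As a rough sketch of the browser-side relay (the UUIDs and server address below are placeholders, not our actual values):

```javascript
// Browser-side relay sketch: receive GATT notifications from the controller and
// immediately forward the raw bytes to the Flask + Socket.IO server on the laptop.
const socket = io('http://localhost:5000'); // socket.io-client, loaded via <script> tag

const SENSOR_SERVICE_UUID = '0000aaaa-0000-1000-8000-00805f9b34fb'; // placeholder
const SENSOR_CHAR_UUID    = '0000bbbb-0000-1000-8000-00805f9b34fb'; // placeholder

async function streamController() {
  const device = await navigator.bluetooth.requestDevice({
    acceptAllDevices: true,
    optionalServices: [SENSOR_SERVICE_UUID],
  });
  const server = await device.gatt.connect();
  const service = await server.getPrimaryService(SENSOR_SERVICE_UUID);
  const characteristic = await service.getCharacteristic(SENSOR_CHAR_UUID);

  characteristic.addEventListener('characteristicvaluechanged', (event) => {
    // event.target.value is a DataView over the notification buffer;
    // parsing happens server-side, so we just relay the bytes.
    socket.emit('sensor', new Uint8Array(event.target.value.buffer));
  });
  await characteristic.startNotifications();
}
```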

Michael’s Status Report 3

This week I continued working on the sensor data visualization tool. I was able to extend it to the gyroscope and touchpad sensors. Drawing a circle on the touchpad (without pressing down on the buttons) gave this output:

[Figure: touchpad x/y trace while drawing a circle]

Some further calibration would be needed if we wanted to fine-tune spatial detection on the touchpad, but for the purposes of swipes it worked rather well. I then began working on detecting gestures from these datasets, continuing to extend the tool as needed. With gestures, I could see distinct peaks and troughs in the acceleration values whenever I made a sharp movement indicative of a note change. The main issue is that the data arrives one point at a time, so we need a way of analyzing real-time time-series data.

I found an article on doing this (linked below) and will try implementing a similar sliding-window algorithm; a rough sketch follows the link.

https://pdfs.semanticscholar.org/1d60/4572ec6ed77bd07fbb4e9fc32ab5271adedb.pdf
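
This is roughly what a sliding-window detector over our accelerometer stream might look like; the window size and threshold are placeholder values, not tuned, and the article's actual algorithm may differ:

```javascript
// Rough sketch of a sliding-window detector over streaming accelerometer samples.
const WINDOW_SIZE = 16; // ~0.25 s of samples at ~68 notifications/sec
const THRESHOLD = 25;   // m/s^2 above the window mean, placeholder

const samples = [];

function onAccelSample(magnitude) {
  samples.push(magnitude);
  if (samples.length > WINDOW_SIZE) samples.shift();
  if (samples.length < WINDOW_SIZE) return;

  // Flag a gesture when the newest sample stands out from the window's mean,
  // i.e. the sharp peak we see on a note-change movement.
  const mean = samples.reduce((sum, v) => sum + v, 0) / samples.length;
  if (samples[samples.length - 1] - mean > THRESHOLD) {
    console.log('possible note-change gesture');
    samples.length = 0; // reset so a single spike only fires once
  }
}
```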

Jeffrey’s Status Report 1

This past week, I worked on understanding how we would connect to the Gear VR remote controller. We had several options, each with specific tradeoffs: the controller could connect to the laptop and stream its sensor data there, or it could stream data directly to the phone and eliminate the middleman (the laptop). From our initial research, we found someone who was able to successfully connect the controller to a laptop. However, this was through the Web Bluetooth API, which at the time seemed restrictive, since we would have to build our application on the web. For this reason, Michael took on the task of connecting the VR remote to a Python program so that we would have access to the data from within Python.

I focused on a different approach: connecting the remote to the phone directly, without a laptop involved. In this case, the remote would act as the BLE peripheral and the phone, instead of the laptop, would act as the central. We considered this approach because a setup with only two components would be a much more user-friendly experience than one with three.

I started by looking into ways to develop for the phone and found React Native to be the most compatible and straightforward way to target both iPhone and Android. Looking into platforms for React Native development, I found Expo.io; however, after trying to install the Bluetooth packages, I realized they were not compatible with it, so I decided to build a React Native application from scratch instead. After setting up the project, I tried two different Bluetooth packages: react-native-ble-manager and react-native-ble-plx. With react-native-ble-plx, I was able to successfully connect to the Gear VR but was unable to enable notifications. I then tried react-native-ble-manager and was able to connect and receive sensor data, but could not establish a stable connection.
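
For reference, a hedged sketch of the react-native-ble-plx flow described above; the advertised device name and the UUIDs are placeholders rather than the controller's confirmed values:

```javascript
import { BleManager } from 'react-native-ble-plx';

const SERVICE_UUID = '0000aaaa-0000-1000-8000-00805f9b34fb'; // placeholder
const CHAR_UUID    = '0000bbbb-0000-1000-8000-00805f9b34fb'; // placeholder

const manager = new BleManager();

// Scan for the controller, connect, discover services, then subscribe to
// the sensor characteristic's notifications.
manager.startDeviceScan(null, null, (error, device) => {
  if (error || !device || device.name !== 'Gear VR Controller') return;
  manager.stopDeviceScan();
  device
    .connect()
    .then((d) => d.discoverAllServicesAndCharacteristics())
    .then((d) =>
      d.monitorCharacteristicForService(SERVICE_UUID, CHAR_UUID, (err, characteristic) => {
        if (err) return console.warn(err);
        console.log('notification:', characteristic.value); // base64-encoded bytes
      })
    )
    .catch((err) => console.warn('connect failed', err));
});
```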

Michael’s Status Report 2

This week, the team decided to pivot to the Web Bluetooth approach backed by a Flask server. Using this method, we were able to easily connect to the GearVR controller, enable notifications, and write commands to characteristics. With that working, I started on a JavaScript tool for visualizing the GearVR sensor values received in the notification buffer.

I found a JavaScript chart-creation package named Chart.js, which can make different types of plots from JSON data. Since our project should only register motion and play notes while the trigger is held on the VR controller, I decided to key off the trigger-pressed notification within the stream of real-time notification buffers. I started by parsing and plotting the acceleration values: once the trigger was pressed, my code would start logging the sensor values, and once the trigger was released it would plot the x, y, and z values on a line chart.

In this chart, the y-axis is in m/s^2 and the x-axis is in "ticks" of approximately 0.015 seconds each (the controller averaged 68.6 notifications per second). Orange is x, blue is y, and green is z.
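
The plotting step itself is small. Here is a sketch of how the line chart can be built with Chart.js, assuming the buffered readings are objects with x, y, and z fields (the canvas id is a placeholder):

```javascript
// Plot the accelerometer values buffered while the trigger was held.
function plotAccel(samples) {
  new Chart(document.getElementById('accelChart'), {
    type: 'line',
    data: {
      labels: samples.map((_, i) => i), // "ticks", ~0.015 s apart
      datasets: [
        { label: 'x', data: samples.map((s) => s.x), borderColor: 'orange', fill: false },
        { label: 'y', data: samples.map((s) => s.y), borderColor: 'blue', fill: false },
        { label: 'z', data: samples.map((s) => s.z), borderColor: 'green', fill: false },
      ],
    },
  });
}
```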

Michael’s Status Report 1

This week I worked on connecting to the GearVR controller via Python. This would make the laptop the central device in the Bluetooth Low Energy (BLE) connection, with the controller as the peripheral.

I tested out several different Python packages against these requirements:

- Scan for nearby BLE devices and view peripheral services and characteristics
- Connect to a device
- Read from and write to characteristics
- Enable and receive notifications from a device

I ended up choosing a package named Bleak (https://bleak.readthedocs.io/en/latest/). With it, I was able to scan for the GearVR and view the available services. The usual services found in most Bluetooth devices were listed, such as battery life and device info, but one service was identified only by a raw hex UUID with no human-readable name. I assumed that was where the sensor-related characteristics were located.

The next step was to connect to the controller. I was only able to hold a connection from my laptop for approximately 20-30 seconds before the controller dropped it. During the window when the connection was stable, I also wrote code to enable notifications and write commands to characteristics. The final blow came when enabling notifications: I would get a BLE error code that turned out to be a system error from my operating system. After some research online, we found that the controller is specifically designed to connect to Samsung products through their application, so my Windows system's attempts to enable BLE notifications were being denied.

I tried another package, PyBluez, which failed with the same error. A direct Bluetooth connection from a Python program on the laptop wasn't an option for this particular controller.

Jason’s Status Report 2

This week, I worked on playing sound through Python and implementing a wavetable. This proved to be a tedious and unreliable way of generating sound: smooth transitions between notes are not guaranteed, and it introduced noticeable latency compared to last week's GarageBand output. We decided to pivot away from generating our own audio, as it introduces too much complexity into our project. We started looking at alternatives for audio generation and hope to settle on one next week. By using an established audio library, we can free up a lot of time to develop the features we think matter more to this project (user experience, polyphony, motion classification).
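
For context on what the wavetable idea involves, here is an illustrative single-cycle-waveform sketch, written in JavaScript via the Web Audio API rather than the Python we actually used; the harmonic amplitudes are arbitrary:

```javascript
// A PeriodicWave is built from harmonic amplitudes and drives an oscillator;
// getting smooth note-to-note transitions out of this (ramping frequency and
// gain without clicks) is exactly the tedious part.
const ctx = new AudioContext();
const real = new Float32Array([0, 1, 0.5, 0.25]); // cosine terms: fundamental + two harmonics
const imag = new Float32Array(real.length);       // sine terms, all zero here
const wave = ctx.createPeriodicWave(real, imag);

const osc = ctx.createOscillator();
osc.setPeriodicWave(wave);
osc.frequency.value = 261.63; // middle C
osc.connect(ctx.destination);
osc.start();
```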

Jason’s Status Report 1

This week, I spent time understanding MIDI: its message types, their generation, and their parsing. I wrote some sample code in Python using the mido library to generate MIDI messages, mapping some of my keyboard keys to different notes. I then piped the MIDI output into GarageBand to observe the latency and smoothness of the generated notes, and was pleasantly surprised to find the latency basically unnoticeable. Next week, I will work on implementing our own audio generator, possibly with a wavetable, and try to connect it with the MIDI generation.
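
As an illustration of the messages involved (my actual code used Python's mido, not JavaScript), a note-on/note-off pair looks like this via the browser's Web MIDI API:

```javascript
// Send middle C to the first available MIDI output, which GarageBand can
// pick up as a virtual instrument input.
navigator.requestMIDIAccess().then((midi) => {
  const output = [...midi.outputs.values()][0];
  output.send([0x90, 60, 0x7f]);                        // note on: middle C, full velocity
  setTimeout(() => output.send([0x80, 60, 0x40]), 500); // note off after 500 ms
});
```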

Jason’s Status Report 3

This week, I spent most of my time generating live audio using tone.js and getting familiar with the package's functionality. It has all the features we need, including pitch bends and effects chains. I also spent a lot of time cleaning up the project's codebase and dependencies, which had become cluttered during our frantic past weeks of prototyping several different features in different languages. After the cleanup, we created a new GitHub repo to isolate the environments and packages we decided to stick with.
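
A minimal sketch of the tone.js features we care about; the note, duration, and bend values are placeholders:

```javascript
import * as Tone from 'tone';

// A synth routed to the speakers, a note trigger, and a pitch bend via detune.
const synth = new Tone.Synth().toDestination();

async function playNote() {
  await Tone.start();                     // audio context must be unlocked by a user gesture
  synth.triggerAttackRelease('C4', '8n'); // play middle C for an eighth note
  synth.detune.rampTo(100, 0.1);          // bend up one semitone (100 cents) over 0.1 s
}
```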

Team Status Report 3

This week, we began writing code that outputs sound in real time, using MIDI signals and the tone.js library for audio generation. In parallel, we began work on motion classification with the GearVR controller.

Before we started work on motion classification, we extended the functionality of the GearVR analysis tool we built last week to include data from all of the sensors (gyroscope, touchpad, and touchpad clicks). This helped us design some classification methods and calibrate our sensors.

Team Status Report 2

This week, we successfully reverse-engineered the Bluetooth protocol for the GearVR controller. We can now read all sensor values from the controller (gyroscope, accelerometer, buttons, and touchpad). Before implementing motion detection, we decided to graph the sensor values for each type of motion we want to classify. This gave us a better picture of what the data looks like and will help us classify motions.

After establishing a stable connection with the controller, we were also able to generate real-time MIDI messages, controlled by the trigger button on the GearVR controller, and output the sound through GarageBand. We have not measured the latency yet, but it was hardly noticeable. All in all, we got past a major roadblock this week (figuring out how to use the GearVR controller) and are only slightly behind schedule.

[Figure: GearVR sensor output]