William’s Status Report for 4/30

This week, I was able to take concrete steps toward integrating with Henry’s game engine. First, I resolved some technical difficulties by finding a Python library for the Madgwick algorithm to replace the C++ one, so the entire program can run in a single Python process instead of having to run the Bluetooth Python script and the C++ sensor fusion process in parallel. I tested the library and tuned its beta constant so that changes to the IMU orientation are reflected quickly enough. I also researched and found code that removes the acceleration due to gravity from our IMU acceleration measurements, which keeps the acceleration data centered around zero no matter what orientation the paddle is in.
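The gravity-removal step amounts to rotating the gravity vector into the sensor frame using the orientation quaternion from the Madgwick filter, then subtracting it from the raw reading. A minimal sketch of that idea (the function name, the (w, x, y, z) ordering, and units of g are my assumptions, not the exact code I found):

```python
import numpy as np

def remove_gravity(accel, q):
    """Subtract gravity from an accelerometer reading (in g), given
    the sensor-to-Earth orientation quaternion q = (w, x, y, z)
    produced by the Madgwick filter."""
    w, x, y, z = q
    # Gravity expressed in the sensor frame: the third row of the
    # rotation matrix built from q, applied to the Earth "up" vector.
    g_sensor = np.array([
        2.0 * (x * z - w * y),
        2.0 * (w * x + y * z),
        w * w - x * x - y * y + z * z,
    ])
    return np.asarray(accel, dtype=float) - g_sensor
```

With the identity quaternion, a paddle at rest reads roughly (0, 0, 1) g, and the corrected value comes out near zero; the same holds with the paddle held upside down, which is the behavior we want.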

Armed with those improvements, I integrated my Bluetooth functionality into the IMU program to connect with Henry’s game engine. We worked on tuning the thresholds for the paddle swing, including the thresholds for orientation and acceleration, as well as the length of the window over which we sample swing data. After observing that data, we have found thresholds that work.

Next week, I will continue performing further tests on the integration, as well as work on the final report and the demo video.

William’s Status Report for 4/23

This week, I continued working through my end of the integration to-do list. I tried a few different design concepts for the IMU side of the communications, which involves gathering data over a specified time frame and then using that data to determine a swing type that is sent to the game engine via Bluetooth.

One concept I tried was to do the data collection and analysis entirely in the same C++ program that reads the IMU data. For now, that idea is on the back burner, as I could not find a reliable C++ timer mechanism to track time for us. Instead, I moved the data collection into the same Python script that handles the Bluetooth connection. The RPi now runs two concurrent processes: one reads the IMU data and runs the Madgwick algorithm, while the other takes that data as input through a Unix pipe and, when the time is right, analyzes a window of it. One slight issue is that the Python timer occasionally jumps by quarter-second intervals, which can throw off data collection, so I adjusted the implementation to rely on the timer as little as possible, preventing that error from propagating through the later stages of the data analysis.

For the data analysis, during the swing window (whose length can vary), I take the average orientation of the paddle to establish what type of swing (lob, smash, topspin, slice) is being executed. To determine the speed of the paddle, and whether the paddle is swung at all, I use the acceleration data. Through empirical testing, I found that the acceleration along the IMU’s z-axis has an upward peak followed by a downward peak during a forehand, and a downward peak followed by an upward peak during a backhand. The goal is therefore to identify whether such peaks occur during the swing window, which tells us whether a swing happened in time, in what direction, and with what power. To help eliminate noise, I used a SciPy Butterworth filter to smooth the data; to determine the peak, we first track what the acceleration reads before the window, then integrate the values during the window to see which direction the peak goes. This will have to be tuned going forward by testing it with the game engine in real time.
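Putting the filtering and peak-direction steps together might look like the following sketch. The filter order, cutoff, and no-swing threshold are placeholder values (exactly the numbers that still need real-time tuning), and the sign convention assumes a forehand’s upward peak comes first:

```python
import numpy as np
from scipy.signal import butter, filtfilt

def swing_direction(accel_z, fs=100.0, cutoff=10.0, min_peak=0.05):
    """Classify a window of z-axis acceleration: +1 for a forehand
    (upward peak first), -1 for a backhand, 0 for no swing.
    All constants here are illustrative and need tuning."""
    # Low-pass Butterworth filter, applied forward and backward so
    # the smoothing adds no phase lag.
    b, a = butter(2, cutoff / (fs / 2.0), btype="low")
    smooth = filtfilt(b, a, accel_z)
    baseline = smooth[0]                      # reading entering the window
    vel = np.cumsum(smooth - baseline) / fs   # integrate over the window
    peak = vel[np.argmax(np.abs(vel))]        # largest excursion, signed
    if abs(peak) < min_peak:
        return 0
    return 1 if peak > 0 else -1
```

An up-then-down acceleration pulse integrates to a positive excursion (forehand), the mirrored pulse to a negative one (backhand), and a flat window stays under the no-swing threshold.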

I also used CAD to design and subsequently 3D-print a box to hold our IMU system, which will be attached to a ping-pong paddle for gameplay.

William’s Status Report for 4/16

This week, I spent a lot of time trying to devise a way to get velocity data relative to the Earth’s axes from the IMU. I tried using the scikit-kinematics (skinematics) library, which uses analytical methods to find orientation and position. After a lot of testing, I realized that the Madgwick algorithm I had been using is markedly more accurate than the skinematics analytical method for finding orientations. I also tried combining my original Madgwick algorithm with the skinematics position-finding routine. However, after testing that method, I concluded that the resulting data was not accurate enough for the purposes of our game.

With that, I met with Henry to discuss a pivot in our plan so that the game would not rely heavily on position/velocity data from the IMU. We redefined the game around a finite set of swing types and ball flights that are easier to define as a function of just the IMU orientation data and raw acceleration data relative to the IMU’s axes. For example, we can determine whether a swing is a forehand or a backhand by checking whether the integral of acceleration over time along the IMU’s z-axis is positive or negative, without needing to know the acceleration relative to Earth’s axes. Likewise, we can determine whether the shot is a lob or a smash by checking the roll of the paddle from the IMU orientation, and whether there is spin by checking the acceleration along the IMU’s x and y axes. Based on the type of swing, we can then determine the type of ball flight.
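The mapping from those features to a finite set of swing types reduces to a small threshold table. A sketch of what that could look like; every threshold value, the feature names, and the "flat" fallback are illustrative placeholders, not our fitted values:

```python
def classify_swing(roll_deg, az_integral, xy_spin):
    """Map orientation and raw-acceleration features to a swing type.
    roll_deg: average paddle roll over the swing window (degrees).
    az_integral: integral of z-axis acceleration (sign -> direction).
    xy_spin: a spin feature from x/y-axis acceleration.
    All cutoffs are placeholders to be fit to real swing data."""
    direction = "forehand" if az_integral >= 0 else "backhand"
    if roll_deg > 30.0:          # paddle face angled up
        shot = "lob"
    elif roll_deg < -30.0:       # paddle face angled down
        shot = "smash"
    elif abs(xy_spin) > 0.5:     # sideways acceleration -> spin shot
        shot = "topspin" if xy_spin > 0 else "slice"
    else:
        shot = "flat"            # illustrative default
    return direction, shot
```

The game engine would then pick a ball flight from the returned (direction, shot) pair.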

With this infrastructure defined quite rigorously, this week and next week I will collect a large sample of different swing types, gather the data, and define thresholds that determine the swing type and ball flight from the paddle sensor data. The goal is to have well-defined thresholds that keep gameplay relatively simple for the user. In addition, I will try to 3D-print a paddle with the sensors embedded in it, and continue integrating my sensor unit with the game engine.

William’s Status Report for 4/9

This week, I was finally able to get the Bluetooth module up and running on my Raspberry Pi. The Pi can now listen for connections on startup, and once a device connects, the Pi sends IMU data to that device. I tested this with a Bluetooth terminal on my phone, and the connection has been reliable. When Henry and I tested with the Unity module, however, it has not connected consistently, so we still need to troubleshoot why that is. Moving forward, I will also need to test the real-time latency of sending data between devices.
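Since both the phone terminal and the Unity module read the same byte stream, one thing that helps troubleshooting is pinning down the wire format on both ends. A sketch of a simple newline-delimited text format (the field order, precision, and encoding are my assumptions, not our actual protocol):

```python
def encode_sample(roll, pitch, yaw, ax, ay, az):
    """Pack one IMU sample into the newline-terminated line the Pi
    streams over the Bluetooth link (hypothetical format)."""
    return (f"{roll:.2f},{pitch:.2f},{yaw:.2f},"
            f"{ax:.3f},{ay:.3f},{az:.3f}\n").encode("ascii")

def decode_sample(line):
    """Parse a received line back into the six float fields, as the
    Unity side (or a test harness) would."""
    return tuple(float(v) for v in line.decode("ascii").strip().split(","))
```

Round-tripping samples through these two functions is an easy sanity check before involving the radio at all.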

I have also begun looking into ways to extract velocity, and perhaps even position, data from the IMU on top of the current orientation data. I am researching how to use quaternions and rotation matrices to relate the IMU data to the global axes rather than just the IMU’s own axes. This is something I will have to research further, then implement and test.


William’s Status Report for 4/2

This week, I began integrating Bluetooth on the Raspberry Pi so it can send data to the VR device and server for the MVP. I’ve found some RPi resources, such as BlueZ and bluetoothctl, that can be used, but I’m still facing some technical difficulties connecting to devices, as the connection sometimes drops after pairing. This is something I am still looking into. I also researched the mechanism for sending data, and my plan is to write to a file that will be read by the Bluetooth module. I plan to test the data transfer using a Bluetooth terminal app on my phone once the Bluetooth is working.

My goal is to get the Bluetooth up and running as soon as possible next week so I can integrate it with the other subsystems and begin seeing how the IMU data translates to the graphics.

William’s Status Report for 3/26

This week, I was able to resolve last week’s issues with the strange IMU readings and the resulting roll, pitch, and yaw. One issue was that my driver initialized a register to the wrong value, which meant my readings were coming in at the wrong magnitude. I also realized that the inaccuracies in the roll, pitch, and yaw were caused by the raw sensor readings drifting at the start, so I added a calibration function at startup to calibrate all the readings, which has helped a lot with sensor accuracy. The roll, pitch, and yaw are now fairly accurate: when I tested by manually moving the IMU around and plotting the sensor output, the orientation is generally within 10 degrees of the expected values.
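The startup calibration boils down to averaging a batch of readings while the IMU sits still and treating the averages as bias offsets to subtract later. A sketch under stated assumptions (the `read_sample` interface is hypothetical, acceleration is in g, and the z-axis points up at rest):

```python
import numpy as np

def calibrate(read_sample, n=500):
    """Estimate sensor biases by averaging n readings while the IMU
    sits still. read_sample() -> (accel_xyz, gyro_xyz). Since a
    stationary IMU should read (0, 0, 1) g and zero angular rate,
    the averages minus those expected values are the biases to
    subtract from all later readings."""
    accs, gyrs = [], []
    for _ in range(n):
        a, g = read_sample()
        accs.append(a)
        gyrs.append(g)
    accel_bias = np.mean(accs, axis=0) - np.array([0.0, 0.0, 1.0])
    gyro_bias = np.mean(gyrs, axis=0)
    return accel_bias, gyro_bias
```

Removing the gyro bias in particular is what keeps the integrated roll/pitch/yaw from drifting the way the uncalibrated readings did.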

I also tried to add functionality that leverages accelerometer data in conjunction with the roll, pitch, and yaw to provide position data that can cross-validate our CV data. To do so, I would have to integrate the acceleration to get velocity, then integrate again to get displacement, and find a way to translate the readings from the IMU’s axes into Earth’s absolute frame. This is still a work in progress, but so far I have found a way to filter out the noise in our sensor readings using a low-pass Butterworth filter. I now have to figure out how to leverage the smoother, filtered data to generate some sort of positional data.
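The double-integration step itself is short; the hard part is that any residual bias grows linearly in velocity and quadratically in displacement, which is exactly why the output needs cross-validation. A naive sketch (assuming the acceleration is already in Earth axes with gravity removed, and a hypothetical sample rate):

```python
import numpy as np

def integrate_motion(accel, fs=100.0):
    """Naively double-integrate a 1-D acceleration series (m/s^2)
    sampled at fs Hz: cumulative sums approximate the integrals for
    velocity (m/s) and displacement (m). Drift accumulates fast, so
    this is only usable over short windows or with an external check."""
    dt = 1.0 / fs
    vel = np.cumsum(accel) * dt
    pos = np.cumsum(vel) * dt
    return vel, pos
```

As a sanity check, one second of constant 1 m/s² acceleration should produce roughly 1 m/s of velocity and 0.5 m of displacement.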

Moving forward, in addition to getting position data, once our CV subsystem is functional I can begin integrating the subsystems together with my team.

William’s Status Report for 3/19

This week, I got the IMU up and running with the Raspberry Pi. This included physically soldering the connections between the IMU and the Raspberry Pi, as well as writing a device driver for the IMU on the Pi. The driver allows me to get readings from the sensor. I tested the driver by reading the accelerometer and gyroscope while moving the IMU around and displaying/plotting the output to check its accuracy. Furthermore, I tried out various open-source implementations of the Madgwick algorithm, which converts accelerometer and gyroscope data into roll, pitch, and yaw, and tested those with plotting as well. There are still some odd behaviors I have to figure out how to fix, such as the roll/pitch/yaw values changing despite no movement.
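The core of the driver is combining the high/low register bytes the sensor returns into signed 16-bit values and scaling them. A sketch for the accelerometer path, using the register layout of the MPU6050 we selected (0x3B is ACCEL_XOUT_H and 16384 LSB/g is the ±2 g default per the datasheet; the function names are mine):

```python
MPU6050_ADDR = 0x68      # default I2C address of the MPU6050
ACCEL_XOUT_H = 0x3B      # first of six accelerometer data registers
ACCEL_SCALE = 16384.0    # LSB per g at the +/-2 g default range

def to_int16(hi, lo):
    """Combine high/low register bytes into a signed 16-bit value
    (two's complement)."""
    val = (hi << 8) | lo
    return val - 65536 if val >= 32768 else val

def decode_accel(raw6):
    """Convert the six ACCEL_*OUT bytes into (ax, ay, az) in g."""
    return tuple(to_int16(raw6[i], raw6[i + 1]) / ACCEL_SCALE
                 for i in (0, 2, 4))
```

On the Pi, the `raw6` buffer would come from an I2C block read of six bytes starting at `ACCEL_XOUT_H` (e.g. via the smbus2 library); the gyroscope registers decode the same way with a different scale factor.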

In addition, I finalized my strategy for powering the Raspberry Pi wirelessly, settling on a power bank. The other products I looked at would be assembled on top of the Pi, which would make the construction too tall and thus unfeasible for a ping pong paddle. A power bank can be laid out flat next to the Pi, keeping the paddle at a reasonable thickness.

For the next week, I can work on making the IMU data more accurate, as well as beginning to integrate the IMU data with other components to get an idea of how our system will look as a whole. This will involve using Bluetooth to send data to the VR headset, as well as seeing how the network transmissions may look.

William’s Status Report for 2/26

This week, I spent quite a few hours getting my Raspberry Pi up and running. I had a lot of technical difficulties figuring out how to SSH into the device without an Ethernet cable, but I was eventually able to SSH over the Pi’s Wi-Fi. Because those difficulties took so long, I haven’t done much else with the Pi yet. I am therefore behind last week’s goal, but I should have more time next week and over spring break to get some good work done.

William’s Status Report for 2/19

For this week, I helped our team further the design process. I worked on design diagrams depicting how our subsystems work individually and how they interface with each other. I also researched ways to power our paddle wirelessly. After looking into manually building voltage converters, I came across products specifically built to convert various input voltages to 5 volts, so eventually we can try to acquire some of those. Our sensors arrived this week, but we were missing SD cards, so I had to wait for that order to come through; I picked up the SD card toward the end of the week. With that, I should be able to start experimenting with the sensors and the Raspberry Pi next week.

I didn’t quite meet last week’s goal because our materials hadn’t fully arrived. However, I think we are still on track as a team, as last week’s goal may have been a little ambitious given that we still had design issues to hash out. Now that our materials have arrived, I can begin tinkering with them so that our team keeps pace.

William’s Status Report for 2/12

This week, I completed my research into the technologies our team could use to track our paddle with inertial measurement units. My conclusion was that we should use the 6-degree-of-freedom MPU6050 IMU connected to a Raspberry Pi Zero W over I2C. I also researched how to bring up this sensing system and found several useful resources on interfacing the sensor with the Raspberry Pi, including setting up the I2C connection, writing a driver for the IMU, and writing and running our own software on the Pi to process the sensor data. I also found existing open-source libraries implementing the Madgwick algorithm, a sensor fusion algorithm that will help us interpret the sensor data relative to Earth’s axes. Lastly, I researched potential game servers and found that we could leverage Amazon Web Services’ GameLift, which provides a low-latency game server SDK for Unity that we can integrate into our game. I discussed with the team how we can optimize communications between our sensing system, the VR device, and the server to minimize latency.

As of now, I think I am right on schedule: we have ordered our parts, and I am well prepared to bring up the system. Moving into next week, my goal is to begin integrating the IMU with the Raspberry Pi. I hope to physically connect the IMU to the Raspberry Pi and then write a driver on the RPi so that we can read raw data from the IMU; from there, we can begin figuring out how to process that data. This is contingent on our IMU sensor arriving, so hopefully it will be here by next week. If it doesn’t arrive, I should still be able to get my hands on the Raspberry Pi, since we have requested it from CMU inventory, and at the very least I can set up an I2C driver on that.