Team Status Report for 03/26

As of the end of this week, the individual components of the MVP are complete, and next week we are going to finish integrating them together.  The IMU isn’t part of the MVP, but there has been considerable progress in making its data output more consistent and accurate.  The game itself has been completed too.  New this week are the multiplayer and Bluetooth capabilities that allow two players to play each other and connect their paddles to the game.  The computer vision algorithm for tracking the lateral position of the paddle is nearly complete as well.  Looking ahead, the plan for next week is to send the lateral position of the paddle to the Google Cardboards running the game, so that two players can control their respective paddles in a game of Pong.  We expect a few calibration adjustments will be needed to center the paddle, and we are going to test the latency of the paddle movement as well.  For this week, no changes were made to the product design.
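
Sending the lateral position to the headsets will need some agreed-on wire format.  As a hypothetical sketch (the field layout and units below are assumptions, not our finalized protocol):

```python
import struct

# Hypothetical wire format for one paddle update -- this layout is an
# assumption for illustration, not our finalized protocol:
#   player_id (uint8), lateral position in metres (float32), timestamp ms (uint32)
PADDLE_FMT = "<BfI"

def pack_paddle_update(player_id: int, lateral_m: float, t_ms: int) -> bytes:
    """Serialize one paddle update for transmission to the headset."""
    return struct.pack(PADDLE_FMT, player_id, lateral_m, t_ms)

def unpack_paddle_update(payload: bytes):
    """Deserialize a paddle update on the receiving (game) side."""
    return struct.unpack(PADDLE_FMT, payload)
```

A fixed binary layout like this keeps each update to 9 bytes, which should help with the latency testing planned for next week.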

The biggest risk that we face is buying a wifi card that isn’t compatible with either the libraries we are using or the Jetson Nano itself.  While good product research mitigates this risk, our contingency plan is to buy multiple wifi cards at a time so we have backups in case one fails.

Logan’s Status Report for 3/26

This week I continued to work on the CV with our new strategy of color recognition, as opposed to the previous strategy of contours. Around the middle of the week I ran into some issues, but I reached out to Professor Savvides for help, and he gave me some tips about hue/HSV and pointed me to some resources that continue to be helpful.

Despite the unforeseen challenges, this portion of the MVP should be done by Monday so we can proceed with integration next week.

William’s Status Report for 3/26

This week, I was able to resolve the issues that I had last week regarding weird behaviors with the IMU readings and the resulting roll, pitch, and yaw. I realized one issue was that in my driver, I had initialized a register to the wrong value, which meant my readings were coming in at the wrong magnitude. In addition, I realized that the inaccuracies in the roll, pitch, and yaw were a result of the raw sensor readings drifting at the start, so I added a calibration function that runs at startup to calibrate all the readings, which has helped a lot with the sensor accuracy. The roll, pitch, and yaw are now fairly accurate: when I tested the sensor by manually moving the IMU around and plotting its output, the orientation was generally within 10 degrees of the expected values.
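
The usual way a startup calibration like this works is to average readings taken while the IMU is held still and subtract that bias from everything afterward.  This is a minimal sketch of the idea, not the actual driver code:

```python
import numpy as np

def estimate_gyro_bias(samples: np.ndarray) -> np.ndarray:
    """
    Estimate per-axis gyroscope bias from an (N, 3) array of readings
    collected while the IMU is held stationary at startup.
    """
    return samples.mean(axis=0)

def apply_calibration(reading: np.ndarray, bias: np.ndarray) -> np.ndarray:
    """Remove the startup bias from a raw gyro reading."""
    return reading - bias
```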

I also tried to add some functionality to leverage accelerometer data, in conjunction with the roll, pitch, and yaw, to provide position data that can cross-validate our CV data. To do so, I would have to integrate the acceleration to get velocity, integrate again to get displacement, and find a way to translate the readings from the IMU’s local axes into an Earth-fixed frame. This is still a work in progress, but so far, I have done some research and found a way to filter out the noise in our sensor readings using a low-pass Butterworth filter. I now have to figure out how to leverage the smoother filtered data to generate some sort of positional data.
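
The filter-then-double-integrate pipeline can be sketched with SciPy along a single axis (the 5 Hz cutoff is a placeholder, and this deliberately leaves out the still-unsolved frame-rotation step):

```python
import numpy as np
from scipy.signal import butter, filtfilt

def smooth_and_integrate(accel: np.ndarray, fs: float, cutoff: float = 5.0):
    """
    Low-pass the accelerometer trace with a Butterworth filter, then
    integrate twice (rectangle rule) to get velocity and displacement.
    `accel` is a 1-D array of acceleration along one axis (m/s^2),
    `fs` the sample rate in Hz; the cutoff frequency is a placeholder.
    """
    b, a = butter(N=2, Wn=cutoff / (fs / 2), btype="low")
    smoothed = filtfilt(b, a, accel)   # zero-phase filtering: no time lag
    dt = 1.0 / fs
    velocity = np.cumsum(smoothed) * dt        # first integral
    displacement = np.cumsum(velocity) * dt    # second integral
    return smoothed, velocity, displacement
```

One caution this sketch makes visible: any residual bias surviving the filter grows quadratically through the double integration, which is why the CV data is still needed as the primary position source.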

Moving forward, in addition to getting position data, when our CV subsystem is functional, I can begin integrating the subsystems together with my team.

Henry’s Status Report for 03/26

This week I was able to reach my goals of implementing the multiplayer capabilities and adding Bluetooth communication code to give the VR headset the ability to interact with the paddle’s rPi and the camera’s Jetson.  The only thing that I am waiting on right now is the wifi card for the Jetson, which I am going to order this weekend.  The wifi card is necessary for the Bluetooth connection between the Jetson and the Google Cardboard.

The multiplayer capabilities took a lot longer to implement than I expected.  Initially, I used an open-source library called Riptide Networking, but it was a lot more low-level than I wanted, which meant more unnecessary work.  For example, I would have to explicitly code what information I wanted to send from the server to a client rather than the library taking care of that for me.  After spending a lot of time researching and starting the code with Riptide, I decided to switch to a multiplayer library called Mirror, which does a lot more work for me.  Mirror handles most of the client-server communication, so instead of me explicitly telling the server what to send to the client, Mirror automatically sends the server’s game state to the client by itself.  The biggest thing I had to do with Mirror was program the spawn parameters for when players enter a game.  This includes limiting the game to only two players, making sure the two players spawn on different ends of the table, and starting the game when two players have connected.  Another nice thing about Mirror is that the server and client run the same game; all I need to do is assign one computer as the server and the other two computers as clients.  This saved me a lot of time because with Riptide, I would have had to create a client version of the game that reads information from the server and reflects it in the graphics.
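
Mirror itself is a C# Unity library, but the server-authoritative pattern it automates can be illustrated language-neutrally: the server owns the game state and ships snapshots; clients hold no game logic and just deserialize and render.  A toy Python sketch of that idea (not Mirror's API):

```python
import json

# Toy illustration of server-authoritative state sync (not Mirror itself):
# the server serializes its authoritative state each tick, and every client
# simply rebuilds and renders that snapshot.
def snapshot(state: dict) -> str:
    """Serialize the authoritative game state on the server."""
    return json.dumps(state)

def apply_snapshot(payload: str) -> dict:
    """Rebuild the state on a client from the server's snapshot."""
    return json.loads(payload)
```

Because clients never compute state themselves, there is nothing to keep synchronized between them, which is exactly the headache this design removes.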

Going off our Gantt chart, my progress is on time.  I have about a week to get Logan’s CV information incorporated into my game, which shouldn’t take much time at all.  The biggest risk right now is how quickly the wifi card can arrive and how quickly we can get it working with the Jetson.  Any free time I have can be spent incorporating better graphics and finalizing the UI of the game.

Logan’s Status Report For 3/19

This week I continued with the CV portion of our project and had success in detecting position in most cases, but we ran into questions about how the cameras would recognize the paddle in different orientations. We decided to change our strategy from contours to color recognition, which allows more flexibility in recognition when the paddle is close to parallel with the floor. This strategy also opens up the option of coloring the edge of the paddle so that it can be recognized as well.

This week I will go forward with this path and hopefully finish it up so I can turn my attention to more nuanced issues with the CV or assist on other subsystems.


Team Status Report for 03/19

By far the biggest risk that came up this past week was the accuracy of the position and orientation of the paddle.  The IMU was very inaccurate at the beginning of the week, as the orientation would be off by a lot, sometimes as much as 45 degrees.  However, William was able to fine-tune the device so that the orientation data was realistic when eyeballing the orientation of the IMU.  We are still going to work through some smaller problems with the IMU that William mentioned in his report, as well as do more extensive testing to get quantitative data on how accurate the IMU is.  Right now, if any issues with the IMU come up, our contingency plan is to either supplement the open-source code we downloaded with our own code or to find a new source that has the features we need.

There are accuracy issues with the camera as well.  Right now, Logan is working on an OpenCV algorithm that creates a bounding box around the paddle, and we take the center of that bounding box as the paddle’s position.  There is definitely concern about how accurate this model is, but we believe the algorithm is accurate enough that any error is not noticeable to the player.  As with the IMU, once this algorithm is completed, we need to get quantitative data on how well the camera can track static and moving objects.

One change in our final design is that the game will be run on the server.  Something that came up during our design of the multiplayer capabilities of the game was that our old implementation had two players running the same game on two different devices.  We realized that synchronizing two different devices to show the same game would be too complicated and too risky, so we decided to centralize the game on a server to take away the synchronization headache.  We don’t see any cost in this new solution as the servers we were planning on using for our MVP and final implementation should be able to handle this extra workload.

As of right now, everything is on schedule, and we may be looking at completing our MVP as early as the end of next week, assuming no unforeseen problems arise.  The shift to having our game run on a centralized server is a minimal-effort change that should not take more than an hour to implement.

William’s Status Report for 3/19

This week, I got the IMU up and running with the Raspberry Pi. This included physically soldering the connections between the IMU and the Raspberry Pi, as well as writing a device driver for the IMU on the Raspberry Pi. The device driver allows me to get readings from the sensor. I also tested the device driver by taking accelerometer and gyroscope readings while moving the IMU around and displaying/plotting the output to see the accuracy of the IMU. Furthermore, I tried out various open-source implementations of the Madgwick algorithm, which converts accelerometer and gyroscope data into roll, pitch, and yaw, and tested them using plotting as well. There are still some weird behaviors with the algorithm that I have to figure out how to fix, such as the roll/pitch/yaw numbers changing even when the IMU is not moving.
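
Madgwick itself is a quaternion-gradient-descent filter and too involved to reproduce here, but the fusion idea it shares with simpler filters can be sketched with a complementary filter: trust the integrated gyro short-term (smooth, but drifts) and nudge toward the accelerometer's gravity-derived tilt long-term (noisy, but drift-free).  This is a simplified stand-in, not any of the tested implementations:

```python
import math

def complementary_update(roll, pitch, gyro, accel, dt, alpha=0.98):
    """
    One step of a complementary filter (a simplified stand-in for Madgwick).
    Angles in radians; gyro = (gx, gy) in rad/s; accel = (ax, ay, az) in
    any consistent unit, since only the gravity direction matters.
    """
    gx, gy = gyro
    ax, ay, az = accel
    # Tilt angles implied by the gravity direction in the accelerometer.
    roll_acc = math.atan2(ay, az)
    pitch_acc = math.atan2(-ax, math.hypot(ay, az))
    # Blend: mostly trust the integrated gyro, correct slowly toward accel.
    roll = alpha * (roll + gx * dt) + (1 - alpha) * roll_acc
    pitch = alpha * (pitch + gy * dt) + (1 - alpha) * pitch_acc
    return roll, pitch
```

The drift-while-stationary symptom above is exactly what the accelerometer correction term suppresses for roll and pitch; yaw has no gravity reference, which is why it tends to drift worst without a magnetometer.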

In addition, I finalized my strategy for powering the Raspberry Pi wirelessly and settled on using a power bank. The reason is that the other products I looked at would be assembled on top of the Pi, which would make the construction a little too tall and thus unfeasible for a ping pong paddle. A power bank can be laid out flat alongside the Pi, keeping the paddle at a reasonable thickness.

For the next week, I can work on making the IMU data more accurate, as well as beginning to integrate the IMU data with the other components to get some idea of how our system will look as a whole. This will involve using Bluetooth to send data to the VR headset, as well as seeing how the network transmissions may look.

Henry’s Status Report for 03/19

This past week I worked on creating the graphics and game design for our MVP, as well as beginning my deliverables for next week.  What I need for the MVP is to have the graphics, game interactions, rules, and multiplayer/Bluetooth capabilities implemented.  This past week, I was able to get the graphics, game interactions, and rules implemented in Unity, so right now the game looks like a 3D version of Pong where two players can control the paddles using the left/right and a/d keys.  The game can also be built for Android and iOS devices to be played on a Google Cardboard.  I also began work on the multiplayer aspect of the game, so that two different devices can play a game against each other rather than having two players play on one device.  I found the necessary libraries and tutorials to integrate that functionality, so all I have to do now is implement and test it.

For next week, the goal is to finish implementing the multiplayer functionality as well as adding Bluetooth capabilities to get data from our cameras and paddles.  By the end of next week, the game should be completely implemented, and all that would remain to finish the MVP would be plugging in the data from Will and Logan, which should be really easy.  Everything so far seems to be going according to schedule.  I envision some debugging and testing time for the Bluetooth functionality, but my goals are definitely realistic for next week.