Team Status Report 05/08/21

Significant risks:

  • The 3D-printed enclosure for the Raspberry Pi system will not be complete before tomorrow, when we take pictures of the finished system.
  • We currently have 2 functioning haptic feedback disks, down from the 3 in the design.

Changes Made: 

  • We now have a visual indicator of the glove inside the Unity game that shows where haptic feedback is happening, which will make it easier to demo the feature virtually.
  • We are building a 3D-printed enclosure for the Raspberry Pi to improve the appearance of our final system.

Schedule:

We are on schedule for the upcoming deadlines this week.

Team Status Report for 05/01/21

Significant risks:

  • The player might not want to play the game against their own background; we need to add a default background the game can be played on.
  • There might be some latency when the game sends feedback to the FLORA via Bluetooth for the haptic feedback motors.

Changes Made: 

  • Built our own 3D assets (fruits & bombs)
  • Added a pause menu & menu screen
  • Added a calibration step before playing the game (so that Unity can find the glove)
  • Implemented haptic feedback: sequences of vibrations based on different game states (fruit cut, bomb exploded, fruit missed…)
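The per-state vibration sequences could be driven from the game host over the FLORA's link; below is a minimal Python sketch of the host side, assuming a hypothetical one-byte-per-event protocol (the event codes and `FakePort` stand-in are ours, not the project's actual firmware interface — a real run would use a pySerial `Serial` object in place of `FakePort`):

```python
# Hypothetical one-byte event codes the FLORA firmware might listen for.
EVENT_CODES = {
    "fruit_cut": b"C",
    "bomb_exploded": b"B",
    "fruit_missed": b"M",
}

def send_haptic_event(port, event):
    """Write the event's code to the FLORA's serial port."""
    port.write(EVENT_CODES[event])

class FakePort:
    """Stand-in for a pySerial Serial object, for testing without hardware."""
    def __init__(self):
        self.sent = b""
    def write(self, data):
        self.sent += data
        return len(data)

port = FakePort()
send_haptic_event(port, "fruit_cut")
send_haptic_event(port, "bomb_exploded")
print(port.sent)  # b'CB'
```

Keeping the on-wire message to a single byte keeps serialization overhead out of the latency budget; the firmware can expand each code into its full vibration pattern locally.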

Schedule:

We are on schedule. While we will need to do some fine-tuning over the course of next week, we will be ready for both the upcoming presentation and the final demo.

Team Status Report 04/24/21

Significant risks:

  • After the ethics discussion, we realized we should make our game accessible to colorblind players; they may not be able to differentiate between some fruits.
  • Users might want to pause, resume, or restart the game.
  • Occasionally the tracking incorrectly identifies the glove; in these cases the user should have the ability to recalibrate it.
  • When a fruit is cut, the sliced halves move too far apart; users might not realize that they have cut a fruit given the existing game physics.
  • There is a risk of some latency between when a fruit is cut and when the haptic motors vibrate.

Changes Made: 

  • Added a menu screen to the game. Users can also see their score and tap the pause/play button while playing.
  • We are trying out a calibration step before the game starts; if we notice a significant improvement, we will include it in the final project.
  • To minimize latency, we are writing a script in Unity to send the “trigger feedback” signal to the FLORA.
  • To improve the fruit-slicing experience and physics, we are looking into building our own 3D assets in Blender and/or finding existing ones. The challenge is that we need both cut and whole versions of each fruit.
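The calibration step could be as simple as mapping two camera-space reference points onto game-space coordinates, e.g. by holding the glove at two marked corners of the play area. A hedged Python sketch of that idea (the function and point values are illustrative, not taken from the project):

```python
def fit_calibration(cam_pts, game_pts):
    """Fit a per-axis scale and offset mapping camera coords to game coords.

    cam_pts / game_pts: two (x, y) reference points each — e.g. the glove
    held at two marked corners of the play area during calibration.
    """
    (cx0, cy0), (cx1, cy1) = cam_pts
    (gx0, gy0), (gx1, gy1) = game_pts
    sx = (gx1 - gx0) / (cx1 - cx0)
    sy = (gy1 - gy0) / (cy1 - cy0)
    ox = gx0 - sx * cx0
    oy = gy0 - sy * cy0

    def to_game(pt):
        x, y = pt
        return (sx * x + ox, sy * y + oy)

    return to_game

# Glove seen at camera pixels (0, 0) and (512, 512) while held at the
# game-space corners (0, 0) and (16, 16).
to_game = fit_calibration([(0, 0), (512, 512)], [(0, 0), (16, 16)])
print(to_game((256, 128)))  # (8.0, 4.0)
```

A two-point fit only handles scale and offset; correcting for camera tilt or rotation would need a full homography, which is likely overkill for a demo.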

Schedule

We are back on schedule. Our MVP is complete, and we plan to use our remaining time to improve the UI/UX of the system, specifically the game interface and graphics. If time permits, we will then work on further improving the glove tracking.

Team Status Report 04/10/21

Significant Risks

  • Motion tracking is functional but needs smoothing. Sometimes there is a jittering effect when holding the glove still.
  • We solved the previous risk of the IR LEDs needing a certain orientation to be detected by reducing the resistance of the current-limiting resistor; they are now bright from all angles.
  • From the very beginning of the project we decided to focus mostly on the glove, and the AR aspect was intentionally left as a ‘nice to have’. We can plug a webcam into the laptop and show the game over a background of the player’s living room, for example, but it is not very intuitive. Displaying the game on true AR goggles would be better, but that was not in our original scope, so it is unlikely that we will attempt to add it.

Changes Made

  • We changed back to infrared LED tracking, using a Gaussian blur to distinguish the LED from other bright objects in the room. Motion tracking currently works very well in low-light rooms.
  • We are back to our original plan of running the game on the laptop. The player simply plugs the Raspberry Pi system into the laptop over USB, and the data is transmitted via a serial connection, as outlined in our original design.

Schedule

We are back on schedule. Our MVP is nearing completion for the demo this week. For the remainder of the semester we will use the pre-planned slack time to further improve our system. This includes improving the tracking algorithms to handle rooms with other light sources, polishing the game interface itself, and integrating IMU data into our motion tracking to develop a predictive algorithm.

Team Status Report for 04/03/21

Significant Risks

  • Background substitution in OpenCV adds significant latency, delaying the movement seen on screen.
  • The orientation of the IR LEDs seems to be a problem. The LEDs apparently need to point directly at the camera, which is unrealistic given that the game consists of swiping across the camera’s field of view (we can’t expect the player to keep the LED pointed at the camera perfectly).

Changes Made

  • Use regular LEDs, since the latency is much better (no need to gray-scale and treat the IR LEDs as white LEDs if we can simply use colored LEDs). Colored LEDs are also brighter, so the camera can pick them up at larger distances.
  • We are considering running the whole Fruit Ninja game on the Raspberry Pi, but we have not yet finalized this.

Schedule

We are a little behind schedule, but we will be caught up by the time we arrive at the demo, since we plan to have a basic game and the glove fully built by then.

Team Status Report 03/27/21

Our team met three times this week and focused on building and tracking the glove controller.

The image below demonstrates our code capable of tracking an LED.


Significant risks 

  • The IR LED needs to point directly at the camera to be tracked. Depending on how the user orients the glove, tracking accuracy can suffer.
  • The accuracy of the computer vision tracking is affected by background light and colors.


Changes made:

  • We have decided to use multiple IR LEDs on the glove. This will eliminate the problem of LED orientation.
  • Since we are using multiple IR LEDs, we intend to make each one blink at a particular frequency. This will help uniquely identify the LEDs.
  • We intend to use an IR filter on the cameras to eliminate background colors and noise.
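Identifying an LED by its blink rate could be done by counting on/off transitions in its recent per-frame detection history; each full blink cycle contributes two transitions. A simplified Python sketch (the boolean-history framing and the 30 fps figure are our assumptions, not the project's design):

```python
def blink_frequency(history, fps):
    """Estimate blink frequency (Hz) from a list of per-frame on/off booleans.

    Each full blink cycle produces two transitions (on->off and off->on),
    so frequency = transitions / 2 / duration.
    """
    transitions = sum(1 for a, b in zip(history, history[1:]) if a != b)
    duration = len(history) / fps
    return transitions / 2 / duration

# An LED toggling every 3 frames at 30 fps blinks at 5 Hz
# (6 frames per full cycle -> 30 / 6 = 5); the estimate over a finite
# window comes in slightly low because the last transition is unseen.
history = ([True] * 3 + [False] * 3) * 10
print(blink_frequency(history, fps=30))  # 4.75
```

One practical constraint: the camera frame rate caps detectable blink rates at fps / 2 (Nyquist), so with a 30 fps webcam the per-LED frequencies would all need to sit well below 15 Hz.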

Schedule – No Changes

Team Status Report 03/13/21

Significant Risks:

The largest risk we currently face is an underdeveloped plan for position tracking. We are considering using IMUs, but IMU position estimates drift at a quadratic rate (position comes from double-integrating acceleration, so any bias error grows with the square of time). We may need sensor fusion with a different device (perhaps optical) that provides a point of reference to correct for the drift.
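A common form of that sensor fusion is a complementary filter: trust the IMU for fast, smooth updates, and blend in a small share of the absolute (e.g. optical) fix each step so the accumulated drift stays bounded. A hedged 1-D sketch (the `fuse` helper, `alpha` value, and drift figure are illustrative assumptions):

```python
def fuse(imu_position, optical_position, alpha=0.98):
    """Blend a drifting dead-reckoned position with an absolute fix.

    alpha close to 1 favors the smooth, fast IMU estimate; the small
    (1 - alpha) share of the optical fix keeps the drift bounded.
    """
    return alpha * imu_position + (1 - alpha) * optical_position

# Simulate a constant IMU drift of +0.05 per step against a true position
# of 0. Uncorrected, the error after 500 steps would be 0.05 * 500 = 25;
# with the filter it settles at the fixed point 0.05 * 0.98 / 0.02 = 2.45.
estimate = 0.0
for _ in range(500):
    imu_estimate = estimate + 0.05      # dead reckoning accumulates bias
    estimate = fuse(imu_estimate, 0.0)  # optical fix pulls it back
print(round(estimate, 3))  # ≈ 2.45: bounded, instead of growing without limit
```

The residual error scales with `alpha / (1 - alpha)` times the per-step drift, so tuning `alpha` trades smoothness against how tightly the optical reference is tracked.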

Changes made:

The main change is to the position tracking system, which will no longer be solely an LED + camera. This decision will be finalized this week as part of the design report.

Schedule – On Track. We switched to a new project management system (ClickUp) and modified our schedule to accommodate IMU testing.

Team Status Report for 03/06/21


Significant Risks:

  • As soon as the last parts for the glove arrive, we can begin assembling it. This is the first stage of our project, but a crucial one, as it determines whether we can continue with the project at all (without a working controller, the game cannot be played).
    • Will all the parts fit on the glove?
    • Will it be too heavy?
    • Will all of the interconnections between the parts work? (we will need to test this)

Changes made:

  • Some design details are highlighted in the new block diagram that we worked on this week.

Schedule – No Changes (entering the glove assembling stage)

Team Status Report 02/27/21

Significant Risks:

  • Creating an AR environment for our game. We must decide whether to use a standard background or the user’s living room as the background.
  • Tracking the LED glove using OpenCV + Unity. As of now, our best option is to spend $95 on a Unity module to track the input; we need to decide whether to go forward with that or to write our own library to enable Unity–OpenCV communication.

Changes made:

  • Based on feedback from our project proposal presentation, we are quantifying our requirements and testing strategy.
  • Using Blender and sketchup.com for our 3D assets; ready-made assets will reduce the time spent building our own.

Schedule – No Changes

Team Status Report 02/21/21

Significant risks: 

Calibrating the glove to work with the Fruit Ninja environment and using Unity to communicate wirelessly with the glove are the current technical challenges of our project. Additionally, ensuring that the glove input is accurate and that delay is minimal is crucial to a smooth user experience.

If we are unable to set up the glove input, our project will be a 3D Augmented Reality Fruit Ninja that users play with a computer mouse.

Changes made to the design: 

We pivoted from our original idea of building a Virtual Reality application on an FPGA. Since the original project did not have a specific use case, we changed to recreating a popular game that has not yet been brought to Augmented Reality. This change significantly reduces our budget and increases our interest in the project.

Updated Schedule: