Ishaan’s Status Report 05/08/2021

This week I worked on the final presentation, our poster, and the demo video. I also made some final touches to our Unity game system.

  • Added a 3D glove to the game; it will highlight which finger is receiving haptic feedback for the purpose of the demo
  • Added new fruit variants and upgraded 3D assets
  • Updated our bomb color scheme

We are on schedule

Next week we will be working on tasks to present our final project.

Ishaan’s Status Report 05/01/2021

This week I added some of the final touches to our game.

  • Refined the angles and 3D positions of fruits; this helps them appear equally spaced out.
  • Improved the splatter/slicing effect. Now you can clearly see which fruit was cut.
  • Added new effects when different items are cut. The camera shakes when a bomb is cut, making users feel that damage occurred.
  • We prototyped different ideas to demo the haptic feedback and are working on displaying that feedback on the game UI.

On schedule 

 

Next week tasks:

  • Finish implementing the prototypes for displaying the glove haptic feedback
  • Add any final touches to the game and help Arthur and Logan wherever needed

 

Ishaan’s Status Report 04/24/2021

This week I worked on the following: 

  • Added new fruits – pineapples and apples. We have their 3D assets, and I am working on displaying graphics of them being cut.
  • Added a lives counter that decreases when bombs are cut
  • Improved the mouse 3D object; the mouse now has a trail following it, like the actual Fruit Ninja game
  • Added a main menu screen which directs users to the game
  • Added a pause/play functionality for the game

On schedule 

 

Next week

We need to work on fine-tuning the game and add some finishing touches to improve the UI/UX. This can be done by improving the 3D assets, adding game effects when a fruit is cut (like camera shake), and adding sound effects.

Team Status Report 04/24/2021

Significant risks:

  • After the ethics discussion we realized we should make our game compatible for colorblind players. It is possible that they will not be able to differentiate between some fruit.
  • Users might want to pause/play/restart the game
  • Occasionally the tracking incorrectly identifies the glove; in these cases the user should have the ability to recalibrate the glove
  • When a fruit is cut, the sliced pieces move too far apart; users might not realize they have cut a fruit due to the existing game physics
  • There is a risk of some latency between when a fruit is cut and when the haptic motors vibrate

Changes Made: 

  • Added a menu screen to the game. Users can also see their score and tap the pause/play button while playing.
  • We are trying out a calibration step before the game starts; if we notice a significant improvement with this step, we will add it to our final project.
  • To minimize latency, we are writing a script in Unity that sends the “trigger feedback” signal to the FLORA.
  • To improve the fruit-slicing experience and physics, we are now looking into building our own 3D assets in Blender and/or finding existing ones. The challenge is that we need both cut and whole versions of each fruit.
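The trigger script itself is a Unity component, but the shape of the message it would send can be sketched in Python. Everything below is illustrative only — the 0xAA header byte, the field layout, and the function names are assumptions for the sketch, not our actual protocol:

```python
def encode_trigger(finger, intensity):
    """Pack a hypothetical 'trigger feedback' message for the FLORA:
    a start byte (0xAA), a finger index (0-4), and a motor intensity
    (0-255). In the real system this would be written to the FLORA's
    serial port."""
    if not 0 <= finger <= 4:
        raise ValueError("finger index must be 0-4")
    if not 0 <= intensity <= 255:
        raise ValueError("intensity must be 0-255")
    return bytes([0xAA, finger, intensity])


def decode_trigger(msg):
    """Inverse of encode_trigger, as the Arduino side would parse it."""
    if len(msg) != 3 or msg[0] != 0xAA:
        raise ValueError("malformed trigger message")
    return msg[1], msg[2]
```

Keeping the message this small is part of the latency goal: a fixed three-byte frame can be parsed on the FLORA without buffering.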

Schedule

We are back on schedule. Our MVP is complete, and we plan to use our remaining time to improve the UI/UX of the system, specifically the game interface and graphics. If time permits, we will then work on further improving the glove tracking.

Ishaan’s Status Report 4/10/2021

This week I completed the following:

  • Fruit Ninja Unity Tasks:
    • Added a blade that creates collisions and follows the mouse
    • Added fruits – watermelon and mango
    • Added collision detection to check when fruits are cut
    • Added Physics motion for the fruits – they follow projectile motion
    • Wrote code to fetch input from webcam and display webcam footage on the game. This is the game background and creates an AR effect
    • Created a fruitspawner – randomly spawns a fruit every 0.1–1 s
    • Created bombs and a bombspawner. A bomb is spawned every 2–5 s
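Unity's physics engine handles the projectile motion for us; purely to illustrate the math the fruits follow, here is a Python sketch of the kinematics (variable names are mine):

```python
def fruit_position(p0, v0, t, g=9.81):
    """Position of a launched fruit t seconds after spawning: x advances
    linearly while y follows y0 + vy*t - 0.5*g*t^2 (standard projectile
    motion, which is what Unity's rigidbody gravity produces)."""
    x0, y0 = p0
    vx, vy = v0
    return (x0 + vx * t, y0 + vy * t - 0.5 * g * t * t)
```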

Progress Status – on schedule

Tasks for next week:

  • Add new fruits – pineapple, apples
  • Add a lives counter that is affected by the bombs
  • Improve the mouse 3D object – make it look like a 3D sword in the game

 

Ishaan’s Status Report 04/03/2021

This week I worked on the following:

  • Used OpenCV to track colored LEDs, using color thresholding and background subtraction algorithms
  • Used OpenCV to improve our previous implementation of IR LED tracking; we tried background subtraction and tracking LED blinks, but ultimately found that latency was poor with these added computation steps
  • Tried using a Wii Remote and existing Wii remote libraries like cwiid to track an IR LED. We found that this method had poor accuracy and could only find the IR LED in certain orientations
  • Worked on assembling the glove. I researched the schematics we would need and helped with connecting the FLORA -> mux -> haptic motor controllers -> motor discs
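Our tracker operates on OpenCV/NumPy frames; as a dependency-free illustration of the two steps named above, here is a pure-Python sketch in which a frame is a list of rows of (B, G, R) tuples. The threshold values are placeholders, not our tuned numbers:

```python
def led_centroid(frame, background, color_lo, color_hi, diff_thresh=30):
    """Locate a colored LED in a frame: keep pixels that (a) differ from a
    static background frame by more than diff_thresh in some channel
    (background subtraction) and (b) fall inside [color_lo, color_hi] per
    channel (color thresholding). Returns their centroid, or None."""
    xs, ys = [], []
    for y, row in enumerate(frame):
        for x, px in enumerate(row):
            moving = max(abs(a - b)
                         for a, b in zip(px, background[y][x])) > diff_thresh
            in_color = all(lo <= c <= hi
                           for c, lo, hi in zip(px, color_lo, color_hi))
            if moving and in_color:
                xs.append(x)
                ys.append(y)
    if not xs:
        return None
    return sum(xs) / len(xs), sum(ys) / len(ys)
```

In the real pipeline the same mask is computed vectorized with `cv2.absdiff` and `cv2.inRange`, which is what keeps latency low.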

Progress Status – on schedule

Tasks for next week:

  • complete building a functioning Unity Fruit Ninja game
  • Use 2D fruits and 2D orientation
  • Add Home screen, gameplay screen and end screen
  • Use mouse for user input (if possible use our glove tracking system)

 

Ishaan’s Status Report 03/27/2021

Our team met three times this week. I worked on the following:

  • Brainstorming how to track the glove. We narrowed it down to the following options:
    • Using IR Leds + IR cameras
    • Using LED + OpenCV color threshold tracking
    • Using an IMU to detect acceleration and predict where the user will reach
  • We developed OpenCV code to track blue LEDs. This had high accuracy and low latency; however, when background blue colors were introduced, the tracker performed poorly.
  • To tackle the above problem, I worked with my team to write OpenCV code to track the IR LED. This converted each frame to grayscale and then applied a mask to get pixels in the range 240–255. After applying an IR film to the camera, we achieved very good performance with minimal interference.
  • To completely eliminate any background noise we decided to make the LEDs blink at different frequencies, and are currently trying to identify how to track blinks.
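The grayscale-mask step above can be sketched in a few lines of Python (the real version uses OpenCV arrays; here a frame is just a list of rows of 8-bit brightness values):

```python
def ir_led_position(gray_frame, lo=240, hi=255):
    """Centroid of pixels whose brightness falls in [lo, hi] -- the mask
    described above. Returns (x, y), or None if nothing passes the mask."""
    pts = [(x, y) for y, row in enumerate(gray_frame)
                  for x, v in enumerate(row) if lo <= v <= hi]
    if not pts:
        return None
    return (sum(x for x, _ in pts) / len(pts),
            sum(y for _, y in pts) / len(pts))
```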

Progress Status – on schedule

Tasks for next week:

  • Explore the option to track blinking IR LEDs. Attempt to develop a method using openCV contours and blinking frequency to identify blinking IR LEDs
  • Explore any other options to eliminate background noise. This might involve adding a new IR LED or introducing a calibration step.
  • Experiment with the IMU; attempt to use it to track acceleration and predict future points

Team Status Report 3/27/2021

Our team met 3 times this week and we focused on building and tracking the glove controller

The image below demonstrates our code capable of tracking an LED.

 

Significant risks 

  • The IR LED needs to be directly pointed towards the camera for tracking. Depending on how the users orient the glove the tracking accuracy can be impacted.
  • The accuracy of the tracking using computer vision is affected by any background light or colors.

 

Changes made:

  • We have decided to use multiple IR LEDs on the glove. This will eliminate the problem of LED orientation.
  • Since we have decided to use multiple IR LEDs, we intend to make each IR LED blink at a particular frequency. This will help uniquely identify the LEDs.
  • We intend on using an IR filter on the cameras to eliminate any background colors or noise.
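We have not settled on the blink-identification method yet; one simple approach, sketched below under the assumption that we already have a per-frame on/off detection for each candidate LED, is to count rising edges over a short clip and match the rate to each LED's assigned frequency:

```python
def blink_frequency(detections, fps):
    """Estimate an LED's blink frequency (Hz) from a per-frame on/off
    sequence by counting off->on transitions over the clip duration.
    A real implementation would also smooth over missed detections."""
    rising = sum(1 for prev, cur in zip(detections, detections[1:])
                 if not prev and cur)
    return rising / (len(detections) / fps)
```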

Schedule – No Changes

Ishaan’s Status Report 03/13/2021

This week I worked on building the software block diagram below.

Additionally, I worked on designing our data structures, classes, functions, and the interactions in the game. This will enable us to easily develop the software once we have completed glove development. As a group we decided to put most of our effort into developing the glove with high accuracy and low latency, and later focus on Unity development.

I also researched how to use the IMU and found that we might need to do some type of preprocessing before using IMU data. It seems like we might need to use either a Kalman filter or Complementary filter in our preprocessing step, so I worked on developing these filters in Arduino.
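The filters themselves run on the FLORA in Arduino C; to illustrate the idea, here is the update step of the complementary filter in Python (alpha = 0.98 is a typical but assumed value). The intuition: trust the integrated gyro rate short-term, because it is smooth but drifts, and the accelerometer-derived angle long-term, because it is noisy but drift-free.

```python
def complementary_step(angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """One complementary-filter update: blend the gyro-integrated angle
    (angle + gyro_rate * dt) with the accelerometer-derived angle."""
    return alpha * (angle + gyro_rate * dt) + (1 - alpha) * accel_angle
```

A Kalman filter would additionally track an estimate of its own uncertainty, at the cost of more computation per sample on the microcontroller.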

 

We are on schedule

 

For next week I will be working on the Design report. I will specifically be working on writing about the software aspects of our project. I will also work on further implementing the preprocessing step for the IMU data and design the calibration step of the IMU-glove.

Ishaan’s Status Report 03/06/2021

I worked on the following:

  • Arthur, Logan and I created the block diagram for the overall project and also the hardware schematics.

  • Researched using IMUs instead of the OpenCV + LED combination from our last post; we ultimately decided to focus on glove development and use the IMUs
  • Added the assets into unity and started spawning fruits in the environment
  • Worked on code for user fruit interaction and collision detection. Can do basic collision detection and spawn fruit in Unity.

Progress Status – on schedule 

Deliverables for next week: 

  • Add special combos, fruits and bombs
  • Help Arthur with developing the glove and any Arduino scripts that we might need to add
  • Work on using the IMU and interpreting the x,y,z coordinates
  • Start researching how to add physics and gravity in the game