Ishaan’s Status Report 05/08/2021

This week I worked on the final presentation, our poster, and our demo video. I also made some final touches to our Unity game system:

  • Added a 3D glove to the game; it highlights which finger is being provided with haptic feedback for the purpose of the demo
  • Added new fruit variants and upgraded 3D assets
  • Updated our bomb color scheme

We are on schedule

For next week we will be working on tasks to present our final project

Ishaan’s Status Report 05/01/2021

This week I added some of the final touches to our game.

  • Refined the angles and 3D positions of fruits; this helps them appear evenly spaced out.
  • Improved the splatter/slicing effect. Now you can clearly see which fruit was cut.
  • Added new effects when different items are cut. The camera shakes when a bomb is cut, making users feel like damage occurred.
  • We prototyped different ideas to demo the haptic feedback and are working on displaying that feedback on the game UI.
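The bomb-cut camera shake can be sketched as decaying random offsets applied to the camera each frame. This is a hypothetical Python sketch of the idea, not our actual Unity script; the magnitude and duration values are assumptions:

```python
import random

def shake_offsets(duration, magnitude, fps=60, seed=None):
    """Generate per-frame (x, y) camera offsets that decay to zero,
    so the shake starts strong and settles back smoothly."""
    rng = random.Random(seed)
    frames = int(duration * fps)
    offsets = []
    for i in range(frames):
        damper = 1.0 - i / frames  # linear decay toward zero
        offsets.append((rng.uniform(-1, 1) * magnitude * damper,
                        rng.uniform(-1, 1) * magnitude * damper))
    return offsets
```

In the game loop, each offset would be added to the camera position for one frame after a bomb is cut.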

On schedule 

 

Next week tasks:

  • Finish implementing the prototypes for displaying the glove haptic feedback
  • Add any final touches to the game and help Arthur and Logan wherever needed

 

Ishaan’s Status Report 04/24/2021

This week I worked on the following: 

  • Added new fruits – pineapples and apples. As of now we have their 3D assets, and I am working on displaying graphics of them being cut.
  • Added a lives counter that is affected by cutting bombs
  • Worked on improving the mouse 3D object; the mouse now has a trail following it, like in the actual Fruit Ninja game
  • Added a main menu screen that directs users to the game
  • Added pause/play functionality for the game

On schedule 

 

Next week

We need to work on fine-tuning the game and add some finishing touches to improve the UI/UX. This can be done by improving the 3D assets, adding game effects when a fruit is cut (like camera shake), and adding sound effects.

Ishaan’s Status Report 04/10/2021

This week I completed the following:

  • Fruit Ninja Unity Tasks:
    • Added a blade that creates collisions and follows the mouse
    • Added fruits – watermelon and mango
    • Added collision detection to check when fruits are cut
    • Added physics motion for the fruits – they follow projectile motion
    • Wrote code to fetch input from the webcam and display the webcam footage in the game. This is the game background and creates an AR effect
    • Created a fruit spawner – randomly spawns a fruit every 0.1 s – 1 s
    • Created bombs and a bomb spawner. A bomb is spawned every 2–5 s
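The projectile motion and the random spawn window can be illustrated roughly as follows. This is a Python sketch with assumed units; the actual game relies on Unity's physics engine:

```python
import random

GRAVITY = 9.8  # m/s^2, assumed constant

def step(pos, vel, dt):
    """Advance a fruit one physics step: constant horizontal velocity,
    gravity pulling the vertical velocity down (projectile motion)."""
    x, y = pos
    vx, vy = vel
    return (x + vx * dt, y + vy * dt), (vx, vy - GRAVITY * dt)

def next_spawn_delay(rng, lo=0.1, hi=1.0):
    """Random delay between fruit spawns, matching the 0.1 s - 1 s window."""
    return rng.uniform(lo, hi)
```

A fruit launched upward decelerates, peaks, and falls back down as `step` is called each frame.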

Progress Status – on schedule

Tasks for next week:

  • Add new fruits – pineapples and apples
  • Add a lives counter, affected by the bombs
  • Improve the mouse 3D object. Make it seem like a 3D sword in the game

 

Ishaan’s Status Report 04/03/2021

This week I worked on the following:

  • Used OpenCV to track colored LEDs. This used color-thresholding and background-subtraction algorithms
  • Used OpenCV to improve our previous implementation of IR LED tracking; we tried background subtraction and tracking LED blinks, but ultimately found that latency was poor after adding these computation steps
  • Tried using a Wii Remote and existing Wii Remote libraries like cwiid to track an IR LED. We found that this method had poor accuracy and could only find the IR LED in certain orientations
  • Worked on assembling the glove. I researched the schematics we would need and helped with connecting the FLORA -> mux -> haptic motor controllers -> motor discs
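The color-thresholding and background-subtraction steps can be sketched with NumPy. This is a simplified stand-in for the OpenCV calls we used (`cv2.inRange` and a background subtractor), and the threshold values here are illustrative:

```python
import numpy as np

def color_mask(frame_bgr, lo, hi):
    """Keep pixels whose per-channel BGR values fall within [lo, hi]
    (the same idea as cv2.inRange)."""
    lo = np.asarray(lo, dtype=np.uint8)
    hi = np.asarray(hi, dtype=np.uint8)
    return np.all((frame_bgr >= lo) & (frame_bgr <= hi), axis=-1)

def background_subtract(frame_gray, background_gray, thresh=25):
    """Flag pixels that differ from a static background frame by more
    than `thresh` gray levels."""
    diff = np.abs(frame_gray.astype(np.int16) - background_gray.astype(np.int16))
    return diff > thresh
```

Both masks can then be combined so that only bright LED-colored pixels that also differ from the background survive, which is what made the added latency an issue.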

Progress Status – on schedule

Tasks for next week:

  • complete building a functioning Unity Fruit Ninja game
  • Use 2D fruits and 2D orientation
  • Add Home screen, gameplay screen and end screen
  • Use mouse for user input (if possible use our glove tracking system)

 

Ishaan’s Status Report 03/27/2021

Our team met 3 times this week. I worked on the following:

  • Brainstorming how to track the glove. We narrowed it down to the following options:
    • Using IR LEDs + IR cameras
    • Using LED + OpenCV color threshold tracking
    • Using an IMU to detect acceleration and predict where the user will reach
  • We developed OpenCV code to track blue LEDs. This had high accuracy and low latency; however, when blue background colors were introduced, the tracker performed poorly
  • To tackle the above problem, I worked with my team to write OpenCV code to track the IR LED. This converted each frame to grayscale and then applied a mask to get pixels in the range of 240-255. After applying an IR film on the camera we were able to have very good performance and there was minimal interference.
  • To completely eliminate any background noise we decided to make the LEDs blink at different frequencies, and are currently trying to identify how to track blinks.
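The grayscale-mask step described above can be sketched as follows. This is a NumPy stand-in for the OpenCV calls; the 240-255 range is the one our tracker uses:

```python
import numpy as np

def ir_mask(frame_gray, lo=240, hi=255):
    """Mask pixels in the bright 240-255 range, where the IR LED shows up
    after the IR film filters out the rest of the scene."""
    return (frame_gray >= lo) & (frame_gray <= hi)

def led_centroid(mask):
    """Estimate the LED position as the centroid of the masked pixels."""
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None  # LED not visible this frame
    return float(xs.mean()), float(ys.mean())
```

Running this per frame yields an (x, y) track of the LED, which is what feeds the game input.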

Progress Status – on schedule

Tasks for next week:

  • Explore the option to track blinking IR LEDs. Attempt to develop a method using openCV contours and blinking frequency to identify blinking IR LEDs
  • Explore any other options to eliminate background noise. This might involve adding a new IR LED or introducing a calibration step.
  • Experiment with the IMU; attempt to use it to track acceleration and predict future points

Ishaan’s Status Report 03/13/2021

This week I worked on building the software block diagram below.

Additionally, I worked on designing our data structures, classes, functions, and the interactions in the game. This will enable us to develop the software easily once we have completed glove development. As a group, we decided to put most of our effort into developing the glove with high accuracy and low latency and to focus on Unity development later.

I also researched how to use the IMU and found that we might need to do some preprocessing before using IMU data. It seems like we might need either a Kalman filter or a complementary filter in our preprocessing step, so I worked on developing these filters in Arduino.
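The complementary filter is the simpler of the two: it fuses the fast-but-drifting integrated gyro rate with the noisy-but-stable accelerometer angle. A Python sketch of the math follows (our prototypes are Arduino scripts; the 0.98 weight is an assumed value):

```python
import math

def accel_roll(ay, az):
    """Roll angle (radians) estimated from accelerometer gravity components."""
    return math.atan2(ay, az)

def complementary_filter(angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """Blend integrated gyro rate with the accelerometer angle.
    alpha close to 1 trusts the gyro short-term; the (1 - alpha) term
    slowly corrects drift toward the accelerometer estimate."""
    return alpha * (angle + gyro_rate * dt) + (1 - alpha) * accel_angle
```

Calling this once per IMU sample keeps the angle estimate responsive without accumulating gyro drift.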

 

We are on schedule

 

For next week I will be working on the design report, specifically writing about the software aspects of our project. I will also work on further implementing the preprocessing step for the IMU data and design the calibration step for the IMU glove.

Ishaan’s Status Report 03/06/21

I worked on the following:

  • Arthur, Logan and I created the block diagram for the overall project and also the hardware schematics.

  • Researched using IMUs instead of the OpenCV + LED combination from our last post; we ultimately decided to focus on glove development and use the IMUs
  • Added the assets into Unity and started spawning fruits in the environment
  • Worked on code for user-fruit interaction and collision detection. We can now do basic collision detection and spawn fruit in Unity.
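The basic collision check can be sketched as a sphere-overlap test. This is a hypothetical Python sketch of the idea; in practice Unity's colliders handle this for us:

```python
import math

def spheres_collide(center_a, radius_a, center_b, radius_b):
    """Two spheres overlap when the distance between their centers is
    at most the sum of their radii (e.g. blade hitbox vs. fruit)."""
    return math.dist(center_a, center_b) <= radius_a + radius_b
```

Each frame, the blade's position is tested against every live fruit, and any overlap triggers the cut.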

Progress Status – on schedule 

Deliverables for next week: 

  • Add special combos, fruits and bombs
  • Help Arthur with developing the glove and any Arduino scripts that we might need to add
  • Work on using the IMU and interpreting the x, y, z coordinates
  • Start researching how to add physics and gravity in the game

Ishaan’s Status Report 02/27/21

I worked on the following:

  • Finding 3D assets for the orange, apple, and watermelon. The image below is a sample of one of the 3D assets

  • Setting up the Unity development environment on my local machine
  • Installing and getting started with the following Unity libraries – UnityPhysics, UNet, UnityUI, Unity 2D, Unity Scripting
  • Explored different methods of tracking the user’s LED. The most promising method was using Unity + OpenCV. Based on my research, it might be challenging to write code to make Unity + OpenCV work together; however, an easier alternative might be to pay $95 for an existing Unity plugin.

 

Progress status – on schedule 

 

Deliverable for next week: 

  • Make the existing 3D fruit objects appear in the Unity game
  • Implement basic Fruit Ninja logic functions, like spawning fruit
  • Research methods to track the user’s LED through Unity

Ishaan’s Status Report 02/21/21

 

I worked on designing our solution and how each component in our project would interact. The figure below demonstrates these interactions between the game components.

For the glove component, I researched which microcontrollers would best suit our use case and what sensors and motors we need for haptic feedback [1][2].

 

Understanding Technical risks: 

I researched different attempts at building Fruit Ninja in VR and found that translating the 2D visuals to 3D while maintaining a responsive user interface was the most demanding aspect of the project [3].

Progress Status – On Schedule

Deliverables for next week: 

  1. Setting up the Unity Game environment
  2. Understanding the requirements for 3D modeling and building our fruits
  3. Designing the object/input tracking framework

References: 

  1. https://www.instructables.com/Haptic-Glove-for-the-Blind/
  2. https://hackaday.io/project/160405-diy-haptic-glove-for-vr
  3. https://www.jamesquickportfolio.com/fruit-ninja-vr