Ishaan’s Status Report 04/24/2021

This week I worked on the following: 

  • Added new fruits – pineapple and apples. As of now we have their 3D assets, and I am working on displaying graphics of them being cut
  • Added a lives counter that decreases when the player cuts a bomb
  • Improved the mouse 3D object; the mouse now has a trail following it, like the actual Fruit Ninja game
  • Added a main menu screen which directs users to the game
  • Added a pause/play functionality for the game
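The lives counter and pause/play behavior above can be sketched as simple game-state bookkeeping. This is an illustrative Python sketch only; the real implementation lives in Unity, and all names here are hypothetical.

```python
# Hypothetical sketch of the game-state bookkeeping described above
# (the actual game is written in Unity; names are illustrative).

class GameState:
    def __init__(self, lives=3):
        self.lives = lives      # remaining lives, shown by the lives counter
        self.paused = False     # toggled by the pause/play button
        self.game_over = False

    def toggle_pause(self):
        """Pause/play button handler."""
        self.paused = not self.paused

    def on_bomb_cut(self):
        """Cutting a bomb costs one life; at zero lives the game ends."""
        if self.paused or self.game_over:
            return
        self.lives -= 1
        if self.lives <= 0:
            self.game_over = True
```

Keeping pause checks inside the event handler means spawners and collisions can stay naive while paused input is simply ignored.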

On schedule 

 

Next week

We need to fine-tune the game and add some finishing touches to improve the UI/UX. This can be done by improving the 3D assets, adding game effects when a fruit is cut (such as camera shake), and adding sound effects.

Team Status Report 4/24

Significant risks:

  • After the ethics discussion we realized we should make our game compatible for colorblind players. It is possible that they will not be able to differentiate between some fruit.
  • Users might want to pause/play/restart the game
  • Occasionally the tracking misidentifies the glove; in these cases the user should have the ability to recalibrate it
  • When a fruit is cut, the sliced halves move too far apart under the current game physics, so users might not realize that they have cut a fruit
  • There is a risk of some latency between when a fruit is cut and when the haptic motors vibrate

Changes Made: 

  • Added a menu screen to the game. Users can also see their score and tap the pause/play button while playing
  • We are trying out a calibration step before the game starts; if we notice a significant improvement with this step, we will add it to our final project
  • To minimize latency, we are writing a Unity script to send the “trigger feedback” signal to the FLORA
  • To improve the fruit-slicing experience and physics, we are now looking into building our own 3D assets in Blender and/or finding existing ones. The challenge is that we need both cut and whole versions of each fruit
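The “trigger feedback” signal mentioned above could be as small as a single byte written to the serial/Bluetooth link. This is a hedged sketch, not our finalized protocol: the one-byte event codes and the `send_feedback` name are illustrative assumptions.

```python
# Hedged sketch of the "trigger feedback" message the Unity script would
# send toward the FLORA. The event codes below are illustrative, not a
# finalized protocol.

EVENT_CODES = {
    "fruit_cut": b"F",
    "bomb_cut": b"B",
    "fruit_missed": b"M",
}

def send_feedback(port, event):
    """Write a single-byte event code to an open serial-like stream."""
    port.write(EVENT_CODES[event])
    # flush immediately -- minimizing latency is the whole point
    port.flush()
```

Any object with `write`/`flush` (a pyserial port, or an in-memory buffer for testing) works as the transport.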

Schedule

We are back on schedule. Our MVP is complete, and we plan to use our remaining time to improve the UI/UX of the system, specifically the game interface and graphics. If time permits, we will then work on further improving the glove tracking.

Arthur’s Status Report for 04/24/21

This week I worked on the following:

  • haptic feedback
    • being able to control 3 different motor controllers with the multiplexer
    • receiving input on the FLORA via the Bluetooth module
      • so that the game is able to communicate with the glove
    • triggering different vibrations based on the input above (game state)
      • fruit being cut: buzz 1 (effect 47)
      • bomb being cut: long buzz (effect 118)
      • fruit being missed (lose 1 life): long double sharp click strong (effect 37)
      • more vibrations to be added as needed
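The state-to-vibration mapping above is just a lookup table. The sketch below restates it in Python for clarity; on the glove this logic runs on the FLORA in Arduino code, and the effect numbers refer to the motor driver's built-in waveform library.

```python
# Game-state -> haptic effect IDs, matching the list above.
# (Runs on the FLORA in Arduino C++ in reality; effect numbers are the
# motor driver's built-in waveform library entries.)

EFFECTS = {
    "fruit_cut": 47,     # buzz 1
    "bomb_cut": 118,     # long buzz
    "fruit_missed": 37,  # long double sharp click strong
}

def effect_for(state):
    """Return the waveform effect ID to play for a game state."""
    return EFFECTS[state]
```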

still on schedule

next week

  • instead of just one vibration per game state, maybe have a sequence of vibrations per game state
    • this is so that the vibrations are felt longer by the player
  • help Ishaan with Fruit Ninja game
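The sequence-of-vibrations idea above can be sketched as a table of (effect, pause) pairs per game state. All sequence values here are illustrative placeholders, and the play/wait callbacks stand in for the real motor-driver and delay calls.

```python
# Sketch of playing a sequence of effects per game state instead of a
# single vibration, so the feedback is felt longer by the player.
# All sequence values are illustrative placeholders.

SEQUENCES = {
    # (effect_id, pause_ms) pairs played back to back
    "bomb_cut": [(118, 200), (118, 200), (118, 0)],
    "fruit_cut": [(47, 100), (47, 0)],
}

def play_sequence(state, play_effect, wait_ms):
    """Play each effect in the state's sequence via injected callbacks."""
    for effect_id, pause in SEQUENCES[state]:
        play_effect(effect_id)
        if pause:
            wait_ms(pause)
```

Injecting `play_effect` and `wait_ms` keeps the sequencing logic testable off the hardware.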

Logan’s Status Report 4/24/2021

This past week I made incremental improvements to our vision-based positional tracking system. After progress began to level off, I reached out to Ishaan to see where I might be able to help on the game side of the project. I have begun writing the Unity code to invoke a script that will send the signal for haptic feedback over Bluetooth. In the coming week I will continue to work on this, as well as on some of the 3D assets and code for the game.

We are on schedule.

This coming week I will be working closely with Ishaan to help with the Unity game design portion of the project, as well as the integration of haptic feedback.

Arthur’s Status Report for 04/10/21

This week I worked on the following:

  • finished soldering the parts on the glove
    • FLORA Arduino
    • Arduino Bluetooth Module
    • Breadboard
    • Mux
    • 3 motor controllers
    • 3 vibrating mini motor discs
    • Infrared LED & 330 Ohm Resistor
  • haptic feedback Arduino scripts
    • the Arduino is able to control each of the 3 motor controllers on the glove through the multiplexer
    • just using 1 type of vibration for now
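Addressing one of the three motor controllers through the multiplexer amounts to enabling a single mux channel. The sketch below assumes an I2C multiplexer in the style of the TCA9548A (an assumption; the report does not name the exact part), where channel n is selected by writing a byte with only bit n set.

```python
# Sketch of mux channel selection, assuming a TCA9548A-style I2C
# multiplexer (an assumption -- the exact part is not named in the report).
# Channel n is enabled by writing the byte (1 << n) to the mux address.

def mux_select_byte(channel):
    """Control byte that enables exactly one mux channel (0-7)."""
    if not 0 <= channel <= 7:
        raise ValueError("channel out of range")
    return 1 << channel
```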

Still on schedule

Next week:

  • demo
  • start adding more complexity for the haptic feedback
    • different vibrations
    • based on Fruit Ninja game state
      • this requires connecting the Raspberry Pi to the Arduino via Bluetooth
      • so that Unity can send data to the Pi, which then transmits it to the Arduino

Logan’s Status Report 4/10

Our team has been meeting in Hammerschlag very frequently. Previously we were working synchronously on the same aspects of the system together. To improve efficiency we decided to work in parallel on separate aspects of the system, which has worked well. I focused on the motion tracking all week, and was able to write a script that will be sufficient for our demo next week. I also worked on the serial communication between the Raspberry Pi and the PC. I ran into a number of problems in doing so, but later realized the voltage of my laptop’s USB port was greater than what the Raspberry Pi expected, and after using a different computer it worked correctly.

We caught up significantly this week, and are roughly back on schedule. The remainder of the semester will be dedicated to further improving our MVP system.

Coming up this next week we are meeting again before the demo to finalize our system and test it. We do not plan to meet during Carnival.

Team Status Report 4/10

Significant Risks

  • Motion tracking is functional but needs smoothing. Sometimes there is a jittering effect when holding the glove still.
  • We solved the previous risk of IR LEDs needing a certain orientation to be detected by reducing the resistance of the resistor, and now they are bright from all angles.
  • From the very beginning of the project we decided to focus mostly on the glove, and the AR aspect was intentionally left as a ‘nice to have’. We are able to plug a webcam into the laptop and show the game with a background of the player’s living room for example, but it is not very intuitive. It would be better if the game was being displayed on true AR goggles, but this was not in our original scope so it is unlikely that we will attempt to add that.

Changes Made

  • We actually changed back to infrared LED tracking with a Gaussian blur to differentiate it from other bright objects in the room. Currently motion tracking works very well in low-light rooms.
  • We are back to our original plan of the game running on the laptop. The player simply plugs in the Raspberry Pi system over USB, and the data is transmitted via a serial connection, as outlined in our original design.
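The blur-then-find-brightest-spot approach described above can be sketched with plain numpy. The real pipeline presumably uses OpenCV (e.g. `cv2.GaussianBlur`); this standalone version uses a separable kernel so it runs without OpenCV, and the function names are our own.

```python
import numpy as np

def gaussian_kernel(size=5, sigma=1.0):
    """1-D Gaussian kernel, normalized to sum to 1."""
    x = np.arange(size) - size // 2
    k = np.exp(-(x ** 2) / (2 * sigma ** 2))
    return k / k.sum()

def blur(gray, size=5, sigma=1.0):
    """Separable Gaussian blur (stand-in for OpenCV's GaussianBlur)."""
    k = gaussian_kernel(size, sigma)
    rows = np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 1, gray)
    return np.apply_along_axis(lambda c: np.convolve(c, k, mode="same"), 0, rows)

def brightest_point(gray):
    """(row, col) of the maximum after blurring -- the IR LED candidate."""
    smoothed = blur(gray.astype(float))
    return np.unravel_index(np.argmax(smoothed), smoothed.shape)
```

Blurring first suppresses single-pixel specular glints, which is what lets the tracker differentiate the LED from other bright objects in a low-light room.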

Schedule

We are back on schedule. Our MVP is nearing completion for the demo this week. For the remainder of the semester we will use the pre-planned slack time to further improve our system. This includes making the tracking algorithms work better in rooms with other light sources, improving the game interface itself, and integrating IMU data into our motion tracking to develop a predictive algorithm.

Ishaan’s Status Report 4/10/2021

This week I completed the following:

  • Fruit Ninja Unity Tasks:
    • Added a blade that creates collisions and follows the mouse
    • Added fruits – watermelon and mango
    • Added collision detection to check when fruits are cut
    • Added physics motion for the fruits – they follow projectile motion
    • Wrote code to fetch input from the webcam and display the footage in the game. This serves as the game background and creates an AR effect
    • Created a fruit spawner that randomly spawns a fruit every 0.1–1 s
    • Created bombs and a bomb spawner; a bomb is spawned every 2–5 s
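The spawner timing above boils down to drawing a random delay per spawn. A minimal sketch of that timing logic (the real spawners are Unity components; the function names here are illustrative):

```python
import random

# Sketch of the spawner timing described above: a fruit every 0.1-1 s,
# a bomb every 2-5 s. The real spawners are Unity components.

def next_fruit_delay(rng=random):
    """Seconds until the next fruit spawn."""
    return rng.uniform(0.1, 1.0)

def next_bomb_delay(rng=random):
    """Seconds until the next bomb spawn."""
    return rng.uniform(2.0, 5.0)
```

Passing an explicit `rng` makes the spawn cadence reproducible for testing.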

Progress Status – on schedule

Tasks for next week:

  • Add new fruits – pineapple, apples
  • Add a lives counter, affected by the bombs
  • Improve the mouse 3D object – make it look like a 3D sword in the game

 

Arthur’s Status Report for 04/03/21

We met several times this week on campus in HH 1307. We unfortunately hit an obstacle in our positional tracking of the glove: we discovered that doing background subtraction in OpenCV with IR LEDs adds far too much latency. Moving an IR LED across the screen with grayscaling and background subtraction showed a one-second delay on screen.

We’ve almost completed soldering the parts for the glove (FLORA, mux, motor controllers, battery holder). We still need to solder the Bluetooth module and the IMU (which has not yet arrived).

Still on schedule for the demo

Next week, we will finish soldering all the parts on the glove, finalize the positional tracking, and write the haptic feedback Arduino code.

Team Status Report for 04/03/21

Significant Risks

  • Background subtraction in OpenCV adds a lot of latency, delaying the movement seen on screen.
  • The orientation of the IR LEDs seems to be a problem. The LEDs apparently need to be pointed directly at the camera, which is unrealistic given that the game consists of swiping across the camera (we can’t expect the player to keep the LED pointed at the camera perfectly).

Changes Made

  • Use regular LEDs, since the latency is much better (there is no need to do grayscaling and use the IR LEDs as white LEDs if we can simply use color LEDs). The intensity of colored LEDs is also stronger, so the camera will be able to pick them up at larger distances
  • We are considering running the whole Fruit Ninja game on the Raspberry Pi, but we have not yet finalized this

Schedule

We are a little behind schedule, but we will be caught up by the time we arrive at the demo, since we plan to have a basic game and the glove fully built.