The projection warp may not work in the demo environment, so we need to make sure calibration is complete before the actual demo time.
Additionally, we found that the projection calibration shifted slightly once we deployed the calibration code on Flutter. We had not accounted for the fact that Flutter applies transformations differently than Python/NumPy, which is why much of this week was spent adjusting the homography to work in Flutter. Regardless, we were able to make the necessary changes and complete integration.
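One likely source of this mismatch is storage order: NumPy homographies are 3x3 and row-major, while Flutter's `Matrix4` is 4x4 and stored column-major. As a minimal sketch (the helper name and the assumption that the app consumes `Matrix4.fromList` are ours, not from the project code), the 3x3 homography can be embedded into a 4x4 and flattened column-major before handing it to Flutter:

```python
import numpy as np

def homography_to_matrix4(h3: np.ndarray) -> list:
    """Embed a 3x3 homography into a 4x4 matrix and flatten it
    column-major, the storage order Flutter's Matrix4 expects.
    Hypothetical helper for illustration."""
    a, b, c = h3[0]
    d, e, f = h3[1]
    g, h, i = h3[2]
    # Row-major 4x4: translation/perspective terms move to row and
    # column 3 so the z coordinate passes through unchanged.
    m4 = np.array([
        [a, b, 0, c],
        [d, e, 0, f],
        [0, 0, 1, 0],
        [g, h, 0, i],
    ], dtype=float)
    # Flatten in column-major (Fortran) order for Matrix4.fromList.
    return m4.flatten(order="F").tolist()
```

For example, a pure-translation homography `[[1,0,5],[0,1,7],[0,0,1]]` should end up with `5, 7, 0, 1` as the last four (column-major) entries.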
No changes were made to the system.
No schedule changes.
Unit Tests:
- Button Press / Swipe Test
- Tested how accurately and how quickly the system recognizes and executes button gestures.
- Found that the timing is reasonable and the accuracy is about 90% (9 of 10 trials correctly recognized the gesture).
- Realized that the video frame needs to be cropped/zoomed into the button region to detect gestures; otherwise the hand is too small for gestures to be recognized.
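The cropping step noted above can be sketched as a NumPy slice on the camera frame. This is illustrative only; the region coordinates and function name are placeholders, not the project's actual button bounds:

```python
import numpy as np

def crop_button_region(frame: np.ndarray, x: int, y: int, w: int, h: int) -> np.ndarray:
    """Crop the frame to a button's region so the hand fills more of the
    image before gesture recognition. Coordinates are illustrative."""
    # Clamp to the frame bounds so a partially off-screen region
    # returns a valid (smaller) crop instead of an empty array.
    y0, y1 = max(0, y), min(frame.shape[0], y + h)
    x0, x1 = max(0, x), min(frame.shape[1], x + w)
    return frame[y0:y1, x0:x1]
```

Because NumPy slicing returns a view, the crop adds no copy overhead before it is passed to the gesture recognizer.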
- Voice command Tests
- Tested how accurately and how quickly the system recognizes and executes voice commands.
- Found that the volume and pitch of the speaker's voice impact recognition. The full system had difficulty recognizing commands from Caroline's voice but easily recognized Tahaseen's voice commands when she spoke loudly and at a low pitch.
- Recipe execution tests
- Tested how the cooking experience impacts the overall system. For example, when boiling water, we wanted to see whether the steam blocked the camera view.
- Found that the steam does not greatly impact the view, as the burner is located to the side of the camera.