Isha Status Report 4/14

We finished integrating our third subsystem last Sunday, so this week we worked as a group to identify and fix several bugs in the system. The system works, but it is slow due to various inefficiencies: response time after the initial integration averaged several seconds.

Our progress is on schedule.

Next Steps:
Continue working as a group to fix bugs and optimize performance

Tanushree Status Report 4/14

Now that we have integrated all our subsystems, all our remaining time will be spent debugging functionality errors and refining the user interface and experience. Thus, going forward, we will essentially be working as a group rather than assigning individual tasks, unless necessary. This week we met on Thursday and resolved a couple of known bugs, but we also identified new ones.

My progress is on schedule. As mentioned, all the remaining time will now be used for debugging and refinement.

Suann Status Report 4/6/19

The Arduino code was tied together this week. The tap subsystem code was tested to see whether it worked as expected; the subsystem sent user tap data accurately and quickly every time a tap was sensed. However, the tap threshold varied greatly depending on the poster board setup: the threshold was higher when the poster board was placed in its stand and assembled at full height, and lower when the same stand setup was only partially assembled and placed on a table.
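
The tap-sensing logic described above can be sketched in Python as a simple threshold check with a short refractory window so that one physical tap is not counted several times. This is an illustrative sketch only; the function name, sample values, and the refractory-window idea are assumptions, not the project's actual Arduino code.

```python
# Hypothetical sketch of threshold-based tap detection. A tap is
# registered when a sensor reading crosses the threshold; a short
# refractory window then suppresses readings from the same tap.

def detect_taps(samples, threshold, refractory=3):
    """Return the indices of samples where a tap is registered."""
    taps = []
    cooldown = 0
    for i, value in enumerate(samples):
        if cooldown > 0:
            cooldown -= 1       # still inside the refractory window
            continue
        if value >= threshold:
            taps.append(i)      # reading crossed the threshold: a tap
            cooldown = refractory
    return taps

readings = [2, 3, 40, 38, 5, 2, 1, 55, 6, 2]
print(detect_taps(readings, threshold=30))  # -> [2, 7]
```

The threshold-dependence noted above corresponds directly to the `threshold` parameter here: a taller, fully assembled stand would need a larger value than a partially assembled one lying on a table.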

I will tune the tap threshold in the setting of the final demo. I will also help verify our product requirements.

Tanushree Status Report 4/6

Isha and I tested our integrated subsystems last Sunday with real-time input. The integrated system works, but with a certain margin of error. Isha has implemented a second iteration of the calibration function to reduce these errors.

During Wednesday’s lab, all three of us tested the tap detection subsystem to calculate an appropriate threshold value. We conducted this test in three different environments and settled on the best one, which we hope to use for our final demo.
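
One way to turn recordings like these into a threshold is to place it between the loudest ambient noise and the quietest deliberate tap. The sketch below is a hypothetical illustration of that idea; the function name, the `margin` parameter, and all sample values are invented, not taken from our test data.

```python
# Hypothetical threshold selection from test recordings in one
# environment: put the threshold part-way between the ambient noise
# ceiling and the weakest deliberate tap peak.

def pick_threshold(noise_samples, tap_peaks, margin=0.5):
    """Place the threshold between ambient noise and tap peaks.

    margin=0.5 puts it halfway; values closer to 0 favor sensitivity,
    values closer to 1 favor noise rejection.
    """
    noise_ceiling = max(noise_samples)
    tap_floor = min(tap_peaks)
    return noise_ceiling + margin * (tap_floor - noise_ceiling)

noise = [1, 2, 3, 2, 4]   # readings with no taps (invented values)
taps = [40, 55, 38]       # peak readings from deliberate taps
print(pick_threshold(noise, taps))  # -> 21.0
```

Running this per environment makes it easy to compare how much headroom each setup leaves between noise and taps.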

I also worked on integrating Suann’s tap detection code with the rest of our system. All three subsystems (GUI, color detection, and tap detection) are now integrated. My future tasks include:

1. testing this integration with real-time input

2. refining the GUI to make it more appealing to the user

I am currently on schedule.

Isha Status Report 4/6

We decided to do our real-time testing with the second classifier, which takes about 0.8 seconds on average to finish computations. I tried a few different classification methods, svm.LinearSVC and the SGD classifier, since they are supposed to scale better. However, our original classifier still performs significantly better, so we decided these methods are not worth pursuing further.
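
Averages like the 0.8-second figure above can be collected with a small timing harness; the sketch below shows one generic way to do it. The harness itself and the stand-in workload are assumptions for illustration, not our actual measurement code or classifier.

```python
# Generic timing harness using the standard library: average the
# wall-clock time of a callable over several runs.
import time

def average_runtime(fn, runs=10):
    """Average wall-clock seconds per call of fn() over several runs."""
    start = time.perf_counter()
    for _ in range(runs):
        fn()
    return (time.perf_counter() - start) / runs

# Stand-in workload instead of a real classifier's predict call:
avg = average_runtime(lambda: sum(i * i for i in range(10_000)))
print(f"{avg:.6f} s per call")
```

Timing the full pipeline (capture, detection, classification) this way also helps locate which stage dominates the response time.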

Tanu and I finished debugging our detection and GUI integration on 3/31. Our system now works in real time. We added on-screen buttons as a placeholder for taps to trigger color detection.

The first iteration of the calibration system that I wrote last week is working. For this system, we detect the coordinates of the top-left and bottom-right corners of the GUI screen using our existing color detection. After repeatedly testing in real time, we realized that because a user has to position the red dot at the corner of the screen for calibration, there is significant variability in the detected coordinates. Thus the mapping of coordinates from detection to GUI is not accurate: each mapped coordinate is consistently a fixed distance away from the target coordinate.
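
The corner-based mapping described above amounts to a linear interpolation between the two detected corners. The sketch below illustrates the idea; the function name, corner coordinates, and window size are all assumed for the example, not taken from our code.

```python
# Hypothetical sketch of the first calibration iteration: given the
# detected camera-space coordinates of the GUI's top-left and
# bottom-right corners, linearly map any detected point into GUI
# pixel coordinates.

def map_to_gui(point, cam_tl, cam_br, gui_size):
    """Linearly map a camera-space point into GUI coordinates."""
    x, y = point
    (x0, y0), (x1, y1) = cam_tl, cam_br
    w, h = gui_size
    gx = (x - x0) / (x1 - x0) * w
    gy = (y - y0) / (y1 - y0) * h
    return gx, gy

# Example: corners detected at (100, 50) and (500, 350),
# GUI window of 800x600 pixels.
print(map_to_gui((300, 200), (100, 50), (500, 350), (800, 600)))
# -> (400.0, 300.0)
```

Any error in the detected corner positions shifts every mapped point, which matches the fixed-offset error we observed.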

To adjust for this error, I wrote a second iteration of calibration. This version consists of a splash screen that appears on loading the program, as shown below. A third coordinate is detected at the center of the screen to adjust for differences in the detected coordinates of the two corners of the GUI.
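
Since the mapping error is a roughly constant offset, a third detected point at the known screen center can be used to measure and cancel it. The sketch below is an assumed illustration of that correction, not our actual implementation; all names and values are invented.

```python
# Hypothetical sketch of the second calibration iteration: compare
# where the detected center point maps to against the true GUI
# center, treat the difference as a constant offset, and subtract it
# from every subsequently mapped coordinate.

def center_offset(mapped_center, gui_size):
    """Offset between the mapped center point and the true GUI center."""
    w, h = gui_size
    return mapped_center[0] - w / 2, mapped_center[1] - h / 2

def correct(point, offset):
    """Cancel the constant calibration offset from a mapped point."""
    return point[0] - offset[0], point[1] - offset[1]

# Example: the center maps to (420, 310) on an 800x600 GUI, so the
# offset is (20, 10); every mapped point is shifted back by that much.
off = center_offset((420.0, 310.0), (800, 600))
print(correct((400.0, 300.0), off))  # -> (380.0, 290.0)
```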

Next steps:

  1. Test the second iteration of the calibration function in real time. If this function does not give good results, possibly change it to project and detect red squares at the corners of the screen so that they are always at fixed positions.
  2. Compile the detection results from the 16 test images we took on Wednesday into a table

I am on schedule.

Team Status Report for 4/6/19

The most significant risk is the error rate in the calibration function. We are currently trying to mitigate this through different approaches, such as re-adjusting the reference coordinates using a third central coordinate to recalculate the width and height. Additionally, the tap detection threshold seems fairly arbitrary given the data we have collected so far, so testing at the final demo site is necessary.

We have added an additional step to the calibration function: detecting a third coordinate at the center of the GUI screen to re-adjust for calibration error. To implement this change cleanly, we have added a splash screen that is solely meant for calibration. After calibration is complete, the user is directed to the main screen consisting of the six buttons required for our poster.

No schedule changes have occurred this week.