Tomas’s Status Report for 4/30/22

This week I worked on the following areas:

  • Finishing CV verification testing
  • Performing user validation testing
  • Improving the CV subsystem
  • Designing and preparing to create a hardware enclosure

As part of finishing CV verification testing, I performed contact detection tests. These tests measure how accurately the system identifies which object is currently held while an object is actually being held. I performed three trials of grasping and releasing each of the objects in our chosen set of household objects; accuracy is the number of frames in which the correct object was predicted, divided by the total number of frames during which the force sensor detected contact. I also performed an additional trial in which I rotated through the objects I picked up. A summary of the results is below (Figure 1). Notably, the scissors individually pass our goal accuracy of 90%, while the mayo jar and coffee cup fall slightly short. Testing showed that these two objects are sometimes confused for one another during object detection, an issue that was not caught during object detection testing (possibly because of its more controlled environment). To fix this before the demo, I may swap one of these objects for another object in the 91-class SSD model we are using; possible household objects I will try include “sports ball”, “hair drier”, and “fruit”.

Test          Accuracy
Mayo Jar      85%
Coffee Cup    89%
Scissors      96%
All           100%

Figure 1. Contact detection accuracy trials
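
For reference, below is a minimal sketch of how the accuracy figures above are computed: the fraction of contact frames whose predicted label matches the object actually held. The per-frame record format and field names (contact, predicted_label) are hypothetical stand-ins for our logging format, not the actual code.

```python
# Minimal sketch of the contact-detection accuracy metric: correct-prediction
# frames divided by total frames during which the force sensor detects contact.
# The per-frame record format here is a placeholder, not our actual log schema.

def contact_accuracy(frames, true_label):
    """Fraction of contact frames whose predicted label matches the held object."""
    contact_frames = [f for f in frames if f["contact"]]  # force sensor active
    if not contact_frames:
        return 0.0
    correct = sum(1 for f in contact_frames if f["predicted_label"] == true_label)
    return correct / len(contact_frames)

# Example: a single grasp-and-release trial with the coffee cup
trial = [
    {"contact": False, "predicted_label": None},
    {"contact": True,  "predicted_label": "cup"},
    {"contact": True,  "predicted_label": "bottle"},  # misprediction on this frame
    {"contact": True,  "predicted_label": "cup"},
]
print(contact_accuracy(trial, "cup"))  # 2 of 3 contact frames correct, about 0.67
```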

For user validation testing, Harry, Kevin, and I spent a few hours on Thursday defining the logistics for the test. Then on Friday we ran the tests with two music technology students. Overall, we received positive feedback on the design and responsiveness of our MIDI device, as well as critical feedback on areas of the system we could improve.

One of these areas for improvement was that new objects were erroneously tracked while a tracked object was already being held. To solve this issue, I implemented a fix in the CV subsystem that stops new object detection while the force sensor is activated.
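
As a rough illustration of that fix (a sketch, not our actual code), the detection step can simply be skipped whenever the force sensor reports contact; the helper names below are placeholders for the real Due polling and SSD inference routines.

```python
# Sketch of gating new-object detection on the force sensor state.
# force_sensor_active() and detect_objects() are placeholder stubs for the
# actual Arduino Due polling and Jetson SSD inference in our system.

def force_sensor_active() -> bool:
    return False  # placeholder: would poll the Due over serial

def detect_objects(frame):
    return []  # placeholder: would run SSD inference and return labels

current_object = None  # label of the object currently tracked as held

def process_frame(frame):
    global current_object
    if force_sensor_active():
        # Contact detected: an object is already held, so keep the current
        # track and do not register any newly detected objects.
        return current_object
    detections = detect_objects(frame)
    current_object = detections[0] if detections else None
    return current_object
```

The key point is that the gate sits ahead of detection, so no competing track can be created while contact is ongoing.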

I also worked with Kevin to design an enclosure (essentially a tray) for our Jetson, Due, and breadboard to keep the overall system organized. We initially planned to 3D print this tray, but after receiving feedback from TechSpark faculty, we decided instead to laser cut two pieces of acrylic and glue them together, which is both faster and cheaper. Kevin drafted a vector file for the cut, while I got trained on the laser cutter. Tomorrow I will meet with the faculty member and perform the laser cutting.

Overall, we are in good shape and I am on schedule. Tomorrow we will perform one final round of validation testing following the changes we implemented. Next week I will also help put together the poster, demo video, and demo for our project.
