Tanushree Status Report 2/23

This week I continued to work on the Determinant of Hessian (DoH) blob detection algorithm. After talking to Prof. Savvides, I realized that I could fix the scale of the blur level, i.e., fix the sigma of the Gaussian kernel used to blur the image. As a result, the algorithm I have implemented is more of a “circle” detector than a true multi-scale blob detector. Regardless, it suits our purposes.
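At a fixed sigma, the detector reduces to thresholding local maxima of the determinant-of-Hessian response. Below is a rough Python/SciPy sketch of the idea (our actual implementation is in MATLAB, and the sigma and threshold values here are just placeholders, not the ones I am using):

```python
import numpy as np
from scipy.ndimage import gaussian_filter, maximum_filter

def doh_response(image, sigma=4.0):
    """Determinant-of-Hessian response at a single, fixed scale.

    With sigma fixed, the detector responds most strongly to roughly
    circular blobs whose radius matches the kernel, which is why this
    behaves more like a circle detector than a full blob detector.
    """
    # Second-order Gaussian derivatives of the image.
    Lxx = gaussian_filter(image, sigma, order=(0, 2))
    Lyy = gaussian_filter(image, sigma, order=(2, 0))
    Lxy = gaussian_filter(image, sigma, order=(1, 1))
    # sigma^4 normalizes the response across scales.
    return sigma ** 4 * (Lxx * Lyy - Lxy ** 2)

def detect_blobs(image, sigma=4.0, threshold=0.01):
    """Return (row, col) coordinates of local maxima of the DoH response."""
    resp = doh_response(image.astype(float), sigma)
    peaks = (resp == maximum_filter(resp, size=int(3 * sigma))) & (resp > threshold)
    return np.argwhere(peaks)
```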

As has been mentioned in earlier posts, Isha and I are using the same set of images to test our algorithms. These test images were taken in an environment that closely represents our demo setup. Below are the results:

Currently, my algorithm works for some images (test 12, 14) but not for others (test 11, 13). Based on these results, one of the design decisions we have made is to color the buttons of our GUI white and keep the background darker. This seems to be a sound decision given that we only want to detect the user’s finger when he/she taps on a button.

Apart from this, I also helped my team set up the camera and projector in order to obtain the above-mentioned test images. Further, we all worked on creating the slides for the design review.

Next week I will continue to test my algorithm on different images to see where it breaks and tune parameters accordingly. I will also start working on a K-means clustering approach that combines color detection and blob detection. We hope that this clustering algorithm will give us more robust results.
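The details of the clustering are still open. As a starting point, the sketch below is plain K-means in NumPy over a generic feature matrix; for us, each row might hold a pixel's position plus its color values, though that feature set is still a guess (and, as above, our real code will be MATLAB):

```python
import numpy as np

def kmeans(features, k, iters=20):
    """Plain k-means on an (n, d) feature matrix; returns (labels, centers)."""
    features = np.asarray(features, dtype=float)
    # Farthest-point initialization: deterministic and well spread out.
    centers = features[[0]]
    for _ in range(1, k):
        dists = np.linalg.norm(features[:, None] - centers[None], axis=2).min(axis=1)
        centers = np.vstack([centers, features[dists.argmax()]])
    for _ in range(iters):
        # Assign every point to its nearest center, then recompute the means.
        labels = np.linalg.norm(features[:, None] - centers[None], axis=2).argmin(axis=1)
        centers = np.vstack([features[labels == j].mean(axis=0) if np.any(labels == j)
                             else centers[j] for j in range(k)])
    return labels, centers
```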

I am currently on schedule.

Suann Update 2/23

This week I worked on setting up the website and writing the project introduction post in accordance with the guidelines. It took me some time to figure out the website settings and get the hang of WordPress, as I have never used it before. We also worked together on the design presentation slides.

Unfortunately, I did not make much progress on Lucas-Kanade this week since I had an on-site interview. I’m behind now, but fortunately I have much more free time this upcoming week, so I will be able to dedicate a lot of it to Lucas-Kanade. Since the end of next week is my deadline for completing Lucas-Kanade, if I do not have it working by then, I will move on to setting up the Raspberry Pi, since the piezo circuit is a more crucial part of the project.
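For reference while I work on it: the core of single-window Lucas-Kanade is a 2x2 least-squares solve built from image gradients. A minimal NumPy sketch of that step (not our final implementation, which will be in MATLAB):

```python
import numpy as np

def lucas_kanade(frame1, frame2):
    """Estimate one (vx, vy) translation between two small grayscale patches.

    Classic Lucas-Kanade: assume brightness constancy and a single motion
    for the whole window, then solve the normal equations of the
    least-squares problem  [Ix Iy] v = -It.
    """
    I1 = frame1.astype(float)
    I2 = frame2.astype(float)
    # Spatial gradients (central differences) and temporal difference.
    Iy, Ix = np.gradient(I1)
    It = I2 - I1
    A = np.array([[np.sum(Ix * Ix), np.sum(Ix * Iy)],
                  [np.sum(Ix * Iy), np.sum(Iy * Iy)]])
    b = -np.array([np.sum(Ix * It), np.sum(Iy * It)])
    vx, vy = np.linalg.solve(A, b)
    return vx, vy
```

This only holds for small displacements; larger motion would need the usual windowed/pyramidal extensions.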

I will get Lucas-Kanade working by next week.

Isha Status Report 2/23

I helped my team set up our camera and projector and compile a bank of test images. We also worked on our presentation slides for the design review. Since I will be presenting, I have been writing a script for the presentation.

I have also been experimenting with the LAB and HSV color spaces. Right now, I am applying simple thresholding to the new test data we took this week. Below is a table of the results I have compiled; the marker overlaid on each HSV and LAB image indicates the detected coordinate of the red dot. The HSV result for image Test11 is significantly off from the desired coordinate. I will begin to work more with the camera and test both color spaces on real-time data. Next week, I hope to make a data-driven decision about which color space best suits our purposes. My progress is on schedule.
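For illustration, here is a Python sketch of the HSV branch of this approach: convert to HSV, threshold the red hues (red wraps around the ends of the hue circle), and take the centroid of the surviving pixels. Our code is in MATLAB, and the threshold values below are placeholders, not the ones I am actually using:

```python
import numpy as np

def rgb_to_hsv(rgb):
    """Vectorized RGB (floats in [0, 1]) -> HSV (all channels in [0, 1])."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    mx = rgb.max(axis=-1)
    mn = rgb.min(axis=-1)
    diff = mx - mn
    h = np.zeros_like(mx)
    mask = diff > 0
    # Hue depends on which channel is the maximum.
    rm = mask & (mx == r)
    gm = mask & (mx == g) & ~rm
    bm = mask & (mx == b) & ~rm & ~gm
    h[rm] = ((g[rm] - b[rm]) / diff[rm]) % 6
    h[gm] = (b[gm] - r[gm]) / diff[gm] + 2
    h[bm] = (r[bm] - g[bm]) / diff[bm] + 4
    h /= 6.0
    s = np.where(mx > 0, diff / np.maximum(mx, 1e-12), 0.0)
    return np.stack([h, s, mx], axis=-1)

def find_red_dot(rgb):
    """Threshold red hues and return the centroid (row, col) of matching pixels."""
    hsv = rgb_to_hsv(rgb)
    h, s, v = hsv[..., 0], hsv[..., 1], hsv[..., 2]
    # Red wraps around hue 0, so accept both ends of the hue circle.
    mask = ((h < 0.05) | (h > 0.95)) & (s > 0.5) & (v > 0.3)
    if not mask.any():
        return None
    rows, cols = np.nonzero(mask)
    return rows.mean(), cols.mean()
```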

Team Status Report 2/23

Significant Risks:

Tanushree’s laptop’s hard drive had to be replaced. Since her work mainly uses MATLAB, she was able to continue working on the cluster computers, but the problem still added a delay to our schedule.

Our projector stand was not tall or stable enough to project onto the lab bench, so to mitigate this risk we changed our projection surface to a poster board. We will now be creating a GUI for our final interactive poster.

Changes to Existing Design of the System:

We have made changes to our individual task assignments. Suann is now solely working on implementing Lucas-Kanade, while Tanushree is focusing on blob detection. Further, Isha will be exploring different color spaces to decide which one best fits our purposes.

We received our parts well before the expected date, so we have started assembling our demo setup. This has allowed us to create a standardized bank of test images taken in our potential demo environment. Isha and Tanushree are both using these images as input for testing their respective algorithms. Another advantage of having a preliminary setup is that we were able to choose a GUI color scheme that suits our algorithms and prevents occlusion.

Updated Schedule

 

Introduction and Project Summary

Intro to InteracTable

Have you ever been frustrated by having to look over someone’s shoulder during a group meeting because your laptop died? Unhappy with Google Drive’s Terms of Service? Wanted more space to work on? Then InteracTable is the right device for you!

By using the InteracTable, you will be able to turn any table top into your own laptop screen. You will be able to resize your working surface area, and best of all, you will be able to collaborate with teammates on any table.

In order to create the InteracTable, we will connect a projector to a user’s laptop, and the laptop screen will be projected onto a flat table top. We plan to use computer vision (coded in MATLAB) to track a red dot that we place on a user’s finger. To capture this information, a small webcam will send video back to the laptop for processing. To confirm a user tap, we plan to use a piezo sensor connected to a Raspberry Pi. For this proof-of-concept prototype, we will constrain the system to work with only one finger.
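The intended fusion of the two inputs is simple: vision supplies the fingertip coordinate, and the piezo supplies the tap event. A hypothetical sketch of that decision logic (all names and the button-rectangle format are illustrative, not our actual code):

```python
def resolve_tap(finger_xy, buttons, piezo_triggered):
    """On a piezo trigger, map the tracked fingertip to the button it hit.

    `buttons` maps a button name to an (x0, y0, x1, y1) screen rectangle.
    Returns the button name, or None if there was no tap or no button
    under the finger.
    """
    if not piezo_triggered or finger_xy is None:
        return None
    x, y = finger_xy
    for name, (x0, y0, x1, y1) in buttons.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return None
```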

The InteracTable will be a competitive product because, unlike a capacitive touch-screen table, it will be low-cost and portable. While this is a proof of concept, we hope to revolutionize the way people collaborate by creating the InteracTable.

Project Proposal Slides