I did not get to work on capstone as much this week due to exams and other deadlines, but I did manage to get a few things done:
1) Testing of Clutter Detection
I had taken a bunch of images last week, so this week I spent time running them through my algorithm and checking its accuracy. I found several things:
– I was detecting clutter whenever the reflection in the stainless-steel appliances changed
– I needed to use different clean images depending on the lighting conditions of the counter
– If humans came into the scene, everything would just go haywire
– The counter is made of marble, so it sometimes reflects light differently and can also trigger false clutter detections. Since I can’t do anything about this case, it is the main reason we need an error threshold.
2) Integrating Testing Server into actual Server
I forwarded the server I made to Jeffrey so that he can integrate it into the actual hub.
3) Coming Up with Solutions for the Issues in Clutter Detection
To resolve the issues that came up while testing clutter detection, I decided to add the following to the code.
– The ability to select which of several clean images should be used as the base. This is still in a very basic and rudimentary form (a rough sketch follows this list).
– Detecting humans by checking for a very large contour and, if one is found, ignoring the current input image. Searching for the biggest contour in the image should be enough, since the only time a person should interfere is when they are between the clutter zone and the camera (a sketch follows this list).
– Extending the ignore zone to a curved line, so that it can ignore the appliances on the edge of the zone, since the user wants them there (a sketch follows this list).
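For the clean-image selection, the simplest heuristic I can think of is to pick the clean image whose overall brightness is closest to the current frame. Here is a minimal sketch, assuming the pipeline uses OpenCV and grayscale frames; the file names and the mean-brightness heuristic are placeholders, not the actual implementation:

```python
import cv2
import numpy as np

# Hypothetical set of clean reference images, one per lighting condition.
CLEAN_IMAGES = ["clean_morning.jpg", "clean_afternoon.jpg", "clean_night.jpg"]

def pick_clean_image(frame_gray):
    """Return the clean image whose mean brightness is closest to the current frame's."""
    best_img, best_diff = None, float("inf")
    for path in CLEAN_IMAGES:
        clean = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
        if clean is None:
            continue
        diff = abs(float(np.mean(clean)) - float(np.mean(frame_gray)))
        if diff < best_diff:
            best_img, best_diff = clean, diff
    return best_img
```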
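The human check could look something like the sketch below, again assuming OpenCV and grayscale frames; the difference threshold and the contour-area cutoff are placeholder values that would need tuning:

```python
import cv2

HUMAN_AREA_THRESHOLD = 50_000  # pixels; hypothetical cutoff for "a very large contour"

def frame_has_person(frame_gray, clean_gray):
    """Return True if the difference from the clean image contains one very large contour."""
    diff = cv2.absdiff(frame_gray, clean_gray)
    _, mask = cv2.threshold(diff, 40, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return False
    largest = max(cv2.contourArea(c) for c in contours)
    return largest > HUMAN_AREA_THRESHOLD
```

If this returns True, the current input image would simply be skipped instead of being scored for clutter.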
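For the curved ignore zone, one way to express it is as a mask built from points along the curve, zeroing out everything on the appliance side before the clutter comparison. A minimal sketch, assuming OpenCV; the point ordering and which side gets ignored are assumptions:

```python
import cv2
import numpy as np

def curved_ignore_mask(shape, curve_points):
    """Build a mask that zeros out everything above a curved boundary.

    `curve_points` is a list of (x, y) points along the curve, ordered left to right.
    """
    h, w = shape[:2]
    # Close the polygon along the top edge so the region above the curve is filled.
    polygon = np.array(curve_points + [(w - 1, 0), (0, 0)], dtype=np.int32)
    mask = np.full((h, w), 255, dtype=np.uint8)
    cv2.fillPoly(mask, [polygon], 0)  # zero out the ignore zone
    return mask

# Applying it to a frame before the clutter comparison:
# mask = curved_ignore_mask(frame_gray.shape, points)
# frame_gray = cv2.bitwise_and(frame_gray, frame_gray, mask=mask)
```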