April 6th Status Report — Surya Chandramouleeswaran

This was a busy yet exciting week for our group in preparation for the interim demonstration. I continued work on the hardware and classification components of our project with the goal of shifting to integration in the coming weeks.

Operating the RPI in a "headless" manner continued to present its own challenges. Interfacing directly with the board required an SSH connection and a remote viewer (typically RealVNC), which could be quite slow at times; as a result, observing camera performance over the SSH connection came with significant frame lag and limited resolution. My goal for the coming weeks is to improve our camera performance by switching to a headed (monitor-attached) setup and trying a USB camera in place of the third-party Arducam module.

Dependencies associated with classification hardware:

The camera’s first image:


To elaborate, the plan is to automatically capture an image once the algorithm is at least 80 percent confident that it has classified the object correctly. The formal classification operates on the backend, but an 80 percent score from the rudimentary algorithm I’ve implemented on the RPI typically indicates the image is of sufficient quality, making it a good heuristic for deciding when to capture. The resulting image then needs to be sent to the database. Once our web application is hosted, I’ll register the IP addresses of both RPIs with the database so that it will accept images from them. The user will then have the option to accept the image or reject it if the quality is insufficient. I will implement these steps as soon as the new cameras arrive (probably next week).
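The capture logic described above can be sketched as follows. This is a minimal illustration of the confidence-gated capture idea, not our actual implementation: `grab_frame`, `classify_frame`, and `send_to_database` are hypothetical stand-ins for the RPI camera interface, the on-board classifier, and the database upload step.

```python
# Confidence-gated capture: only save and forward a frame once the
# on-RPI classifier reports at least 80 percent confidence.
# grab_frame / classify_frame / send_to_database are placeholders.

CONFIDENCE_THRESHOLD = 0.80  # heuristic: >= 80% implies a usable image


def should_capture(confidence, threshold=CONFIDENCE_THRESHOLD):
    """Return True when the classifier is confident enough to save the frame."""
    return confidence >= threshold


def capture_loop(grab_frame, classify_frame, send_to_database):
    """Poll frames until one clears the threshold, then forward it."""
    while True:
        frame = grab_frame()
        label, confidence = classify_frame(frame)
        if should_capture(confidence):
            # Forward the frame to the backend; the user can still
            # accept or reject it through the web application.
            send_to_database(frame, label, confidence)
            return label, confidence
```

The single threshold constant keeps the heuristic easy to retune once we see how the new USB camera performs.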

Verification on the hardware side consists of evaluating classification accuracy alongside latency. There are inherent design tradeoffs between algorithm complexity, hardware limitations, and the time scale over which we want the product to operate.

This also entails evaluating system performance under various environmental conditions. Namely, we plan to conduct tests under different lighting conditions, angles, and distances to understand whether the algorithm maintains consistent accuracy and latency across different scenarios.

We are looking for an 80 percent confidence metric on the hardware side, since the hardware just needs to take a sufficiently clear picture and forward it to the database. Verification will therefore entail checking that classification accuracy meets or exceeds this threshold while maintaining reasonable latency. Finally, it is important to balance quantitative verification with qualitative verification, so we will revisit these ideas with Prof. Savvides and Neha (our assigned TA) to build a thorough verification plan for system performance on the hardware end.
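One way to organize the environmental testing described above is a sweep over labeled test images grouped by condition, reporting per-condition accuracy and mean latency against the 80 percent target. This is only a sketch of the bookkeeping; `run_classifier` is a hypothetical stand-in for the RPI pipeline, and the condition labels are illustrative.

```python
# Verification sweep: group labeled test images by capture condition
# (e.g. lighting, angle, distance) and check whether classification
# accuracy meets the 80 percent target while tracking mean latency.
import time
from collections import defaultdict

ACCURACY_TARGET = 0.80  # from the 80 percent requirement


def verify(test_set, run_classifier):
    """test_set: iterable of (condition, image, true_label) tuples."""
    stats = defaultdict(lambda: {"correct": 0, "total": 0, "latency": 0.0})
    for condition, image, true_label in test_set:
        start = time.perf_counter()
        predicted = run_classifier(image)
        stats[condition]["latency"] += time.perf_counter() - start
        stats[condition]["total"] += 1
        stats[condition]["correct"] += int(predicted == true_label)

    report = {}
    for condition, s in stats.items():
        accuracy = s["correct"] / s["total"]
        report[condition] = {
            "accuracy": accuracy,
            "mean_latency_s": s["latency"] / s["total"],
            "passes": accuracy >= ACCURACY_TARGET,
        }
    return report
```

A per-condition report like this makes it easy to see which scenarios (dim lighting, oblique angles, long distances) drag accuracy below the threshold, which is exactly the qualitative-plus-quantitative picture we want to bring to Prof. Savvides and Neha.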

Our progress is coming in well on all fronts and I am excited to continue improving the performance of our design.
