Aneek’s Status Update for 4/25

This week we worked together on the final features of our project as well as the final presentation slides. The finger detection portion is largely complete, so I have been focusing on UI features and the presentation slides.

Aneek’s Status Update for 4/18

This week I worked on smoothing out the “loading” animation by averaging the tracked points to minimize the natural jitter of the user’s finger. I also worked on the outward-facing API that embeds into Connor’s UI, and we worked together to integrate the individual pieces for our demo.
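As a rough sketch of the averaging approach (the smoothing factor and class name here are illustrative, not the tuned values in our code):

```python
# Exponentially weighted moving average over fingertip positions:
# a smaller ALPHA favors stability, a larger ALPHA favors responsiveness.
ALPHA = 0.3  # illustrative smoothing factor

class PointSmoother:
    def __init__(self):
        self.x = None
        self.y = None

    def update(self, raw_x, raw_y):
        """Blend the newest detected point into the running estimate."""
        if self.x is None:                  # first sample seeds the estimate
            self.x, self.y = raw_x, raw_y
        else:
            self.x = ALPHA * raw_x + (1 - ALPHA) * self.x
            self.y = ALPHA * raw_y + (1 - ALPHA) * self.y
        return int(self.x), int(self.y)     # pixel coordinates for drawing
```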

Aneek’s Status Update for 4/11

This week I worked on two goals: creating a tap “loading” animation in OpenCV and making the API for the finger detection friendlier for Connor’s UI. The first goal has gone smoothly, although there is some jitter in the animation due to small fluctuations in the recognized fingertip position. I will be working to improve this in the following week and have a couple of ideas: anchoring the animation to a set point captured at the start and ignoring subsequent shifts, or taking a running weighted average to minimize jitter while maintaining animation accuracy.

In regard to the second goal, I spent a lot of time reading about OOP in Python, as I have not explicitly used Python to create and interact with user-defined classes before. I have also been researching how the webcam feed and OpenCV overlays can be integrated into a PyGame interface to inform my decisions about the API. Mainly, I broke up what was previously a single main-function loop, used to demonstrate and visualize the performance of the finger detection, into a few different functions that each represent an individual “transaction” with the system (request/response for a tap, initialization/calibration). This coming week I will continue to learn about OOP patterns in Python and refactor the code to create and access the module as an object within an orchestrator back-end.
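As a sketch of the transaction-style, object-oriented interface I am refactoring toward (the class name, method names, and return format are placeholders, not the final API):

```python
# Hypothetical shape of the finger-detection module as an object that an
# orchestrator back-end could own; names and return format are placeholders.
import cv2

class FingerTracker:
    def __init__(self, camera_index=0):
        self.capture = cv2.VideoCapture(camera_index)
        self.background = None

    def calibrate(self):
        """Initialization transaction: sample the empty surface once."""
        ok, frame = self.capture.read()
        self.background = frame if ok else None

    def poll_tap(self):
        """Request/response transaction: grab a frame and report the result."""
        ok, frame = self.capture.read()
        if not ok:
            return None
        # ... fingertip and tap detection would run on `frame` here ...
        return {"x": 0, "y": 0, "tapped": False}  # placeholder response

    def release(self):
        """Teardown transaction: free the webcam for other processes."""
        self.capture.release()
```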

Aneek’s Status Update for 4/4

This week I was focused on getting finger detection in OpenCV working. The issue of various phantom ‘fingers’ being detected in the background was mitigated by using a plain, consistent background for the images (I tested on a white table and a black blanket). While that demonstrated that my approach was working, the background will not be so consistent when puzzle pieces are laid out on the surface below the hand, so my next step was to test various methods of improving detection on more complex backgrounds. The approach that seems the most promising is using a color mask to isolate the hand before running the edge detection to determine the fingertip. However, the color of the hand’s surface varies drastically with skin tone and lighting conditions, so my next steps are to continue researching alternative methods and/or ways to broaden the color mask so that it functions across a wide spectrum of hands.
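As one concrete version of the color-mask idea (the HSV bounds below are illustrative, and they are exactly the kind of range that needs broadening across skin tones and lighting):

```python
# Isolate skin-colored pixels in HSV space before the contour/edge step.
import cv2
import numpy as np

def hand_mask(frame_bgr):
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    lower = np.array([0, 40, 60], dtype=np.uint8)     # illustrative lower bound
    upper = np.array([25, 255, 255], dtype=np.uint8)  # illustrative upper bound
    mask = cv2.inRange(hsv, lower, upper)
    # Morphological open/close removes speckle so the edge detection
    # sees one solid hand region instead of scattered skin-colored blobs.
    kernel = np.ones((5, 5), np.uint8)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)
    mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)
    return mask
```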

Aneek’s Status Update for 3/21

This week our project changed significantly due to the COVID-related move to remote instruction. Unfortunately, the main work I’ve done so far on finger and tap detection is no longer a part of our final project, which is becoming a completely digital project and will contain more significant design trade studies on various CV algorithms. This week Andrew, Connor, and I mostly worked on how to adapt our project to these new conditions and on the exact mechanics of what this new, digital puzzle solver will look like to the user.

Aneek’s Status Update for 3/15

We focused on completing our design report and making many small but important decisions that came up over the course of writing it, which has further defined the technical details of our project. I mainly worked on my sections of the design report, as well as formatting and assembling the complete document after my teammates finished their sections, which took a surprisingly long time! MS Word formatting isn’t particularly intuitive. I also started tinkering with OpenCV so I can assist Andrew with the CV portions of our project. The hand/tap detection code is done but hasn’t been tested with the Duvetyne fabric yet, so when that comes in, I will switch gears back to it to make sure that section is complete.

My plan for next week will be dependent on what our team and our faculty and TA mentors come up with to accommodate the move to remote instruction.

Aneek’s Status Update for 2/29

This week I gave the design review presentation for our team, so I mainly focused on practicing for that and refining the slides beforehand. I also thought more about the software architecture and created a full diagram of the library structure as well as the callable functions between them, so that we can divide up the tasks very easily into different components.

Post-presentation, I reviewed the feedback on our design and continued to build out the tap detection software. The socket programming is almost complete, and the tap detection software is essentially ready for testing. Once the Duvetyne mat arrives, I will jump right into that and see what fine-tuning we can do to improve detection and lower latency.
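For illustration, the kind of logic this fine-tuning targets is a contact threshold with hysteresis; the values and names below are hypothetical, not the tuned parameters in our code:

```python
# Hypothetical tap heuristic: register a tap when the fingertip's height
# above the surface drops below a contact threshold, and re-arm only after
# it rises past a higher release threshold (hysteresis avoids double-fires).
TOUCH_MM = 8.0     # illustrative: "touching" within 8 mm of the surface
RELEASE_MM = 15.0  # illustrative: must rise past 15 mm to re-arm

class TapDetector:
    def __init__(self):
        self.touching = False

    def update(self, tip_height_mm):
        """Feed one height sample; returns True on the frame a tap lands."""
        if not self.touching and tip_height_mm < TOUCH_MM:
            self.touching = True
            return True                   # tap registered on contact
        if self.touching and tip_height_mm > RELEASE_MM:
            self.touching = False         # finger lifted; ready for next tap
        return False
```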

Aneek’s Status Update for 2/22

This week I mainly focused on testing the Leap Motion controller in various conditions and exploring the SDKs. It turns out that the company behind the controller has pivoted since I last used it, and the new SDKs are focused on VR game development, so I installed the original library (last released in 2017) to see if it would still work. It requires Python 2.7, but I was able to get a simple hand and gesture tracker up and running to explore the characteristics of its tracking. I measured that it has an effective range of about 22″ when facing upward, but we found some issues when we flipped it over to track hands while facing downward. Some more research revealed that the tracking is done through IR LEDs and cameras, so when the controller was oriented downward, the reflection of the IR light off the surface below the hands threw off the tracking software. I found a tool from Ultraleap (the company behind the controller) that allowed me to see the actual image view of what the cameras were picking up, and was able to confirm that the IR reflection washing out the image was the likely cause.

I worked with Connor to do some research on materials that would reflect less IR light and found a few options, including Duvetyne and Acktar Light Absorbent Foil. Duvetyne was the most affordable and still promised significant IR absorption, so we are planning on purchasing a sheet of it to lay underneath the frame to assist the downward-facing Leap Motion’s hand-tracking.
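The simple tracker amounted to polling frames from the controller; a minimal version looks roughly like the following, assuming the v2 SDK’s Python 2.7 bindings (Leap.py and its native library) are on the path:

```python
# Python 2.7 only: the legacy Leap SDK never shipped Python 3 bindings.
import time
import Leap  # v2 SDK bindings

def main():
    controller = Leap.Controller()
    while not controller.is_connected:   # wait for the Leap service
        time.sleep(0.1)
    while True:
        frame = controller.frame()       # latest tracking frame
        for hand in frame.hands:
            for finger in hand.fingers:
                tip = finger.tip_position   # Leap.Vector in millimeters
                print "finger %d tip: (%.1f, %.1f, %.1f)" % (
                    finger.id, tip.x, tip.y, tip.z)
        time.sleep(1.0 / 30)             # ~30 polls per second

if __name__ == "__main__":
    main()
```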

I also worked on the system/software architecture and made some diagrams for the design review slides. Because I didn’t want the old Leap Motion SDK to force all three of us to write in Python 2.7, I came up with a simple client-server communication protocol so that the other parts of the codebase can be in Python 3 and use sockets to communicate with the hand-tracking code. Since we have strict latency requirements, it is important that the added latency of using sockets, even when everything is local, isn’t too high, so I will continue to measure it and work on minimizing the amount of data that needs to be communicated.
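As a sketch of what the Python 3 side of this protocol could look like (the port number, the “TIP” request keyword, and the newline-framed JSON reply are all hypothetical choices, not a finalized protocol):

```python
# Python 3 client talking to the Python 2.7 hand-tracking server over a
# local TCP socket; the socket module's API is the same on both sides.
import json
import socket

HOST, PORT = "127.0.0.1", 5005  # assumed local address and port

with socket.create_connection((HOST, PORT)) as sock:
    rfile = sock.makefile("r")            # buffered reads of newline-framed replies
    sock.sendall(b"TIP\n")                # tiny fixed request keeps latency low
    reply = json.loads(rfile.readline())  # e.g. {"x": 12.3, "y": 45.6, "tap": false}
    print(reply)
```

Keeping requests and replies this small is also the easiest lever on the latency budget, since less data per round trip means less time spent serializing and reading.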

Team Status Update for 2/15

This week, we took some time to do our assigned reading and write-ups on The Pentium Chronicles. We also drafted a more complete system architecture and dove into the software architecture as well. We continued to research camera options and started working with OpenCV to do some performance tests.

Aneek’s Status Update for 2/15

This week I mainly worked on two things. First, I started reading the documentation on the Leap Motion SDK and its gesture detection. We are planning on using the controller in an inverted setup (facing down), and I discovered that this may cause some challenges in detecting hand orientation. Fortunately, that shouldn’t interfere with finger and tap detection, but we will need to run some tests once the device is in hand. Second, I drafted our system and software architecture and started breaking down the software into individual libraries that we can then divide up and work on. I also recorded our ideas on the initialization and the responsive animations.