Edward’s Status Report for April 30

After I submitted last week’s status update, Joanne and I worked on fixing the bugs in the Webapp. We spent several hours cleaning up the code and tracking down what was causing the issues. We had been seeing the model freeze up or move in the opposite direction when the wearable sends its finger locations. After debugging, we finally found the issue and fixed it. This video was made right after we fixed all the bugs!

This week, I prepared for and worked on the final presentation. Anushka was feeling better, so I handed the wearable off to her so she could properly sew it on. Without any hardware on hand, I spent most of the week on the presentation itself.

I am on schedule and essentially done with the individual tasks I needed to finish this semester! Next week, I will help polish everything for the final demo. We still need to build the hologram pyramid, so I will be helping Anushka with that. Super excited to show off our system at the final demo!

Edward’s Status Report for April 23

This week, I worked on the final project presentation as well as further improving the gesture-to-3D-model action algorithm. Anushka has been sick this week, so Joanne and I have been trying to finish everything up before the demo. I have been collecting more training data to retrain our SVM whenever an issue comes up. I think the current model distinguishes between one and two fingers pretty well. Some of the tricks I used to improve robustness: removing the noise label from the model (since it can confuse swipes and noise, and noise is easy to detect separately), and training on data where I just place my fingers without moving them or performing gestures.
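
As a rough illustration of those two tricks, here is a minimal sketch of how a finger-count SVM could be trained with scikit-learn. The data here is synthetic and the label scheme is an assumption for illustration; it is not our exact pipeline.

```python
# Minimal sketch (not our exact pipeline): train an SVM to distinguish
# one vs. two fingers from a frame of 10 distance readings, dropping the
# noise label before training as described above.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.uniform(5, 150, size=(400, 10))   # synthetic stand-in frames (mm)
y = rng.integers(0, 3, size=400)          # 0 = noise, 1 = one finger, 2 = two

keep = y != 0                             # trick 1: remove the noise label
X, y = X[keep], y[keep]
# trick 2 (not visible in synthetic data): the kept frames come from
# recordings where fingers are simply placed, not moving or gesturing.

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2,
                                                    random_state=0)
clf = SVC(kernel="rbf", C=1.0)
clf.fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))
```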

I started collecting data for the testing section of our presentation. As a group, we discussed how to measure our system’s accuracy, gesture precision, and latency.
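
For illustration, here is one hedged way those numbers could be computed; the per-gesture precision and end-to-end latency definitions below are my assumptions of what we settled on.

```python
# Hypothetical metric helpers: per-gesture precision over labeled trials,
# and mean end-to-end latency from paired timestamps (ms).
import numpy as np

def gesture_precision(predicted, actual, gesture):
    predicted, actual = np.asarray(predicted), np.asarray(actual)
    hits = predicted == gesture
    return float((actual[hits] == gesture).mean()) if hits.any() else 0.0

def mean_latency_ms(sampled_ms, displayed_ms):
    # sampled_ms: when the wearable read the sensors;
    # displayed_ms: when the Webapp rendered the resulting action
    return float(np.mean(np.asarray(displayed_ms) - np.asarray(sampled_ms)))
```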

I also worked with Joanne to cut the acrylic for our hologram, but the glue TechSpark had looked really bad and tape was visible under light, so we redesigned the hologram to use slots instead of adhesives. I am not the best at cutting, so I hope Anushka gets better soon so she can cut and help construct the hologram.

The gesture algorithm has proven to be much harder than we thought, especially with just the two of us this week. Hopefully, next week we will have something demo-able. We are using up our slack weeks.

Edward’s Status Report for April 16

This week, I continued working on the finger and gesture detection and further integrated the wearable with the Webapp. Our data is extremely noisy, so I have been working on smoothing it out.
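
One simple smoother along these lines is an exponential moving average over each sensor channel; this is a sketch of the idea, and the smoothing factor is an illustrative choice rather than a tuned value.

```python
# Exponential moving average over a 10-sensor frame. Smaller alpha
# smooths more but lags more; 0.3 is just an illustrative choice.
def make_smoother(alpha=0.3):
    prev = []  # holds the last smoothed frame

    def smooth(frame):
        if not prev:
            prev[:] = frame
        else:
            prev[:] = [alpha * x + (1 - alpha) * p
                       for x, p in zip(frame, prev)]
        return list(prev)

    return smooth
```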

Originally, the Photon was unable to send the timestamp for when it collected all 10 sensor values, since the Photon’s RTC does not have millisecond accuracy. However, I found a way to get an approximate collection time by summing up elapsed time in milliseconds. This made the finger detection data sent to the Webapp appear smoother.
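
The idea, sketched in Python for clarity (the real logic runs in the Photon firmware, and the stand-in functions here are assumptions): take one second-resolution RTC reading as an anchor, then add accumulated milliseconds of elapsed time to it.

```python
# Sketch of the approximate-timestamp trick. time.time() stands in for
# the Photon's second-resolution RTC, time.monotonic() for millis().
import time

rtc_anchor_ms = int(time.time()) * 1000        # whole-second RTC anchor
mono_anchor_ms = int(time.monotonic() * 1000)  # elapsed-time reference

def approx_timestamp_ms():
    elapsed = int(time.monotonic() * 1000) - mono_anchor_ms
    return rtc_anchor_ms + elapsed             # anchor + elapsed ms
```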

I am not super confident in our SVM implementation of finger detection, so I want to remove pinches entirely to reduce noise. Two fingers close together often get confused with one finger, and vice versa. Because of this, many swipes get classified as pinches, which throws everything off. So, I worked on an implementation without pinches, and it seemed to work much better. As a group, we are deciding whether this is the direction to go. If we continue with only detecting swipes, we will need full 3D rotation rather than rotation along the y-axis only.

This week, we also attached the board to a wristband we ordered and have been working off of that. The sensors need to be elevated slightly, and I am not sure how annoying that will be for a user. The elevation also means users can’t just swipe on their skin casually; they need to deliberately place their finger so that the sensors can catch it.

This week, I will work on trying to filter and reduce noise in our swipes-only implementation, which I think we may stick with for the final project. This will be a tough effort.

I am a bit behind schedule as the project demo approaches. Detected finger locations are a bit shaky, and I’m not sure I can figure out a way to clean them up. I will work with Joanne on algorithms to reduce noise, but I’m not too confident we can make a really good one. Removing pinches limits the functionality of our device, and I’m not sure if it’s even “impressive” with just swipes. We will need to either explore more gestures or broaden our use case to make up for this.

Edward’s Status Report for April 10

This week, we demoed our prototype to the instructor and TA. There was a small soldering mistake that I fixed before our demo, but our hardware should work reliably from now on. One issue we found in our setup is connectivity: WiFi in the lab is pretty spotty, and there are sometimes pauses while data is being streamed. We hope this will not happen much in the UC when we demo, but I am not sure it will go away. I plan to talk to Tamal about this in the coming week.

Other than the demo, we also worked on hashing out a basic plan for the gesture detection and did further integration with Joanne’s Webapp.

We did not do much coding this week due to Carnival, but next week I will help Anushka code up our new gesture detection algorithm. We plan to take a new approach where a candidate gesture must be detected n times in a row before we accept it as the current gesture.
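
A minimal sketch of that idea (names and the default n are illustrative): the detector’s raw output only replaces the current gesture after it has been seen n times in a row.

```python
# Hypothetical n-in-a-row gesture debouncer, as described above.
class GestureDebouncer:
    def __init__(self, n=3):
        self.n = n
        self.current = None       # committed gesture
        self.candidate = None     # gesture waiting to be confirmed
        self.count = 0

    def update(self, detected):
        if detected == self.current:
            self.candidate, self.count = None, 0   # nothing new to confirm
        elif detected == self.candidate:
            self.count += 1
        else:
            self.candidate, self.count = detected, 1
        if self.count >= self.n:                   # seen n times in a row
            self.current, self.candidate, self.count = detected, None, 0
        return self.current
```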

We also decided to ditch the Jetson Nano due to its poor WiFi performance. Tamal said this should not be an issue, but I may chat with him about it in the coming week to confirm.

We also plan to attach our hardware to the wristband we bought, possibly tomorrow, using either tape or sewing.

We are on track if we work on and refine the gesture detection for the next two weeks. This is the toughest part of the project, and it needs to work well for the whole system to function.

Edward’s Status Report for March 26

This week, I worked on gesture recognition algorithms with Anushka and helped integrate everything with Joanne.

Our algorithm fits a curve to the 10 sensor data points and finds the local minima and maxima of that curve. If a local maximum exists, we say the data represents a pinch. However, I am not too happy with the accuracy of this. I think we need to look at data over time rather than single frames, but I was having trouble figuring out exactly what to do. Next week, I will work on this more and possibly explore other techniques.
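
Concretely, the heuristic looks something like the sketch below (the polynomial degree and sampling density are illustrative choices, not necessarily what we used):

```python
# Sketch of the pinch heuristic: fit a smooth curve to the 10 readings
# and flag a pinch if the curve has an interior local maximum.
import numpy as np
from scipy.signal import argrelextrema

def looks_like_pinch(readings):
    x = np.arange(len(readings))
    coeffs = np.polyfit(x, readings, deg=4)        # smooth fit to the array
    xs = np.linspace(0, len(readings) - 1, 100)
    curve = np.polyval(coeffs, xs)
    maxima = argrelextrema(curve, np.greater)[0]   # interior local maxima
    return maxima.size > 0
```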

I also wrote a script with Joanne to test what we have now. In real time, we are able to collect data from the wearable, process it, and send it to our webapp. The webapp shows a 3D model that rotates as we swipe. This demo works fairly well, but there are some kinks to work out. Pinches are ignored in this demo, and swipe detection honestly works pretty well.

I am on track this week, but I am worried we will not get a good algorithm in place and will fall behind. Further integration will take some time too. Next week, I will work on our deliverable, which will probably be a refined version of our swiping-to-rotate demo. The major issue for us going forward is differentiating between a swipe and a pinch.

Edward’s Status Report for March 19

One of the largest risk factors we had was the PCBs not working. Before Spring Break, both of our PCBs (the sensor board and the support components board) came in. I mocked up a rough prototype on a breadboard with both PCBs, but it didn’t seem to work. I2C communication with the sensors was very wonky, so the Photon wasn’t always able to start up all the sensors. After tinkering for a very long time, I eventually removed the support components PCB and manually wired up the sensor PCB only (see diagram below).
This actually ended up working! So, apparently I didn’t even need the support components board… But the good news is, our hardware works now! The wearable is coming along well.

After writing some code to control and read from all 10 sensors as a test, I soldered the PCB and the resistors to a perfboard (see diagram below). I also added a battery.

(Eventually the Photon will go behind the resistors closer to where the blue wires are, making the form factor smaller. I will also get rid of the yellow breadboard and solder the device directly to the perfboard.)

I wrote code to stream the sensor data over MQTT and modified our visualization program to view the data in real time. I also recorded some data to be used to test out finger detection and gesture recognition algorithms.
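
For reference, the receiving side looks roughly like this paho-mqtt sketch; the broker address, topic name, and payload layout are assumptions for illustration.

```python
# Hypothetical subscriber for the wearable's MQTT stream (paho-mqtt).
import json
import paho.mqtt.client as mqtt

def on_message(client, userdata, msg):
    frame = json.loads(msg.payload)      # e.g. {"t": <ms>, "d": [10 readings]}
    print(frame["t"], frame["d"])        # hand off to the visualizer here

client = mqtt.Client()                   # paho-mqtt 1.x style constructor
client.on_message = on_message
client.connect("broker.local", 1883)     # hypothetical broker address
client.subscribe("wearable/sensors")     # hypothetical topic
client.loop_forever()
```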

By some stroke of luck, I am ahead of schedule, as I had anticipated spending this time debugging and redoing the PCBs. My next steps are to help Anushka with finger detection and gesture recognition on the Jetson.

Edward’s Status Report for February 19

This week, I finalized the PCBs (they should come late next week or early the week after). It turns out I was generating Gerber files that did not include the actual drill holes… I also replaced parts that were not available at the board house with two through-holes each, so we can hand-solder the missing parts ourselves. We may need to buy a 0.1uF capacitor and solder that on, which shouldn’t be too bad. I’m a bit scared that the support components PCB will not work on the first try, but we can always iterate.


Sensor array PCB rev. 001


“Support” PCB rev. 001

I also bought some VL6180X sensor breakout boards from Amazon; these are the boards I based my PCB design on. As a preliminary test, I played around with the sensors in the Arduino IDE using a Particle Photon (I couldn’t get the IDE to recognize my ESP32). The Photon has WiFi capabilities but not BLE, which is probably okay. The sensors seem to be fairly sensitive and have a good range. Each sensor has maybe a ~1-2mm margin of error relative to the other sensors, which isn’t too bad; I tested this by placing a piece of paper in front of three sensors hooked up to my Photon and recording the readings. I implemented sensor reading in a round-robin fashion due to IR interference: according to the datasheet, each sensor records readings within a 25° (±5°) cone, so other sensors can interfere while one sensor is reading. I also added a small averaging filter to the readings.
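
Sketched in Python for clarity (the real loop runs on the Photon, and the read stub here is hypothetical), the round-robin read plus averaging looks like this:

```python
# Round-robin sampling with a small moving-average filter. Reading one
# VL6180X at a time avoids IR crosstalk between neighboring sensors.
import random
from collections import deque

NUM_SENSORS = 10
WINDOW = 4                                   # illustrative window size
history = [deque(maxlen=WINDOW) for _ in range(NUM_SENSORS)]

def read_sensor(i):
    # stand-in for the real single-sensor I2C range read
    return random.randint(5, 150)            # mm, hypothetical

def read_frame():
    frame = []
    for i in range(NUM_SENSORS):             # one sensor at a time
        history[i].append(read_sensor(i))
        frame.append(sum(history[i]) / len(history[i]))
    return frame
```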


Sensor testbed

I am on schedule for this week and will work on communication stuff next week with the rest of the team.

Edward’s Status Report for February 12

In order for our idea to work, we need an array of distance sensors. Those sensors must be tightly packed, and there is no off-the-shelf solution for that, so we decided to make our own custom PCB breakout board. We settled on the VL6180X proximity sensor by STM, since it can sense from about 5mm to ~150mm, which is about right for the length of a typical forearm. So, I set out to make a PCB containing an array of these sensors. I based our PCB design on this, since it provided an open-source Eagle schematic. I figured building off an existing, working design would help ensure that ours is functional and reduce how much we have to iterate.
So, this week, I worked on finalizing the first iteration of our PCBs in Eagle. I realized we needed two PCBs, since I could not fit all of the SMT components onto a single one without making it too large. Our wearable will therefore carry two PCBs: one containing an array of VL6180X sensors, and another containing all the “support” components like voltage regulators, resistors, etc.
I prepared the boards for ordering on JLCPCB and generated the required BOM and CPL files. I will need to talk to a TA or a professor to verify that my design will work.


Sensor Array PCB


“Support Components” PCB

I also helped laser-cut a rough prototype of our hologram pyramid with Anushka and Joanne.