Edward’s Status Report for March 26

This week, I worked on gesture recognition algorithms with Anushka and helped integrate everything with Joanne.

Our algorithm fits a curve to the 10 sensor data points and finds the relative minima and maxima of that curve. If an interior maximum exists, we classify the data as a pinch. However, I am not happy with the accuracy of this approach. I think we need to incorporate how the data changes over time rather than classifying each frame independently, but I was having trouble settling on a method. Next week, I will work on this more and possibly explore other techniques.
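
As a rough sketch of this per-frame check in Python (the function name, polynomial degree, and smoothing resolution here are mine, not our exact implementation):

    # Sketch of the per-frame pinch check: fit a smooth curve to the
    # 10 distance readings and look for an interior relative maximum
    # (a bump between two finger "dips").
    import numpy as np
    from scipy.signal import argrelmax

    def looks_like_pinch(readings, degree=4):
        """readings: 10 distance values, one per sensor, left to right."""
        x = np.arange(len(readings))
        coeffs = np.polyfit(x, readings, degree)   # fit the curve
        curve = np.polyval(coeffs, np.linspace(0, len(readings) - 1, 100))
        maxima = argrelmax(curve)[0]               # interior relative maxima
        return maxima.size > 0                     # a max exists => pinch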

I also wrote a script with Joanne to test what we have now. In real time, we are able to collect data from the wearable, process it, and send it to our webapp. The webapp shows a 3D model that can be rotated as we swipe. The demo works fairly well: swipe detection is reasonably reliable, though pinches are ignored for now and there are still some kinks to work out.
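
For context, one simple way swipe direction can be inferred from this kind of data (a hypothetical sketch, not our actual demo code) is to track which sensor sees the closest object across frames:

    # Hypothetical swipe-direction inference: follow the index of the
    # closest reading over time. The drift threshold is illustrative.
    import numpy as np

    def swipe_direction(frames):
        """frames: a time-ordered list of 10-element distance arrays."""
        positions = [int(np.argmin(f)) for f in frames]  # finger ~ closest sensor
        drift = positions[-1] - positions[0]
        if drift > 2:
            return "right"
        if drift < -2:
            return "left"
        return None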

I am on track this week, but I am worried we will not get a good algorithm in place and will fall behind. Further integration will take some time too. Next week, I will work on our deliverable, which will probably be a refined version of our swipe-to-rotate demo. The major issue for us looking forward is reliably differentiating between a swipe and a pinch.

Team Status Report for 3.19.2022

The hardware for the wearable is complete, and a testbed for the physical sensor configuration and MQTT sensor data streaming has been fully set up! The device reads from all 10 sensors in a round-robin fashion (to prevent IR interference), applies an EWMA filter to the readings, and publishes the data over MQTT to be received by a Python program or a website. The device can also send power information like battery percentage and voltage. After this was finished, we moved on to thinking about the webapp and gesture recognition while Edward works on polishing up the hardware.
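
The EWMA filter itself lives in the Photon firmware, but the idea fits in a few lines; here is a Python sketch (the smoothing factor is illustrative, not our tuned value):

    # Exponentially weighted moving average, applied per sensor channel.
    class EWMA:
        def __init__(self, alpha=0.3):   # alpha is illustrative
            self.alpha = alpha
            self.value = None

        def update(self, reading):
            if self.value is None:
                self.value = reading     # seed with the first sample
            else:
                self.value = self.alpha * reading + (1 - self.alpha) * self.value
            return self.value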

For gesture recognition, we have made a lot of progress since receiving the sensors. We began by observing the general behavior of the sensors and brainstorming how we might map the readings to different gestures. We noticed a few mathematical characteristics of the zooming gestures that differ from those of a swipe, so we will most likely use those as the basis for gesture identification. We have a few theories on how to distinguish the gestures further, and testing those will be our next step.

For the Unity web application part of our project, we started working on data communication. To prepare for integration, we made sure that data could be streamed to our web application via the MQTT protocol. For now, we just sent over dummy data from a Python script. We then made sure that we can communicate smoothly with the Unity interface embedded in the Django web app. We are now working on translating the gesture data into more precise gestures applied to the model.
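
The dummy-data script was along these lines (a sketch using paho-mqtt; the broker host, topic name, and frame rate are placeholders, not our actual configuration):

    # Publish fake 10-sensor frames over MQTT for integration testing.
    import json, random, time
    import paho.mqtt.client as mqtt

    client = mqtt.Client()
    client.connect("broker.example.com", 1883)   # placeholder broker
    client.loop_start()

    while True:
        frame = [random.randint(5, 150) for _ in range(10)]     # distances in mm
        client.publish("wearable/sensors", json.dumps(frame))   # placeholder topic
        time.sleep(0.05)                                        # ~20 frames/sec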

The next steps are to build out the actual wearable (buy wrist straps, create an enclosure for the hardware, attach the hardware to the strap, etc.). We also need to flesh out gesture recognition and make it robust to the various ways people perform specific gestures, like swiping left and right or pinching in and out.

Our biggest concern remains the Jetson. However, since we know the connection between the device and our computers works, we at least have one avenue for communication. That will most likely be our contingency plan, but we're still hoping the Jetson will work out for faster computation. Our schedule remains the same, and we're planning on beginning integration over the next week.

Edward’s Status Report for March 19

One of the largest risk factors we had was the PCBs not working. Before Spring Break, both of our PCBs (the sensor board and the support components board) came in. I mocked up a rough prototype on a breadboard with both PCBs, but it didn't seem to work. The I2C communication with the sensors was very flaky, so the Photon wasn't always able to initialize all of them. After tinkering for a long time, I eventually removed the support components PCB and manually wired up only the sensor PCB (see diagram below).
This actually ended up working! So, apparently, I didn't even need the support components board… But the good news is, our hardware works now! The wearable is coming along well.

After writing some code to control and read from all 10 sensors as a test, I soldered the PCB and the resistors to a perfboard (see diagram below). I also added a battery.

(Eventually, the Photon will go behind the resistors, closer to where the blue wires are, making the form factor smaller. I will also get rid of the yellow breadboard and solder the device directly to the perfboard.)

I wrote code to stream the sensor data over MQTT and modified our visualization program to view the data in real time. I also recorded some data to use for testing finger detection and gesture recognition algorithms.
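
Recording was roughly along these lines (a sketch; the broker, topic, and file name are placeholders):

    # Log raw MQTT sensor frames to a file for offline algorithm testing.
    import paho.mqtt.client as mqtt

    def on_message(client, userdata, msg):
        with open("recording.bin", "ab") as f:
            f.write(msg.payload)                 # one packed frame per message

    client = mqtt.Client()
    client.on_message = on_message
    client.connect("broker.example.com", 1883)   # placeholder broker
    client.subscribe("wearable/sensors")         # placeholder topic
    client.loop_forever()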

By some stroke of luck, I am ahead of schedule, as I had anticipated spending this time debugging and redoing the PCBs. My next steps are to help Anushka with finger detection and gesture recognition on the Jetson.

Team Status Report for 2.26.2022

We delivered our design review presentation on Wednesday. Overall, it went pretty well. Creating the presentation gave us a lot of insight into what the next few weeks are going to look like, and we discussed our schedule more in-depth after the presentation to cover ground in areas that we still need to work on.

The PCB is being manufactured and will be shipped next week, and we're currently testing individual sensors to see how they behave. Additionally, we decided on MQTT since it is the most commonly used IoT communication protocol, especially with the Particle Photon. We began testing the sensors and developed a visual interface using JavaScript and HTML to see how the sensors detect nearby objects. For the most part, we can tell which sensors are firing when a finger comes close, and we can almost guess which direction we are swiping. This is a good indication that our rotation gesture recognition algorithm may actually work.

We also started the web application portion of our project. We have figured out how to embed the Unity application in our Django web app and communicate data between the web application and Unity.

Together, we brainstormed what the web interface will look like and how it will be implemented. Joanne will primarily work on getting a demo of that working soon. Edward will test the PCBs once they are shipped. Anushka will work on getting the Jetson to boot and run code. We seem to be on schedule this week.

An issue that arose is that the VL6180X sensor reads data in a 25-degree cone, so when a finger gets too far from the sensors (in the 75-150mm range), all of the sensors will report that they see an object. We have no idea exactly what this will look like, so we created a visualization to see what happens as a finger moves away. Would the finger look like a line? Would it have a noticeable dip? We will need to explore this once the PCB is fully done and verified, which might take a while and put us behind schedule.

Edward’s Status Report for February 26

This week, I worked on firmware for our wearable. I wrote code for our microcontroller, a WiFi-enabled Particle Photon, to connect to a public MQTT broker and publish sensor data. I actually spent a lot of my time trying to get it to work on campus WiFi. At first, I was unable to get it to connect reliably to CMU-DEVICE, but after futzing around with it for a couple of hours, I tried another board and it seemed to work fine.
The Photon publishes data formatted like so, in a packed byte array format:
[<4-byte little-endian timestamp>, sensor value 1, sensor value 2, sensor value 3]
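
On the receiving side, a frame can be unpacked like this (a sketch; the format above doesn't pin down the width of each sensor value, so this assumes one byte per reading):

    # Parse a packed frame: 4-byte little-endian timestamp, then readings.
    import struct

    def parse_frame(payload: bytes):
        (timestamp,) = struct.unpack_from("<I", payload, 0)  # 4-byte LE timestamp
        values = list(payload[4:])     # assumes one byte per sensor value
        return timestamp, values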

A hiccup I ran into is that our board cannot get a Unix timestamp with millisecond accuracy, so our measurements of data collection latency will be quite inaccurate. I might need to figure out a way around this. However, this timestamp is only used for gathering metrics and is not essential to the actual functionality of the device, so I need to weigh the tradeoff between taking the time to fix the problem and accepting an inaccurate latency measurement.

I also worked with Anushka to get a visualization of the sensor data working so that we can understand what kind of data we are dealing with (see video here). We have a setup with three sensors attached to a Photon that publishes data to MQTT. The webpage receives data from MQTT and displays it. I believe the visualization will help guide us in understanding what a “finger” looks like in terms of sensor data.

Additionally, I ordered the PCBs earlier this week and they should come next Tuesday (3/1). Next week, I will be working on testing and verifying that the PCBs work as intended. This will probably take up a large chunk of my time next week.

I am on track this week, but as soon as the PCBs come, I might fall behind, since they might not work… I want to get the PCBs verified before Spring Break starts.

Edward’s Status Report for February 19

This week, I finalized the PCBs (they should arrive late next week or early the week after). It turns out I was generating Gerber files that did not include the actual drill holes… I also replaced parts that were not available at the board house with pairs of through-holes, so we can hand-solder the missing parts ourselves. We may need to buy a 0.1uF capacitor and solder it on, which shouldn't be too bad. I'm a bit scared that the support components PCB will end up not working on the first try, but we can always iterate.


Sensor array PCB rev. 001


“Support” PCB rev. 001

I also bought some VL6180X sensor breakout boards from Amazon; these are the boards I based my PCB design on. As a preliminary test, I played around with the sensors in Arduino using a Particle Photon (I couldn't get the IDE to recognize my ESP32). The Photon has WiFi but not BLE, which is probably fine. The sensors seem to be fairly sensitive and have good range. Each sensor has roughly a 1-2mm margin of error relative to the other sensors, which isn't too bad; I tested this by placing a piece of paper in front of three sensors hooked up to my Photon and recording the readings. I implemented sensor reading in a round-robin fashion because of IR interference: each sensor takes readings within a 25° (±5°) cone (according to the datasheet), so neighboring sensors can interfere while one sensor is reading. I also added a small averaging filter to the readings.
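
In Python-flavored pseudocode (the real loop is Photon firmware, and read_range() is a stand-in for the actual driver call), the round-robin read with averaging looks roughly like this:

    # Poll one sensor at a time so its IR emitter can't interfere with
    # its neighbors, then smooth each channel with a short running average.
    from collections import deque

    WINDOW = 4                                   # averaging window (illustrative)

    def make_history(num_sensors):
        return [deque(maxlen=WINDOW) for _ in range(num_sensors)]

    def poll_once(sensors, history):
        smoothed = []
        for i, sensor in enumerate(sensors):     # one active sensor at a time
            history[i].append(sensor.read_range())
            smoothed.append(sum(history[i]) / len(history[i]))
        return smoothed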


Sensor testbed

I am on schedule for this week and will work on communication stuff next week with the rest of the team.

Edward’s Status Report for February 12

For our idea to work, we need an array of distance sensors. Those sensors must be tightly packed, and no off-the-shelf solution exists for that, so we decided to make our own custom PCB breakout board. We chose the VL6180X proximity sensor by STM, since it can sense from 5mm to ~150mm, which is about right for the length of a typical forearm. So, I set out to make a PCB containing an array of these sensors. I based our PCB design on this, since it provided an open-source Eagle schematic. I figured building off an existing, working design would help ensure that our design is functional and reduce how much we have to iterate.
So, this week, I worked on finalizing the first iteration of our PCBs in Eagle. I realized we needed two PCBs, since I could not fit all of the SMT components onto a single board without making it too large. So, our wearable device will have two PCBs: one containing an array of VL6180X sensors and another containing all the “support” components like voltage regulators, resistors, etc.
I prepared them for ordering on JLCPCB and generated the BOM and CPL files that were needed. I will need to talk to a TA or a professor to verify that my design will work.


Sensor Array PCB


“Support Components” PCB

I also helped laser-cut a rough prototype of our hologram pyramid with Anushka and Joanne.