Team Status Update 12/9

This week, we worked on finalizing our code for the project. Rosina and Saumya expanded the IMU code to the z-axis (decoupling x and z and creating separate Kalman filters for each) and flipped the direction of the z-axis. Sarah practiced and presented our final presentation slides and took the lead on the poster, while Saumya and Rosina helped write the different sections of the software portion.

One interesting problem we solved was failsafe triggering. We made the failsafe interrupt the IMU to minimize drift and “freeze” the mouse in place when the failsafe was released. No considerable changes have been made to our design, and we are on track with the rest of the project. There are no risks to presenting what we planned to present in the demo.

To answer the ABET question:

Latency: To test our latency requirement, we manually started a timer while simultaneously triggering one of the sensors on the glove. Starting the timer was manual, and the timer was stopped once the gesture propagated through the system. We achieved an average of 40 ms for keystrokes.

We noticed significant latency (on the order of seconds) when testing our mouse movements, so we reduced the number of calls to pyautogui and the amount of computation per packet. Specifically, if the mouse position did not change between IMU data arrivals, we skipped the pyautogui call entirely. This improved the latency of the mouse to below 1 s.
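A minimal sketch of that gating logic (the `move_fn` parameter stands in for `pyautogui.moveTo` so the logic can run without a display, and the rounding to integer pixels is an illustrative assumption about how positions were quantized):

```python
# Only invoke the (expensive) cursor-move call when the computed position
# actually changed between IMU packets; otherwise skip it entirely.

def make_mouse_mover(move_fn):
    """Return an IMU callback that drops redundant cursor updates."""
    last = {"pos": None}

    def on_imu_position(x, y):
        pos = (round(x), round(y))      # screen coordinates are integral
        if pos == last["pos"]:
            return False                # no change: skip the pyautogui call
        last["pos"] = pos
        move_fn(*pos)                   # e.g. pyautogui.moveTo(x, y)
        return True

    return on_imu_position
```

With a real receiver, `move_fn` would be the pyautogui call; here any callable works, which also makes the gating easy to test.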

Weight: Unfortunately, we were unable to meet our weight use case requirement, weighing in at ~20 g over our budget. We prioritized battery life, since it is more inconvenient for the user to recharge the device multiple times during a session, and an extra 20 g is barely perceptible to a human wearer. Because of the immense increase in battery life, we decided this trade-off was well worth it.

Accuracy: To test accuracy, we conducted a user study in which we guided each user through all the available gestures on the glove. Participants were asked to carry out each gesture, and we recorded whether it produced the expected shortcut. After many iterations of sensor positioning, we achieved 100% accuracy.

Wireless Range: To test wireless range, we slowly increased the distance between the glove and the device it was paired to. Every half foot, we triggered one of the sensors and recorded whether the desired gesture was produced. The farthest distance that still triggered a gesture was 3.05 meters (10 ft).

Battery Life: To test battery life, we connected our product to a battery of known voltage, current, and capacity, and continuously sent packets over Bluetooth until the battery ran out. Since we are using Bluetooth Low Energy (BLE), we achieved a battery life of over 12 hours on a 5 V, 2.1 A, 10,000 mAh portable charger.

Saumya’s Status Update 12/9

This week I worked on finalizing code for our demo as well as the final presentation slides and poster. I created a final flow chart containing a full description of the software architecture we implemented. I also worked on expanding the IMU code to the z-axis and translating this into onscreen movement, and on finishing up the GUI.

In terms of the Gantt chart, we are right on track to finish by the demo. We just have to code the remaining parts of our project (GUI, touching up IMU movement, etc.) and practice for our demo on Monday.

Team Status Update 12/2

This week, we started integrating our hardware with the glove and thresholding the sensors now that they are in their final positions. The software portion of our team worked extensively on the IMU mouse movement algorithm. Our product can now move the mouse left and right, and we plan to extend this code to vertical movement along the z-axis in the early half of this week. We also worked this past week on our slides for the final presentation.

No major changes are being made to the device, but the IMU code may impose some limitations on how long the mouse can be used before it needs to recalibrate. The Bluetooth protocol was also modified slightly to send IMU data as a bytearray rather than an array of floats, due to compatibility issues with the ESP Bluetooth library we were using.
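As a rough sketch of that framing on the receiver side (the three-float x/y/z layout and little-endian order are assumptions for illustration; the real packet may carry more fields):

```python
import struct

# Pack float readings into a raw little-endian bytearray for BLE, and
# unpack the same layout on the Python receiver.

IMU_FMT = "<3f"  # little-endian, three 32-bit floats (x, y, z)

def pack_imu(ax, ay, az):
    """Frame one IMU sample as the raw bytes sent over BLE."""
    return bytearray(struct.pack(IMU_FMT, ax, ay, az))

def unpack_imu(payload):
    """Recover the float tuple from a received notification payload."""
    return struct.unpack(IMU_FMT, payload)
```

The ESP side would produce the same 12-byte layout, which sidesteps any float-array marshalling in the Bluetooth library.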

Saumya’s Status Update 12/2

This week, I worked on code for reading in the IMU data and mapping it to mouse movements. Working with Rosina, I experimented with the calibrated linear acceleration data coming from the IMU. We noticed that our data was decently accurate thanks to the IMU’s onboard bias reduction and drift compensation, and we used this to our advantage when designing our position calculator. We utilized a 3-state Kalman filter with position, velocity, and acceleration as our states. On each iteration of the filter, we double-integrate the acceleration to calculate the position.
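A minimal sketch of such a 3-state filter for one axis (the noise covariances here are illustrative placeholders, not our tuned values):

```python
import numpy as np

# 3-state Kalman filter over [position, velocity, acceleration] for one
# axis. The transition matrix F performs the double integration each step
# (acceleration -> velocity -> position); only acceleration is measured.

class AxisKalman:
    def __init__(self, dt, q=1e-3, r=1e-2):
        self.x = np.zeros(3)                      # [position, velocity, accel]
        self.P = np.eye(3)
        self.F = np.array([[1, dt, 0.5 * dt**2],  # p += v*dt + 0.5*a*dt^2
                           [0, 1, dt],            # v += a*dt
                           [0, 0, 1]])
        self.H = np.array([[0.0, 0.0, 1.0]])      # only acceleration observed
        self.Q = q * np.eye(3)                    # process noise (placeholder)
        self.R = np.array([[r]])                  # measurement noise (placeholder)

    def step(self, accel_meas):
        # predict
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        # update with the new acceleration sample
        y = accel_meas - self.H @ self.x
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + (K @ y).ravel()
        self.P = (np.eye(3) - K @ self.H) @ self.P
        return self.x[0]                          # filtered position estimate
```

One filter instance per axis matches the decoupled x/z design described above.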

When experimenting with this, we noticed that the Kalman filter smoothed our position curve nicely, but the position would recoil slightly due to residual velocity. To correct for this, we implemented code that snaps the velocity to zero if several consecutive acceleration measurements of zero are detected, keeping drift minimal. When we implemented our mouse movement code, this resulted in pretty good movement. To make this as accurate as possible, we plan on using our failsafe button as an interrupt to snap the mouse position in place and avoid the recoil. Now that this works for a single dimension, we plan to implement a second Kalman filter to handle vertical movement at the same time.
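The zero-velocity snap can be sketched as a small detector (the threshold and sample count here are illustrative, not our tuned numbers):

```python
# Detect a stationary hand: if several consecutive acceleration samples
# are (near) zero, report that the velocity estimate should be snapped to
# zero to stop residual drift.

class ZeroVelocityDetector:
    def __init__(self, threshold=0.05, required=5):
        self.threshold = threshold   # |accel| below this counts as "zero"
        self.required = required     # consecutive zeros before snapping
        self.count = 0

    def should_snap(self, accel):
        if abs(accel) < self.threshold:
            self.count += 1
        else:
            self.count = 0           # motion resumed, restart the streak
        return self.count >= self.required
```

When `should_snap` returns true, the filter's velocity state would be zeroed (e.g. setting the velocity component of the state vector to 0).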

In terms of the Gantt chart, I still have a lot of progress to make towards polishing our product, but a lot of the main challenges are completed.

Saumya’s Status Update 11/18

This week, I worked on completing the networking code for keystroke detection as well as a UI for users to attribute gestures to specific keystrokes. I came up with the idea of using Spotlight search on the Mac to quickly open common websites for viewing movies (e.g., Netflix, Hulu, YouTube), and I played around with polling and delays to get a very quick connection between the sensor movement and the keystrokes. At Rosina’s suggestion, I also added a longer delay when a gesture was registered than when no gesture was registered, to avoid the system registering the same opcode multiple times on a single press.
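The asymmetric delay can be sketched roughly like this (the delay values and helper names are illustrative, not the tuned ones):

```python
import time

# Poll fast while idle for low latency, but sleep longer right after a
# gesture fires so one physical press is not read as several opcodes.

IDLE_DELAY = 0.01       # seconds between polls when nothing is pressed
DEBOUNCE_DELAY = 0.25   # longer pause after a gesture registers

def next_delay(gesture_registered):
    """Choose how long to sleep before the next poll."""
    return DEBOUNCE_DELAY if gesture_registered else IDLE_DELAY

def poll_loop(read_opcode, dispatch, steps, sleep=time.sleep):
    """Poll `steps` times, dispatching any opcode that arrives."""
    for _ in range(steps):
        op = read_opcode()
        if op is not None:
            dispatch(op)
        sleep(next_delay(op is not None))
```

The `sleep` parameter is injected only so the loop can be exercised without real waiting; the receiver would use `time.sleep` directly.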

One bug I had to fix dealt with the pyautogui hotkey function, which is supposed to handle keyboard shortcuts (like pressing ctrl+z). I noticed that it sometimes didn’t press the keys at exactly the same time, so the shortcuts were not always carried out. To fix this, I wrote my own version using other pyautogui functions that ensures all keys stay pressed at the same time. This is likely less optimized than the hotkey function, but the latency difference was undetectable, and it worked reliably. I also worked on a protocol for sending IMU data: while the failsafe stays triggered, I continuously transmit IMU data and move the mouse in the IMU data callback function. The failsafe condition is checked on the ESP before any of the other sensors so that the other sensors are ignored while the failsafe is pressed.
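A sketch of the hotkey replacement (in the real receiver, `key_down`/`key_up` would be `pyautogui.keyDown`/`pyautogui.keyUp`; they are injected here so the press/release ordering can be shown without a GUI session):

```python
# Press every key of a shortcut in order, hold them all down together,
# then release in reverse order, so modifier keys stay held while the
# final key fires.

def press_hotkey(keys, key_down, key_up):
    for k in keys:                 # e.g. ("ctrl", "z")
        key_down(k)                # all keys are now held simultaneously
    for k in reversed(keys):
        key_up(k)
```

Releasing in reverse order keeps modifiers like ctrl held until after the letter key is released, which is what `pyautogui.hotkey` is also documented to do.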

In terms of the Gantt chart, I am on track with my tasks. I should be ready to integrate with Rosina’s part, and hopefully we can have a finished product ready for user testing by the end of Thanksgiving break.

Saumya’s Status Update 11/11

Saumya’s Status Update 11/4

This week, I was able to establish a BLE connection between the ESP32 and our Python receiver, and I worked on a simple script that created a connection between the Python client and the ESP32 server (BLE uses server/client terminology in a different way from the traditional definition). I spent some time exploring creating the connection via an MQTT server since BLE hadn’t been working, but ultimately, I got the code to work and found the BLE code to be simpler for demo purposes. 

I’m also currently working on the interim demo. I’m making a more specific flow chart of what will be processed locally and what will be sent to the Python receiver, so that our glove can transmit data and our sensors can actually register keystrokes and movement onscreen.

With regards to the Gantt chart, I think I’ve caught up a little bit, but not having our IMU for a while and the BLE issues set our tasks back slightly. Having this BLE communication protocol done will let me finish keystroke detection within the week, so Rosina and I can really focus on the IMU portion.

Team Status Update 10/28

This week, we planned out our code for when we receive our final set of sensors. Sarah also finished routing and milling our PCB. In terms of the risks that our project currently faces, we still haven’t received our IMU, so we can’t start fully coding that part yet until we can figure out and play around with the IMU’s calibration and sensor outputs. We are also facing issues with the PCB – specifically, Techspark’s PCB mill was not accurate enough for our small PCB, and we identified shorts between the traces on the board. We are planning on either using a vector board or ordering our PCB, but with the increased shipping time, we are a little bit worried about how this will affect the timeline of our project. We plan to make our decision about the board by Monday when we can place orders, and update our timeline accordingly if necessary. 

Saumya’s Status Update 10/28

This week, I worked on trying to get communication between the ESP32 and our Python receiver working via BLE. I also helped Sarah talk through some ideas while she worked on our PCB. I wasn’t able to get Bluetooth to work on my M1 Mac even using a different Python package, so I may fall back on our backup plan of either using a VM/different OS or trying WiFi with the ESP32. I also worked on the ethics assignment, both individually and with my group for the group discussion portion. I was unfortunately sick for most of the week as well, so I am a bit behind schedule with regards to our Gantt chart. To get back on track, I will spend the first half of the week catching up with my other classes (including a makeup exam) until Tuesday, and then dedicate the rest of the week to capstone. During breaks in the first half of the week, I’ll also read up on WiFi projects and set up my VM so I can get started on our backup plans as soon as Wednesday hits.