Meadow’s Status Report for 4/26/2025

This week I assembled the final presentation slides and rehearsed the presentation. I also met with Jason and Jamshed to test the full integration of the system, integrated Jason’s Bluetooth code into my sensor code, and cleaned up some of its framework. I was also able to fully assemble our first official prototype of the hardware portion of our project, which will get a revamp next week to look more professional for the demo. I wanted to resolve the issue of not yet being able to customize the display screen, but unfortunately ran out of time, so next week that will be on my list of chores before the demo.

I’m a little behind schedule, since I’d like to have the screen showing by now, but it shouldn’t take too long to resolve.

Here’s a visual of the clip:

Team Status Report for 4/26/25

Our schedule has not changed. In regard to risks, after testing in a gym setting with the device attached to a barbell and streaming over Bluetooth to the iOS app, we identified one risk: the sensor may not correctly distinguish the concentric and eccentric parts of a rep. It is sensitive to quick movements and tends to overcorrect to negative velocity values after a concentric (positive) movement as it stabilizes back to zero once motion stops. We are working around this risk by possibly simplifying our concentric vs. eccentric logic. We also ran into some bugs with passing video URLs between SwiftUI views, which should be resolved promptly.

Testing:

Unit Tests, Software:

  1. Object Tracking Tests: Moved from drawing the overlay in the UI to writing a new video file containing the overlay, which minimizes the dimensional mismatches that were occurring during testing. Integrated an object detection ML model to improve tracking accuracy and ease of use. Ended up with a complete pipeline that takes in recorded lift footage and outputs a new video file with the overlay.
  2. BLE Communication with Dummy Data: The initial communication protocol using JSON was found to be far too slow. To make the most of the limited bandwidth, we switched to a simpler protocol built around a fixed-length struct (see the sketch after this list).
  3. Backend Storage Test with Analysis View: Tested creating database entries with dummy data, which was plotted in the analysis view.
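As an illustration of that fixed-length approach, here is a minimal sketch of what such a packet could look like on the ESP32 side. The field names, types, and 14-byte size are assumptions for illustration, not our actual packet layout.

```cpp
// Hypothetical fixed-length data packet for BLE notifications (Arduino C++).
// Field names and sizes are illustrative; the real layout is whatever the
// firmware and the iOS app agreed on.
#include <Arduino.h>
#include <string.h>

struct __attribute__((packed)) RepPacket {
  uint32_t timestampMs;   // milliseconds since streaming started
  float    velocity;      // instantaneous vertical velocity (m/s)
  float    mcv;           // mean concentric velocity of the last rep (m/s)
  uint16_t repCount;      // reps detected so far
};

static_assert(sizeof(RepPacket) == 14, "packet must stay a fixed 14 bytes");

void setup() {
  Serial.begin(115200);
  RepPacket pkt = { (uint32_t)millis(), 0.42f, 0.38f, 3 };

  // Serialize to a raw byte buffer; this is what would be handed to the BLE
  // characteristic's setValue()/notify() instead of a JSON string.
  uint8_t buf[sizeof(RepPacket)];
  memcpy(buf, &pkt, sizeof(pkt));
  Serial.print("packet size: ");
  Serial.println(sizeof(buf));
}

void loop() {}
```

Because the layout is fixed, the iOS side can decode the same bytes directly into a matching structure, which is far cheaper than parsing a JSON string on every notification.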

Integration Tests, iOS App:

  1. UI workflow tests: Found that the initial UI design was clunky and unintuitive. Decided to leverage SwiftUI components to build a custom UI that is more user-friendly.
  2. Simultaneous BLE communication and video recording: After a bit of testing, this ended up working seamlessly. The only remaining issues concern storage of the captured video, which is more of a backend issue anyway.

Integration Tests, Full System:

  1. Sensor Data Output: The app successfully sends start/stop commands to the sensor device and accurately plots the data output from the device. Some issues remain with video storage, as previously mentioned.

Jamshed’s Status Report 4/26/25

This week I spent time integrating the object tracking feature and polishing the UI. I got an analysis view working with a graph of captured velocity values. I’m running into a bit of a bug with storage/recall of recorded videos, which I hope to have resolved ASAP.

Progress is still on schedule. The app will be operational by demo.

Jason’s Status Report for 4/26/25

A majority of my work this week was spent with Jamshed completing the integration of the sensor with the iOS application. We spent a decent amount of time figuring out how we wanted to send the mean concentric velocity (MCV) of the user’s rep to the app for them to see. We decided on sending the MCV at the very top of a rep using our movement detection, but we found some difficulty doing this after testing on a barbell during the week. We are currently troubleshooting this feature and hope to have it figured out by tomorrow. I also spent some time this week integrating Meadow’s code with mine: I focused on the Bluetooth setup and protocol with the iOS application, while Meadow focused on calibrating the sensor and saving its calibration status. I was able to integrate the two, but we also found some issues while testing this week, so we decided that the sensor and ESP32 will not begin Bluetooth streaming until the IMU is fully calibrated. I have also been working on a version that only detects and sends the MCV at the top of the rep and does not send any eccentric movement, since the sensor has been very sensitive to positive (concentric) movement and overcorrects toward negative velocity values, which has complicated the concentric/eccentric logic of rep detection (a rough sketch of the simplified approach is below). Nonetheless, I am still optimistic that we will be on track for the demo on Thursday, as we will continue testing all week until demo day.
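To make that simplified, concentric-only approach concrete, here is a rough sketch of the idea. The thresholds, the readVerticalVelocity() and sendMCV() stubs, and the sample rate are placeholders, not our actual firmware.

```cpp
// Concentric-only rep detection: accumulate velocity while it is positive,
// treat the fall back toward zero as the top of the rep, and report the mean
// concentric velocity (MCV). Negative (eccentric/overcorrected) readings are
// ignored so they cannot corrupt the MCV calculation.
#include <Arduino.h>

// Placeholder stubs; the real versions come from sensor fusion and BLE.
float readVerticalVelocity() { return 0.0f; }      // m/s from the IMU pipeline
void  sendMCV(float mcv) { Serial.println(mcv); }  // would notify the app over BLE

const float START_THRESHOLD = 0.05f;  // m/s needed to count as "moving up"
const float STOP_THRESHOLD  = 0.02f;  // m/s below which the rep has topped out

bool  inConcentric = false;
float velocitySum  = 0.0f;
int   sampleCount  = 0;

void updateRepDetection(float v) {
  if (!inConcentric) {
    if (v > START_THRESHOLD) {            // concentric phase begins
      inConcentric = true;
      velocitySum  = v;
      sampleCount  = 1;
    }
    // Negative velocity (eccentric or overcorrection) is simply ignored here.
  } else if (v > STOP_THRESHOLD) {
    velocitySum += v;                     // still moving up; keep accumulating
    sampleCount++;
  } else {
    inConcentric = false;                 // top of the rep reached
    sendMCV(velocitySum / sampleCount);   // report this rep's MCV
  }
}

void setup() { Serial.begin(115200); }

void loop() {
  updateRepDetection(readVerticalVelocity());
  delay(10);                              // ~100 Hz sampling, for illustration
}
```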

Team Status Report for 4/19/2025

Currently we don’t see any risks that would jeopardize our project’s success.

We made a design change to use one sensor instead of two after realizing both sensors would report the same values, so using two would just be wasteful. This not only simplifies our hardware-software integration but also reduces the overall cost of our design by 50%.

Our schedule has not changed.

Meadow’s Status Report for 4/19/2025

I was finally able to figure out what was wrong with saving calibration status. Now I’m able to calibrate, power off, then upload again and see a fully calibrated accelerometer and gyroscope. However, there is a slight hiccup: the magnetometer does not save its calibration, and I’m not quite sure why that is yet. Calibrating it only involves slightly moving the sensor, so even though it starts at 0 when restoring calibration, it easily reaches 3 (fully calibrated) with minimal motion. In the context of our device this shouldn’t be an issue, but I will look for some fixes next week just in case (a sketch of the offset save/restore approach is below). All of my sensor testing was done with an Arduino, but I just cleared our ESP to work the same way with our scripts. Because we chose an Adafruit ESP with a screen as our secondary component, I was also working on displaying battery status when the device is on. There isn’t as much documentation on how to do this as I was hoping, so it’s not working quite yet, but with a bit more research I should be able to show the message on the screen (for now it successfully prints in the basic serial monitor). I also began working on the final presentation and developing the physical build of the sensor. Lastly, I spoke with the team about how we’d want to represent the balance metric, and we decided on a graph variation that maintains a steady state until it senses a tilt, in which case the output moves in either +y or -y to indicate imbalance (fairly basic).
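For reference, here is a minimal sketch of the save/restore approach, assuming the Adafruit_BNO055 library and the ESP32’s emulated EEPROM; the storage address and reserved size are placeholders.

```cpp
// Save the BNO055's calibration offsets to (emulated) EEPROM once the sensor
// is fully calibrated, and restore them at boot. In practice the accelerometer
// and gyroscope come back fully calibrated; the magnetometer still needs a
// small motion to reach 3.
#include <Wire.h>
#include <EEPROM.h>
#include <Adafruit_Sensor.h>
#include <Adafruit_BNO055.h>

Adafruit_BNO055 bno = Adafruit_BNO055(55);  // sensor ID 55, default I2C address
const int OFFSETS_ADDR = 0;                 // placeholder EEPROM location

void saveCalibration() {
  if (!bno.isFullyCalibrated()) return;     // only persist a complete calibration
  adafruit_bno055_offsets_t offsets;
  bno.getSensorOffsets(offsets);            // accel, gyro, and mag offsets + radii
  EEPROM.put(OFFSETS_ADDR, offsets);
  EEPROM.commit();                          // required on the ESP32's emulated EEPROM
}

void restoreCalibration() {
  adafruit_bno055_offsets_t offsets;
  EEPROM.get(OFFSETS_ADDR, offsets);
  bno.setSensorOffsets(offsets);            // library handles the CONFIG-mode switch
}

void setup() {
  Serial.begin(115200);
  EEPROM.begin(512);                        // reserve flash for emulated EEPROM
  Wire.begin();
  if (!bno.begin()) {
    Serial.println("BNO055 not detected, check wiring");
    while (1);
  }
  restoreCalibration();
}

void loop() {
  // saveCalibration() would be called once calibration levels reach 3/3/3/3.
}
```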

I am a bit behind on some of the testing procedures due to setbacks in the data department, but it shouldn’t be too hard to catch up after completing the device assembly.

So, briefly, next week I’d like to: attempt to resolve the magnetometer not restoring its calibration, debug the battery visual on the ESP screen, and engage in some of our testing procedures.

Jamshed’s Status Report 4/19/2025

Wrapped up the barbell tracking feature, complete with an integrated detection model. The feature now builds a new video file with the bar-path overlay, which eliminates the UI scaling issues I was having before.

The app now has nearly complete functionality, with only the analysis view left. Figured out how to simultaneously read from Bluetooth and record video, so that risk has been resolved. The user can now record a lift (with or without the camera), fill out a form specifying lift type (squat, bench, deadlift), weight loaded, weight standard (kg or lb), rep count, and perceived exertion level (1-10). Upon submission, the result is stored in an on-device database.

While the lift is being recorded, a UI view shows the video (if activated) and a display that calculates the mean concentric velocity of each rep in real time. During the recording process, an array of JSON-formatted data packets is accumulated, which is then displayed to the user on the analysis page. The video is recorded normally and post-processed with the tracker to ensure fault tolerance in the recording pipeline (we don’t want any unforeseen errors in the tracking process to jeopardize successful data capture).

Since the above communication protocol between iPhone and sensor has been defined, final integration will be a breeze so long as we adhere to it.

Decided to do a complete UI refactor leveraging SwiftUI components, since my custom UI design was confusing and annoying to work with (probably why I’m not a designer…).

My progress is on schedule. I plan to finish the app early this week, and then help with validation of the sensor accuracy.

In order to build my portion of the project, I had to spend a lot of time learning about computer vision, machine learning, ESP32, BLE, and the iOS SDK. I spent lots of time scouring internet resources and developer documentation and leveraging generative AI as an assistant in my learning.

App screenshots included below:

Object Tracking/Detection on my own training footage:

https://drive.google.com/file/d/1Fi6js1EeYNZ9FUQiGCF5WpjY_qh7NgWp/view?usp=drive_link

Jason’s Status Report for 4/19/25

After specifying and discussing the communication protocol with Jamshed last week, I spent the first part of this week implementing it. This was done entirely in the Arduino code for the ESP32, so it communicates directly with Jamshed’s iOS platform without a Python script acting as a middleman. The protocol specifies how the data is sent and at what time interval. We also decided on an external start command that the ESP32 waits for before it begins transmitting data from the BNO IMU sensor it is connected to; this is separate from the internal start logic I implemented, which depends solely on a velocity threshold being exceeded (a rough sketch of the start-command handling is below). We tested this communication protocol during class time on Wednesday, after I had finished implementing it, and we were able to successfully integrate the two parts, with communication working on Jamshed’s phone during testing. The only setback was data accuracy, particularly of the velocity metric. I spent the rest of the week trying to fix this discrepancy, which required some further research into sensor fusion and Kalman filters. I am also rewriting my Python script to match the new communication protocol; I will use it as a simpler way to simulate communication with the phone, with a focus on checking velocity accuracy, since we know the devices’ communication integration should already be good to go for demo day. I believe we are on track once I finish fine-tuning the velocity accuracy, and we should be able to begin filming our demo video within the week.
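For illustration, here is a rough sketch of the start-command handling and fixed-interval streaming on the ESP32, assuming the stock arduino-esp32 BLE library (2.x-style API). The UUIDs, packet fields, send interval, and command bytes are placeholders, not our actual protocol.

```cpp
// Sketch: wait for an external start command on a write characteristic, then
// stream fixed-length packets on a notify characteristic at a set interval.
#include <Arduino.h>
#include <BLEDevice.h>
#include <BLEServer.h>
#include <BLE2902.h>

#define SERVICE_UUID "4fafc201-1fb5-459e-8fcc-c5c9c331914b"   // placeholder UUIDs
#define DATA_UUID    "beb5483e-36e1-4688-b7f5-ea07361b26a8"
#define CMD_UUID     "beb5483e-36e1-4688-b7f5-ea07361b26a9"

struct __attribute__((packed)) DataPacket {
  uint32_t timestampMs;
  float    velocity;
};

BLECharacteristic *dataChar = nullptr;
volatile bool streaming = false;

// Handles the external start/stop command written by the iOS app.
class CmdCallback : public BLECharacteristicCallbacks {
  void onWrite(BLECharacteristic *c) override {
    if (c->getLength() > 0) {
      streaming = (c->getData()[0] == '1');   // '1' = start, '0' = stop (placeholder bytes)
    }
  }
};

void setup() {
  BLEDevice::init("BarbellSensor");
  BLEServer  *server  = BLEDevice::createServer();
  BLEService *service = server->createService(SERVICE_UUID);

  dataChar = service->createCharacteristic(DATA_UUID, BLECharacteristic::PROPERTY_NOTIFY);
  dataChar->addDescriptor(new BLE2902());     // lets the central enable notifications

  BLECharacteristic *cmdChar =
      service->createCharacteristic(CMD_UUID, BLECharacteristic::PROPERTY_WRITE);
  cmdChar->setCallbacks(new CmdCallback());

  service->start();
  BLEDevice::startAdvertising();
}

void loop() {
  static uint32_t lastSend = 0;
  const uint32_t intervalMs = 50;             // placeholder send interval

  if (streaming && millis() - lastSend >= intervalMs) {
    lastSend = millis();
    DataPacket pkt = { (uint32_t)millis(), 0.0f };  // velocity would come from the IMU pipeline
    dataChar->setValue((uint8_t *)&pkt, sizeof(pkt));
    dataChar->notify();
  }
}
```

The write callback only flips a flag; all of the timed sending happens in loop(), so a stop command takes effect on the next pass through the loop.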

Team Status Report 4/12/2025

We anticipate there being some friction with integrating the video recording, processing, and Bluetooth reading all simultaneously, which may or may not be out of scope depending on the progress of the sensor assembly. Additionally, we have a risk of not being able to save the calibration status of the sensor, which would result in the user having to calibrate the sensor before each lift. However, since the sensor should easily stay on long enough for a whole workout, this wouldn’t be the worst setback, since calibrating is quick and easy.

We have not changed our design or schedule.

Meadow’s Status Report for 4/12/2025

Since the last status report I’ve worked extensively on trying to get calibration to save on the sensor, but due to the lack of documentation it has been difficult to make work. I’ve found a couple of resources claiming to have done so, but even running those examples verbatim doesn’t result in a saved calibration. I will continue trying methods in the coming week in case I can fix the issue. Aside from that, I’ve discussed the physical build of the hardware portion with Jason and we’ve come to a general consensus on it. We’re currently waiting on a key part, but once it arrives we should be able to go to TechSpark and assemble holders for the parts on our barbell clip. Finally, I started exploring how to present our hardware data to the user in an easy-to-understand manner. Although we’re using Jamshed’s app for this, I wanted to be able to test ways to display things like balance before integrating them into the app. We were never going to use a full Arduino for the final build, and our new ESP32 models (what we will actually be using) have arrived, so I got started on the setup between the new ESP32 and the BNO055 sensor using our existing Arduino script in the IDE (a minimal setup sketch is below). In short, I added the ESP32 as a board in Arduino so we can run our script in the same way, just with the right components. There’s an Arduino IO site that I also began setting up so I can experiment with turning the data from my script into visual feedback, similar to how it will appear in our app. I think I’m a bit behind on having the sensor features working well enough to be sent directly to the app, but hopefully I can close some of those gaps in the coming week and a half before the final presentation. So, next week I hope to figure out the best way to display balance to users and make sure that Jamshed is able to read the values output by the BNO into his app in the same way we see them printed from my IDE.
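As a reference point, here is a minimal sketch of the kind of ESP32/BNO055 setup described above, assuming the Adafruit_BNO055 library over I2C; the printed fields and sample rate are placeholders.

```cpp
// Minimal ESP32 + BNO055 setup: initialize the IMU over I2C and print linear
// acceleration, matching what we check in the serial monitor before the data
// ever reaches the app.
#include <Wire.h>
#include <Adafruit_Sensor.h>
#include <Adafruit_BNO055.h>

Adafruit_BNO055 bno = Adafruit_BNO055(55);   // default I2C address 0x28

void setup() {
  Serial.begin(115200);
  Wire.begin();                              // default ESP32 I2C pins (21/22)
  if (!bno.begin()) {
    Serial.println("BNO055 not detected, check wiring");
    while (1);
  }
  bno.setExtCrystalUse(true);                // use the external crystal, per Adafruit docs
}

void loop() {
  // Linear acceleration = accelerometer with gravity removed by the BNO's fusion.
  imu::Vector<3> accel = bno.getVector(Adafruit_BNO055::VECTOR_LINEARACCEL);
  Serial.print(accel.x()); Serial.print(", ");
  Serial.print(accel.y()); Serial.print(", ");
  Serial.println(accel.z());
  delay(100);                                // ~10 Hz, just for readability
}
```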