Jamshed’s Status Report 4/19/2025

Wrapped up the barbell tracking feature, complete with an integrated detection model. The feature now builds a new video file with the bar-path overlay, which eliminates the UI scaling issues I was having before.
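
For reference, a minimal sketch of how the detection seeds the tracker through Vision ("BarbellDetector" is a placeholder class name for whatever CoreML model actually ships in the app; its labels and configuration may differ):

```swift
import Vision
import CoreML

// Minimal sketch: run the (placeholder) CoreML detection model on a frame and
// return the best barbell-plate bounding box, if any.
func detectPlate(in pixelBuffer: CVPixelBuffer) throws -> CGRect? {
    let model = try VNCoreMLModel(for: BarbellDetector(configuration: MLModelConfiguration()).model)
    let request = VNCoreMLRequest(model: model)
    request.imageCropAndScaleOption = .scaleFill

    let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, options: [:])
    try handler.perform([request])

    // Take the highest-confidence detection, if any.
    let best = (request.results as? [VNRecognizedObjectObservation])?
        .max(by: { $0.confidence < $1.confidence })

    // Vision returns a normalized bounding box (origin at the bottom-left).
    return best?.boundingBox
}
```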

The app now has nearly complete functionality, with only the analysis view left. Figured out how to read from Bluetooth and record video simultaneously, so that risk has been resolved. The user now records a lift (with or without the camera) and fills out a form specifying lift type (squat, bench, deadlift), weight loaded, weight unit (kg or lb), rep count, and perceived exertion level (1-10). Upon submission, the result is stored in an on-device database.
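
Roughly, the record the form produces looks like the following (sketched here with SwiftData; the exact field names and storage layer may shift as the analysis view comes together):

```swift
import Foundation
import SwiftData

// Sketch of the lift record created when the form is submitted.
enum LiftType: String, Codable { case squat, bench, deadlift }
enum WeightUnit: String, Codable { case kg, lb }

@Model
final class LiftRecord {
    var liftType: LiftType
    var weight: Double
    var unit: WeightUnit
    var repCount: Int
    var rpe: Int              // perceived exertion, 1-10
    var videoURL: URL?        // nil when recorded without the camera
    var date: Date

    init(liftType: LiftType, weight: Double, unit: WeightUnit,
         repCount: Int, rpe: Int, videoURL: URL? = nil, date: Date = .now) {
        self.liftType = liftType
        self.weight = weight
        self.unit = unit
        self.repCount = repCount
        self.rpe = rpe
        self.videoURL = videoURL
        self.date = date
    }
}

// On submission the record is inserted into the model context, e.g.:
// modelContext.insert(LiftRecord(liftType: .squat, weight: 140, unit: .kg, repCount: 5, rpe: 8))
```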

While the lift is being recorded, a UI view shows the video (if enabled) alongside a display that calculates the mean concentric velocity of each rep in real time. During recording, an array of JSON-formatted data packets is accumulated, which will then be displayed to the user on the analysis page. The video itself is recorded normally and post-processed with the tracker to keep the recording pipeline fault-tolerant (I don't want any unforeseen errors in the tracking process to jeopardize successful data capture).
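
The packets are just small Codable structs decoded from the BLE payloads, roughly along these lines (the field names here are illustrative and still subject to change until the sensor format is final):

```swift
import Foundation

// Hypothetical shape of one JSON packet accumulated during a recording.
struct SensorPacket: Codable {
    let timestamp: Double     // seconds since the start command
    let velocity: Double      // instantaneous bar velocity, m/s
    let repIndex: Int         // which rep this sample belongs to
}

// Packets arrive as BLE payloads and are appended as they are decoded:
var packets: [SensorPacket] = []

func handle(payload: Data) {
    if let packet = try? JSONDecoder().decode(SensorPacket.self, from: payload) {
        packets.append(packet)
    }
}
```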

Since the communication protocol between the iPhone and the sensor has been defined, final integration should be a breeze so long as we adhere to it.

Decided to do a complete UI refactor using standard SwiftUI components, since my custom UI design was confusing and annoying to work with (probably why I’m not a designer…).

My progress is on schedule. I plan to finish the app early this week, and then help with validation of the sensor accuracy.

App screenshots included below:

Object Tracking/Detection on my own training footage:

https://drive.google.com/file/d/1Fi6js1EeYNZ9FUQiGCF5WpjY_qh7NgWp/view?usp=drive_link

Team Status Report 4/12/2025

We anticipate some friction with integrating the video recording, processing, and Bluetooth reading all simultaneously, which may or may not be out of scope depending on the progress of the sensor assembly. Additionally, there is a risk that we will not be able to save the calibration state of the sensor, which would mean the user has to calibrate the sensor before each lift. However, since the sensor should easily remain on long enough for a whole workout, this wouldn’t be a major setback, as calibration is quick and easy.

We have not changed our design or schedule.

Jamshed’s Status Report for 4/12/2025

Implemented a Bluetooth Manager with Swift’s CoreBluetooth to send start/stop commands and read dummy data from the ESP32. Wrote boilerplate multithreaded firmware for the ESP32 so our sensor device can listen for commands from the host iOS device and transmit data simultaneously. Used the above firmware and Bluetooth Manager to create a testbench app that uses the Charts API to plot dummy data values from the ESP32, transmitted over BLE in JSON format.
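
A condensed sketch of the Bluetooth Manager is below. The service and characteristic UUIDs are placeholders; the real ones are defined in the ESP32 firmware.

```swift
import CoreBluetooth

// Sketch of the central-role manager: scan, connect, subscribe to the data
// characteristic, and write "start"/"stop" commands to the command characteristic.
final class BluetoothManager: NSObject, CBCentralManagerDelegate, CBPeripheralDelegate {
    private let serviceUUID = CBUUID(string: "FFE0")   // placeholder
    private let commandUUID = CBUUID(string: "FFE1")   // placeholder: start/stop commands
    private let dataUUID    = CBUUID(string: "FFE2")   // placeholder: notifies JSON packets

    private var central: CBCentralManager!
    private var sensor: CBPeripheral?
    private var commandCharacteristic: CBCharacteristic?

    override init() {
        super.init()
        central = CBCentralManager(delegate: self, queue: nil)
    }

    func centralManagerDidUpdateState(_ central: CBCentralManager) {
        if central.state == .poweredOn {
            central.scanForPeripherals(withServices: [serviceUUID])
        }
    }

    func centralManager(_ central: CBCentralManager, didDiscover peripheral: CBPeripheral,
                        advertisementData: [String: Any], rssi RSSI: NSNumber) {
        sensor = peripheral
        central.stopScan()
        central.connect(peripheral)
    }

    func centralManager(_ central: CBCentralManager, didConnect peripheral: CBPeripheral) {
        peripheral.delegate = self
        peripheral.discoverServices([serviceUUID])
    }

    func peripheral(_ peripheral: CBPeripheral, didDiscoverServices error: Error?) {
        peripheral.services?.forEach { peripheral.discoverCharacteristics(nil, for: $0) }
    }

    func peripheral(_ peripheral: CBPeripheral,
                    didDiscoverCharacteristicsFor service: CBService, error: Error?) {
        for characteristic in service.characteristics ?? [] {
            if characteristic.uuid == commandUUID { commandCharacteristic = characteristic }
            if characteristic.uuid == dataUUID {
                peripheral.setNotifyValue(true, for: characteristic)   // stream packets
            }
        }
    }

    func peripheral(_ peripheral: CBPeripheral,
                    didUpdateValueFor characteristic: CBCharacteristic, error: Error?) {
        guard let payload = characteristic.value else { return }
        _ = payload   // decode the JSON packet here
    }

    func send(command: String) {   // e.g. "start" / "stop"
        guard let characteristic = commandCharacteristic, let sensor else { return }
        sensor.writeValue(Data(command.utf8), for: characteristic, type: .withResponse)
    }
}
```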

Wrote a short Python script to experiment with different methods of calculating Mean Concentric Velocity, which will be translated to Swift and run in the iOS app.
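
The simplest of the methods, once translated to Swift, looks roughly like this: within a single rep, average only the concentric (upward, positive-velocity) samples.

```swift
// Sketch of one candidate MCV calculation; the script also experiments with
// other approaches, so this is not necessarily the final method.
func meanConcentricVelocity(repSamples: [Double]) -> Double {
    let concentric = repSamples.filter { $0 > 0 }   // keep only the upward phase
    guard !concentric.isEmpty else { return 0 }
    return concentric.reduce(0, +) / Double(concentric.count)
}
```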

Created startup page and home page UI for the iOS app.

My progress is on schedule. Once I ensure that the BLE communication between the sensor and iPhone is robust (which I think it almost is), I plan to integrate it into the RiseVBT main app. I will also try to polish the object tracking feature and integrate it as well, as a separate feature for now.

Jamshed’s Status Report for 3/29/2025

This week I spent time troubleshooting the tracking implementation. It seems like the tracking itself is accurate, and the inaccuracies/drifting I’m seeing are due to dimensional mismatches between the video display and the overlaid tracking result. SwiftUI seems to be a little inconsistent with the video sizing, so I’m now developing a Video Writer that will overlay the tracking results frame by frame into a new video file, which should demonstrate accuracy more clearly.
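
The core of the Video Writer is a read-draw-append loop over the source video, roughly like the sketch below (audio, orientation handling, and proper error handling are omitted, and boxForFrame is assumed to hand back the tracked box already converted to pixel coordinates):

```swift
import AVFoundation
import CoreGraphics

// Sketch: read each frame, draw the tracked box into it, append it to a new file.
func writeOverlayVideo(source: URL, destination: URL,
                       boxForFrame: (Int) -> CGRect) throws {
    let asset = AVAsset(url: source)
    guard let track = asset.tracks(withMediaType: .video).first else { return }

    let reader = try AVAssetReader(asset: asset)
    let readerOutput = AVAssetReaderTrackOutput(track: track, outputSettings: [
        kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA
    ])
    reader.add(readerOutput)

    let size = track.naturalSize
    let writer = try AVAssetWriter(outputURL: destination, fileType: .mov)
    let writerInput = AVAssetWriterInput(mediaType: .video, outputSettings: [
        AVVideoCodecKey: AVVideoCodecType.h264,
        AVVideoWidthKey: size.width,
        AVVideoHeightKey: size.height
    ])
    let adaptor = AVAssetWriterInputPixelBufferAdaptor(assetWriterInput: writerInput,
                                                       sourcePixelBufferAttributes: nil)
    writer.add(writerInput)

    reader.startReading()
    writer.startWriting()
    writer.startSession(atSourceTime: .zero)

    var frameIndex = 0
    while let sample = readerOutput.copyNextSampleBuffer(),
          let pixelBuffer = CMSampleBufferGetImageBuffer(sample) {
        let time = CMSampleBufferGetPresentationTimeStamp(sample)

        // Draw the overlay directly into the frame's pixel buffer.
        CVPixelBufferLockBaseAddress(pixelBuffer, [])
        if let context = CGContext(data: CVPixelBufferGetBaseAddress(pixelBuffer),
                                   width: CVPixelBufferGetWidth(pixelBuffer),
                                   height: CVPixelBufferGetHeight(pixelBuffer),
                                   bitsPerComponent: 8,
                                   bytesPerRow: CVPixelBufferGetBytesPerRow(pixelBuffer),
                                   space: CGColorSpaceCreateDeviceRGB(),
                                   bitmapInfo: CGBitmapInfo.byteOrder32Little.rawValue |
                                               CGImageAlphaInfo.premultipliedFirst.rawValue) {
            context.setStrokeColor(CGColor(red: 1, green: 0, blue: 0, alpha: 1))
            context.setLineWidth(4)
            context.stroke(boxForFrame(frameIndex))
        }
        CVPixelBufferUnlockBaseAddress(pixelBuffer, [])

        while !writerInput.isReadyForMoreMediaData { usleep(1000) }
        adaptor.append(pixelBuffer, withPresentationTime: time)
        frameIndex += 1
    }

    writerInput.markAsFinished()
    writer.finishWriting { }   // completion handler omitted in this sketch
}
```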

Progress is on schedule. I hope to finish my video writer, integrate with the barbell sensors, and start developing the UI for the app next week.

Jamshed’s Status Report 3/22/2025

This week, I worked on the Ethics Assignment, Object Tracking Implementation, and initial UI for the IMU data visualization.

Object Tracking now processes the full video before visualization, instead of frame by frame. Some issues are arising with the tracking drifting; I haven’t figured out whether they’re due to image dimension mismatches or to the tracking implementation itself. The tracking is generally accurate, but tends to extend a bit past the edges of movement, especially for wide-aspect-ratio videos. Screenshots of both scenarios are below, as well as the initial UI for the data.
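
For reference, the conversion I suspect is going wrong: Vision reports normalized boxes with a bottom-left origin, while the overlay needs top-left-origin pixels. A rough sketch, assuming the video is displayed at its native size (which SwiftUI may not actually be doing, hence the possible mismatch):

```swift
import Vision
import CoreGraphics

// Convert a Vision bounding box (normalized, bottom-left origin) into the
// overlay's coordinate space (pixels, top-left origin).
func overlayRect(for normalized: CGRect, displaySize: CGSize) -> CGRect {
    var rect = VNImageRectForNormalizedRect(normalized,
                                            Int(displaySize.width),
                                            Int(displaySize.height))
    rect.origin.y = displaySize.height - rect.origin.y - rect.height   // flip vertically
    return rect
}
```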

Progress is according to schedule.

Next week I hope to get some real data from the IMU to plot and play around with how I want to visualize the different data values we’re recording.

Jamshed’s Status Report 3/15/2025

This week I spent time developing the UI for the data visualization, doing parts 1 and 2 of the Ethics Report, and working on the Object Tracking implementation. I’m still trying to learn how the algorithms work so I can modify them for our use case.

Progress is on schedule.

I hope to be able to integrate my data visualization with the hardware next week.

Team Status Report for 3/8/2025

Overall, our main risks have to do with the parts we still have in transit. We have yet to receive the power system for our devices. In the meantime, we’ve been testing our IMU and ESP while they’re attached to the computer. Once we receive the parts, we intend to integrate them quickly so we can get a better grasp of how our device works untethered.

We haven’t made any changes to the existing design of the system, and our schedule hasn’t changed.

Jamshed’s Status Report for 3/8/2025

This week I spent much of my time working on the design report with Jason and Meadow. I also designed an app using the iOS Bluetooth API to read IMU data from the ESP microcontroller. The app can be found in this GitHub repo: https://github.com/jpanthaki/risevbt

My progress is up to date. The object tracking computer vision feature is still a work in progress; I’m still trying to figure out how to get the UI to work the way I want it to.

I hope to create an initial version of the data visualization UI with a graphical representation of mock IMU data, so that I can begin to integrate it with the Bluetooth feature. I also hope to make more progress with the object tracking feature.
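
The rough shape I have in mind for the visualization, sketched with Swift Charts and placeholder field names:

```swift
import SwiftUI
import Charts

// Mock IMU sample used while the real packet format is still being pinned down.
struct IMUSample: Identifiable {
    let id = UUID()
    let time: Double       // seconds
    let velocity: Double   // m/s
}

// Simple line chart of velocity over time for the initial data visualization UI.
struct VelocityChart: View {
    let samples: [IMUSample]

    var body: some View {
        Chart(samples) { sample in
            LineMark(
                x: .value("Time (s)", sample.time),
                y: .value("Velocity (m/s)", sample.velocity)
            )
        }
        .frame(height: 240)
    }
}
```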

Jamshed’s Status Report for 2/22/2025

Spent much of Sunday and Monday morning preparing for my presentation. I then took some more time playing with the iOS SDK until I was comfortable with the syntax and the Xcode IDE. Developed some initial UI for the video playback feature using the Photos and Vision APIs. I plan to integrate an object tracker provided by the Vision API into this UI.

Now that I’ve found a solution for the object tracking using the Vision API, a lot of the load is off the computer vision portion of the project, which should hopefully expedite the path towards integrating with the hardware.

Next week, I hope to have V1 of the object tracker working. Currently, I plan to have the user draw a bounding box around the barbell plate in their side-view video, so I have something to iterate on if I later decide to use a model to detect the plate automatically. I also hope to integrate with the hardware and be able to read data over Bluetooth.
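
The V1 tracking loop I’m planning is roughly the following: the user-drawn box (in Vision’s normalized coordinates) seeds a VNTrackObjectRequest, and each new frame updates the observation. This is a sketch, not the final implementation.

```swift
import Vision

// Sketch of a per-frame tracker seeded from the user's drawn bounding box.
final class PlateTracker {
    private var lastObservation: VNDetectedObjectObservation
    private let handler = VNSequenceRequestHandler()

    init(initialBox: CGRect) {   // normalized box from the user's drawn rectangle
        lastObservation = VNDetectedObjectObservation(boundingBox: initialBox)
    }

    // Returns the tracked box (normalized) for the next frame, or nil if tracking fails.
    func track(pixelBuffer: CVPixelBuffer) -> CGRect? {
        let request = VNTrackObjectRequest(detectedObjectObservation: lastObservation)
        request.trackingLevel = .accurate
        try? handler.perform([request], on: pixelBuffer)
        guard let result = request.results?.first as? VNDetectedObjectObservation else {
            return nil
        }
        lastObservation = result
        return result.boundingBox
    }
}
```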

Jamshed’s Status Report for 2/15/2025

Did lots of research for the design review presentation. Solidified which software stack I wanted to use and decided on an overall design for the software architecture. Drew up some block diagrams. Spent some time with online iOS development courses to begin my app design. Set up a Django REST API.

Progress is on schedule. I plan to spend more time getting a basic iOS app up and running next week and to start playing with the iOS SDK’s Bluetooth API, so we can try to integrate with the sensors as soon as possible.