Wrapped up the barbell tracking feature, complete with an integrated detection model. The feature now builds a new video file with the bar-path overlay baked in, which eliminates the UI scaling issues I was having before.
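For reference, here's a rough sketch of how that overlay export could work with AVFoundation. The function name and the `barPath` input (tracked bar positions in pixel coordinates) are placeholders; the real tracker output format may differ.

```swift
import AVFoundation
import UIKit

/// Burn a bar-path overlay into a new video file.
/// `barPath` is assumed to be the tracker's output: bar positions in video-pixel coordinates.
func exportVideo(from sourceURL: URL, to outputURL: URL,
                 barPath: [CGPoint],
                 completion: @escaping (Bool) -> Void) {
    let asset = AVAsset(url: sourceURL)
    guard let track = asset.tracks(withMediaType: .video).first else {
        completion(false); return
    }
    let size = track.naturalSize

    // Draw the full path once on a shape layer; it is composited over every frame.
    // Note: the Core Animation layer origin is bottom-left here, so tracked
    // points may need their y-coordinates flipped first.
    let path = UIBezierPath()
    if let first = barPath.first {
        path.move(to: first)
        barPath.dropFirst().forEach { path.addLine(to: $0) }
    }
    let overlay = CAShapeLayer()
    overlay.frame = CGRect(origin: .zero, size: size)
    overlay.path = path.cgPath
    overlay.strokeColor = UIColor.red.cgColor
    overlay.fillColor = nil
    overlay.lineWidth = 4

    // Parent layer holds the video frames underneath the overlay.
    let videoLayer = CALayer()
    videoLayer.frame = CGRect(origin: .zero, size: size)
    let parent = CALayer()
    parent.frame = videoLayer.frame
    parent.addSublayer(videoLayer)
    parent.addSublayer(overlay)

    let composition = AVMutableVideoComposition(propertiesOf: asset)
    composition.animationTool = AVVideoCompositionCoreAnimationTool(
        postProcessingAsVideoLayer: videoLayer, in: parent)

    guard let export = AVAssetExportSession(asset: asset,
                                            presetName: AVAssetExportPresetHighestQuality) else {
        completion(false); return
    }
    export.videoComposition = composition
    export.outputURL = outputURL
    export.outputFileType = .mp4
    export.exportAsynchronously { completion(export.status == .completed) }
}
```

Writing a fresh file this way sidesteps view-layout math entirely, since the overlay lives in the video's own coordinate space.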
App now has nearly complete functionality, with only the analysis view left. Figured out how to simultaneously read from Bluetooth and record video, so that risk has been resolved. The user now records a lift (with or without camera), then fills out a form specifying lift type (squat, bench, deadlift), weight loaded, weight unit (kg or lb), rep count, and perceived exertion level (1-10). Upon submission, the result is stored in an on-device database.
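For context, the form data maps onto a record roughly like the following. All names here are hypothetical, and the actual schema and choice of on-device store aren't shown.

```swift
import Foundation

// Hypothetical data model behind the lift-entry form; the real on-device
// database (Core Data, SQLite, etc.) is a separate concern.
enum LiftType: String, Codable, CaseIterable { case squat, bench, deadlift }
enum WeightUnit: String, Codable, CaseIterable { case kg, lb }

struct LiftRecord: Codable, Identifiable {
    let id: UUID
    let date: Date
    let liftType: LiftType
    let weight: Double       // amount loaded, in `unit`
    let unit: WeightUnit
    let repCount: Int
    let rpe: Int             // perceived exertion, 1-10
    let videoURL: URL?       // nil when recorded without camera
}
```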
While the lift is recording, a view shows the live video (if the camera is active) alongside a readout that calculates the mean concentric velocity of each rep in real time. During the recording, an array of JSON-formatted data packets is accumulated, which will later be displayed to the user on the analysis page. The video itself is recorded normally and post-processed with the tracker afterward to keep the recording pipeline fault-tolerant (I don't want any unforeseen errors in the tracking process to jeopardize successful data capture).
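Roughly, the per-rep velocity math looks like this. The packet fields are assumptions about the sensor's JSON format, and splitting the stream into reps is assumed to happen upstream.

```swift
import Foundation

// Assumed shape of the JSON packets accumulated during a recording;
// field names are placeholders, not the actual wire format.
struct SensorPacket: Codable {
    let timestamp: TimeInterval   // seconds since recording start
    let velocity: Double          // instantaneous bar velocity, m/s (+ = upward)
}

// Mean concentric velocity of one rep: average the upward-velocity samples.
// Assumes the packets for a single rep were already segmented upstream.
func meanConcentricVelocity(of rep: [SensorPacket]) -> Double? {
    let concentric = rep.filter { $0.velocity > 0 }
    guard !concentric.isEmpty else { return nil }
    return concentric.map(\.velocity).reduce(0, +) / Double(concentric.count)
}
```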
Since the above communication protocol between the iPhone and the sensor has been defined, final integration should be a breeze so long as we adhere to it.
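As a sketch of the phone side of that protocol, the CoreBluetooth plumbing looks roughly like this. The service and characteristic UUIDs below are placeholders, not the sensor's real identifiers.

```swift
import CoreBluetooth

// Minimal sketch: scan for the sensor, subscribe to its packet
// characteristic, and hand each raw JSON packet to a decoder callback.
final class SensorLink: NSObject, CBCentralManagerDelegate, CBPeripheralDelegate {
    private let sensorService = CBUUID(string: "FFE0")        // placeholder UUID
    private let packetCharacteristic = CBUUID(string: "FFE1") // placeholder UUID
    private var central: CBCentralManager!
    private var sensor: CBPeripheral?
    var onPacket: ((Data) -> Void)?   // raw JSON packet handed off for decoding

    override init() {
        super.init()
        central = CBCentralManager(delegate: self, queue: nil)
    }

    func centralManagerDidUpdateState(_ central: CBCentralManager) {
        guard central.state == .poweredOn else { return }
        central.scanForPeripherals(withServices: [sensorService])
    }

    func centralManager(_ central: CBCentralManager, didDiscover peripheral: CBPeripheral,
                        advertisementData: [String: Any], rssi RSSI: NSNumber) {
        sensor = peripheral            // retain the peripheral before connecting
        peripheral.delegate = self
        central.stopScan()
        central.connect(peripheral)
    }

    func centralManager(_ central: CBCentralManager, didConnect peripheral: CBPeripheral) {
        peripheral.discoverServices([sensorService])
    }

    func peripheral(_ peripheral: CBPeripheral, didDiscoverServices error: Error?) {
        peripheral.services?.forEach {
            peripheral.discoverCharacteristics([packetCharacteristic], for: $0)
        }
    }

    func peripheral(_ peripheral: CBPeripheral,
                    didDiscoverCharacteristicsFor service: CBService, error: Error?) {
        service.characteristics?
            .filter { $0.uuid == packetCharacteristic }
            .forEach { peripheral.setNotifyValue(true, for: $0) }
    }

    func peripheral(_ peripheral: CBPeripheral,
                    didUpdateValueFor characteristic: CBCharacteristic, error: Error?) {
        if let data = characteristic.value { onPacket?(data) }
    }
}
```

Because the packets arrive via notifications on their own callback, this can run alongside the AVFoundation capture session without either blocking the other.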
Decided to do a complete UI refactor leveraging SwiftUI components since my custom UI design was confusing and annoying to work with (probably why I’m not a designer…).
My progress is on schedule. I plan to finish the app early this week, and then help with validation of the sensor accuracy.
App screenshots included below:
Object Tracking/Detection on my own training footage:
https://drive.google.com/file/d/1Fi6js1EeYNZ9FUQiGCF5WpjY_qh7NgWp/view?usp=drive_link