Jamshed’s Status Report for 4/12/2025

Implemented a Bluetooth Manager with Swift’s CoreBluetooth to send start/stop commands and read dummy data from the ESP32. Wrote boilerplate multithreaded firmware for the ESP32 so that our sensor device can listen for commands from the host iOS device and transmit data simultaneously. Used the above firmware and Bluetooth Manager to create a testbench app that uses the Charts API to plot dummy data values from the ESP32, transmitted over BLE in JSON format.
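The core of the Bluetooth Manager looks roughly like the sketch below; the service/characteristic UUIDs, command strings, and JSON sample shape are placeholders for illustration, not our final values.

```swift
import Foundation
import CoreBluetooth

// Minimal sketch: scan for the ESP32, connect, subscribe to a data
// characteristic, and write start/stop commands. UUIDs are placeholders.
final class BluetoothManager: NSObject, CBCentralManagerDelegate, CBPeripheralDelegate {
    private let serviceUUID = CBUUID(string: "FFE0")   // hypothetical service
    private let commandUUID = CBUUID(string: "FFE1")   // hypothetical write characteristic
    private let dataUUID    = CBUUID(string: "FFE2")   // hypothetical notify characteristic

    private var central: CBCentralManager!
    private var peripheral: CBPeripheral?
    private var commandCharacteristic: CBCharacteristic?

    var onSample: (([String: Double]) -> Void)?        // decoded JSON sample

    override init() {
        super.init()
        central = CBCentralManager(delegate: self, queue: nil)
    }

    func centralManagerDidUpdateState(_ central: CBCentralManager) {
        guard central.state == .poweredOn else { return }
        central.scanForPeripherals(withServices: [serviceUUID])
    }

    func centralManager(_ central: CBCentralManager, didDiscover peripheral: CBPeripheral,
                        advertisementData: [String: Any], rssi RSSI: NSNumber) {
        self.peripheral = peripheral
        peripheral.delegate = self
        central.stopScan()
        central.connect(peripheral)
    }

    func centralManager(_ central: CBCentralManager, didConnect peripheral: CBPeripheral) {
        peripheral.discoverServices([serviceUUID])
    }

    func peripheral(_ peripheral: CBPeripheral, didDiscoverServices error: Error?) {
        guard let service = peripheral.services?.first(where: { $0.uuid == serviceUUID })
        else { return }
        peripheral.discoverCharacteristics([commandUUID, dataUUID], for: service)
    }

    func peripheral(_ peripheral: CBPeripheral,
                    didDiscoverCharacteristicsFor service: CBService, error: Error?) {
        for c in service.characteristics ?? [] {
            if c.uuid == commandUUID { commandCharacteristic = c }
            if c.uuid == dataUUID { peripheral.setNotifyValue(true, for: c) }
        }
    }

    func peripheral(_ peripheral: CBPeripheral,
                    didUpdateValueFor characteristic: CBCharacteristic, error: Error?) {
        guard characteristic.uuid == dataUUID, let data = characteristic.value,
              let sample = try? JSONSerialization.jsonObject(with: data) as? [String: Double]
        else { return }
        onSample?(sample)   // hand the plotted values to the Charts testbench
    }

    func send(command: String) {   // e.g. send(command: "start") or "stop"
        guard let p = peripheral, let c = commandCharacteristic else { return }
        p.writeValue(Data(command.utf8), for: c, type: .withResponse)
    }
}
```

Keeping the whole BLE state machine in one delegate object should make it easy to lift into the main app later.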

Wrote a short Python script to experiment with different methods of calculating Mean Concentric Velocity (MCV); whichever method we settle on will be translated to Swift and run in the iOS app.
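One of the candidate methods, integrating acceleration to velocity and averaging over the upward phase, would translate to Swift roughly like this; the sample rate and the velocity-above-zero phase rule are illustrative assumptions, not necessarily what the script settles on.

```swift
import Foundation

// Illustrative sketch: integrate vertical acceleration (m/s^2) to velocity
// (m/s), then average over the concentric (upward) phase. The 100 Hz sample
// rate and the "velocity > 0" phase rule are placeholder assumptions.
func meanConcentricVelocity(accelerations: [Double],
                            sampleRate: Double = 100.0) -> Double? {
    guard accelerations.count > 1 else { return nil }
    let dt = 1.0 / sampleRate

    // Trapezoidal integration of acceleration to velocity.
    var velocities = [0.0]
    for i in 1..<accelerations.count {
        velocities.append(velocities[i - 1]
            + 0.5 * (accelerations[i - 1] + accelerations[i]) * dt)
    }

    // Approximate the concentric phase as the upward-velocity samples.
    let concentric = velocities.filter { $0 > 0 }
    guard !concentric.isEmpty else { return nil }
    return concentric.reduce(0, +) / Double(concentric.count)
}
```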

Created startup page and home page UI for the iOS app.

My progress is on schedule. Once I ensure that the BLE communication between the sensor and the iPhone is robust (which I think it nearly is), I plan to integrate it into the main RiseVBT app. I will also try to polish the object tracking feature and integrate it as well, as a separate feature for now.

Jamshed’s Status Report for 3/29/2025

This week I spent time troubleshooting the tracking implementation. It seems the tracking itself is accurate, and the inaccuracies/drifting I’m witnessing are due to dimensional mismatches between the video display and the overlaid tracking result. SwiftUI seems to be a little inconsistent with video sizing, so I’m now developing a Video Writer that will overlay the tracking results frame-by-frame into a new video file, which should demonstrate the accuracy more clearly.
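The writer will look roughly like the sketch below: decode frames with AVAssetReader, stroke the tracked box into each frame’s pixel buffer, and append the frames to a new file with AVAssetWriter. The `boxForTime` callback and the codec settings are placeholders, not the actual implementation.

```swift
import AVFoundation
import CoreGraphics

// Sketch of a frame-by-frame overlay writer. `boxForTime` stands in for
// however per-frame tracking results are stored; rects are assumed to be in
// bottom-left-origin pixel coordinates, matching the bitmap context below.
func writeOverlayVideo(from sourceURL: URL, to outputURL: URL,
                       boxForTime: (CMTime) -> CGRect?) throws {
    let asset = AVAsset(url: sourceURL)
    guard let track = asset.tracks(withMediaType: .video).first else { return }

    let reader = try AVAssetReader(asset: asset)
    let output = AVAssetReaderTrackOutput(track: track, outputSettings: [
        kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA
    ])
    reader.add(output)

    let writer = try AVAssetWriter(outputURL: outputURL, fileType: .mov)
    let input = AVAssetWriterInput(mediaType: .video, outputSettings: [
        AVVideoCodecKey: AVVideoCodecType.h264,
        AVVideoWidthKey: track.naturalSize.width,
        AVVideoHeightKey: track.naturalSize.height
    ])
    let adaptor = AVAssetWriterInputPixelBufferAdaptor(assetWriterInput: input,
                                                       sourcePixelBufferAttributes: nil)
    writer.add(input)

    reader.startReading()
    writer.startWriting()
    writer.startSession(atSourceTime: .zero)

    while let sample = output.copyNextSampleBuffer() {
        guard let buffer = CMSampleBufferGetImageBuffer(sample) else { continue }
        let time = CMSampleBufferGetPresentationTimeStamp(sample)

        // Draw the tracked box directly into the decoded frame's pixels.
        CVPixelBufferLockBaseAddress(buffer, [])
        if let box = boxForTime(time),
           let ctx = CGContext(data: CVPixelBufferGetBaseAddress(buffer),
                               width: CVPixelBufferGetWidth(buffer),
                               height: CVPixelBufferGetHeight(buffer),
                               bitsPerComponent: 8,
                               bytesPerRow: CVPixelBufferGetBytesPerRow(buffer),
                               space: CGColorSpaceCreateDeviceRGB(),
                               bitmapInfo: CGBitmapInfo.byteOrder32Little.rawValue
                                   | CGImageAlphaInfo.premultipliedFirst.rawValue) {
            ctx.setStrokeColor(CGColor(red: 1, green: 0, blue: 0, alpha: 1))
            ctx.setLineWidth(4)
            ctx.stroke(box)
        }
        CVPixelBufferUnlockBaseAddress(buffer, [])

        while !input.isReadyForMoreMediaData { usleep(5_000) }  // crude backpressure
        adaptor.append(buffer, withPresentationTime: time)
    }

    input.markAsFinished()
    writer.finishWriting {}  // completion fires asynchronously
}
```

Because the box is burned into the frame’s own pixel grid, any SwiftUI sizing quirks in the display path can no longer shift the overlay.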

Progress is on schedule. I hope to finish my video writer, integrate with the barbell sensors, and start developing the UI for the app next week.

Jamshed’s Status Report for 3/22/2025

This week, I worked on the Ethics Assignment, Object Tracking Implementation, and initial UI for the IMU data visualization.

Object Tracking now processes the full video before visualization, instead of frame by frame. Some issues are arising with the tracking drifting; I haven’t figured out whether they’re due to image dimension mismatches or to the tracking implementation itself. The tracking is generally accurate, but tends to extend a bit past the edges of movement, especially for wide-aspect-ratio videos. Screenshots of both scenarios are below, as well as the initial UI for the data.
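One suspect is the coordinate conversion itself: Vision reports bounding boxes normalized to [0, 1] with a bottom-left origin, while views use top-left coordinates, and an aspect-fitted video adds a letterbox offset on top. A sketch of the full conversion, with placeholder sizes:

```swift
import CoreGraphics

// Sketch of the normalized-to-view mapping that can produce drift-like
// offsets if any step is skipped. All sizes are placeholders.
func overlayRect(for normalized: CGRect, videoSize: CGSize, viewSize: CGSize) -> CGRect {
    // How the aspect-fitted video sits inside the view (letterboxing).
    let scale = min(viewSize.width / videoSize.width,
                    viewSize.height / videoSize.height)
    let fitted = CGSize(width: videoSize.width * scale,
                        height: videoSize.height * scale)
    let offset = CGPoint(x: (viewSize.width - fitted.width) / 2,
                         y: (viewSize.height - fitted.height) / 2)

    // Normalized (bottom-left origin) -> view points (top-left origin).
    let x = normalized.minX * fitted.width + offset.x
    let y = (1 - normalized.maxY) * fitted.height + offset.y
    return CGRect(x: x, y: y,
                  width: normalized.width * fitted.width,
                  height: normalized.height * fitted.height)
}
```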

Progress is according to schedule.

Next week I hope to get some real data from the IMU to plot and play around with how I want to visualize the different data values we’re recording.

[Screenshots: tracking drift examples and initial data visualization UI]

Jamshed’s Status Report for 3/15/2025

This week I spent time developing the UI for the data visualization, doing parts 1 and 2 of the Ethics Report, and working on the object tracking implementation. I’m still trying to learn how the algorithms work so I can modify them for our use case.

Progress is on schedule.

I hope to be able to integrate my data visualization with the hardware next week.

Jamshed’s Status Report for 3/8/2025

This week I spent a large amount of my time working on the design report with Jason and Meadow. I also designed an app using the iOS Bluetooth API to read IMU data from the ESP32 microcontroller. The app can be found in this GitHub repo: https://github.com/jpanthaki/risevbt

My progress is up to date. The object tracking computer vision feature is still a work in progress; I’m still trying to figure out how to get the UI to work the way I want it to.

I hope to create an initial version of the data visualization UI with a graphical representation of mock IMU data, so that I can begin to integrate it with the Bluetooth feature. I also hope to make more progress on the object tracking feature.

Jamshed’s Status Report for 2/22/2025

Spent much of Sunday and Monday morning preparing for my presentation. I then took some more time playing with the iOS SDK until I was comfortable with the syntax and the Xcode IDE. Developed some initial UI for the video playback feature using the Photos and Vision APIs. I plan to integrate an object tracker provided by the Vision API into this UI.

Now that I’ve found a solution for object tracking using the Vision API, a lot of the load is off the computer vision portion of the project, which will hopefully expedite integration with the hardware.

Next week, I hope to have V1 of the object tracker working. Currently, I plan to have the user draw a bounding box around the barbell plate in their side-view video, so that I have something to iterate on if I later decide to use a model to detect the plate automatically. I also hope to integrate with the hardware and be able to read data over Bluetooth.
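Seeding Vision’s tracker from that user-drawn box would look roughly like the sketch below; it assumes the box has already been converted to Vision’s normalized, bottom-left-origin coordinates, and `frames` stands in for the video’s decoded pixel buffers.

```swift
import Vision

// Sketch: seed Vision's object tracker with a user-drawn box, then feed each
// frame's result back in as the seed for the next frame.
func trackPlate(seedBox: CGRect, frames: [CVPixelBuffer]) -> [CGRect] {
    var observation = VNDetectedObjectObservation(boundingBox: seedBox)
    let handler = VNSequenceRequestHandler()
    var results: [CGRect] = []

    for frame in frames {
        let request = VNTrackObjectRequest(detectedObjectObservation: observation)
        request.trackingLevel = .accurate
        try? handler.perform([request], on: frame)
        if let tracked = request.results?.first as? VNDetectedObjectObservation {
            observation = tracked          // chain the result into the next frame
            results.append(tracked.boundingBox)
        }
    }
    return results
}
```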

Jamshed’s Status Report for 2/15/2025

Did lots of research for the design review presentation. Solidified which software stack I wanted to use and decided on an overall design for the software architecture. Drew up some block diagrams. Spent some time with online iOS development courses to begin my app design. Set up a Django REST API.

Progress is on schedule. I plan to spend more time getting a basic iOS app up and running next week and to start playing with the iOS SDK’s Bluetooth API, so we can integrate with the sensors as soon as possible.

Jamshed’s Status Report for 2/8/2025

Worked on designing the presentation and prepping Jason early in the week. Spent ample time researching software stacks and evaluating tradeoffs between different implementations. Set up a GitHub page for the application. Began to set up my development environment and refreshed my skills with Django and React.

Progress is going according to schedule. Design iteration is still ongoing as I decide which software stack is feasible and/or applicable to our application.

Next week, I will have a more finalized iteration of the software design. I will also collaborate with Jason and Meadow on the overall design, and complete and rehearse our design presentation slides.