On the hardware side, having completed the initial setup of the IMU sensor and Bluetooth microcontroller, one remaining risk is whether the IMU's data will be easy to process and send to the software platform. Although we have already captured and extracted data, it arrives as a continuous, unbounded stream. The risk lies in isolating the particular relevant data we need from that stream; if we cannot, the project's success is at risk. We plan to manage this mainly through the software platform and firmware by adding a user-triggered starting point in the software platform, so that only data from the relevant window is processed.
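One way the user-triggered starting point could work is to gate the incoming stream so samples are only kept between an explicit start and stop. The sketch below illustrates this idea in Python for brevity (the actual logic would live in the Swift app or the firmware); the class and method names are hypothetical, not part of our codebase.

```python
class IMUSessionRecorder:
    """Gate the continuous IMU stream on a user-triggered start point,
    so only samples inside the user-defined window are kept."""

    def __init__(self):
        self.samples = []
        self._recording = False

    def start_session(self):
        # User taps "start" in the app: clear old data and begin keeping samples.
        self.samples.clear()
        self._recording = True

    def stop_session(self):
        # User taps "stop": ignore everything that arrives afterward.
        self._recording = False

    def ingest(self, sample):
        # Called for every sample arriving over Bluetooth; samples outside
        # the user-defined window are discarded instead of buffered forever.
        if self._recording:
            self.samples.append(sample)
```

This keeps the infinite stream from ever being stored in full: anything outside the user's window is dropped on arrival.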
We also spent some time researching the iOS Photos and Vision APIs and developed an initial UI for the video-playback portion of the iOS app. Ultimately, we want to integrate the object tracker (provided by the Vision API) into this playback screen so the user can see their bar path.
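The Vision tracker reports a bounding box for the tracked object on each frame, and the bar path is essentially the trail of those boxes' centers drawn over the video. A minimal sketch of that conversion, written here in Python for brevity (the app itself would do this in Swift, where Vision returns normalized bounding boxes):

```python
def bar_path(observations):
    """Convert per-frame bounding boxes (x, y, width, height, in
    normalized 0-1 coordinates) into the sequence of center points
    that traces the bar path across the video."""
    return [(x + w / 2, y + h / 2) for (x, y, w, h) in observations]
```

The playback screen would then render this list of points as a polyline overlaid on the video frames.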
The only change we have prepared for on the hardware side of the design is moving to a fully integrated MCU that combines the IMU sensor and Bluetooth microcontroller. We have ordered the necessary MCUs; this week we simply iterated on our original hardware design and its setup so that when we implement the MCUs we ordered, the firmware will already have been written and the setup will be seamless.
Below are some photos of the hardware (sensor + Bluetooth microcontroller) progress we made this week after completing the initial sensor setup, Bluetooth integration, and IMU data extraction:
Hardware setup:
IMU Data while calibrating:
IMU data while at rest (semi-suspended):
iOS app: