This week, I started full dataset training for the YOLOv8 OBB model. I implemented the loss function with three tunable hyperparameters that weight the bounding box regression loss, the classification loss, and the angle loss. The intent was to add transparency to the unified loss calculation (the weighted sum of those three losses) and to let me reduce the penalty from any one loss that grows too large. This should give us better control over the model's convergence and support a checkpointing scheme that saves the best model for each of the four loss types (unified, regression, classification, and angle) for retraining later on. While the model trained, I started working on a Docker image for the Jetson to make porting the model over easier.
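As a rough sketch of the idea (not the exact implementation), the unified loss is a weighted sum of the three components, and the per-loss checkpointing just tracks the best value seen for each component. The function names, gain values, and dictionary keys below are placeholders I chose for illustration:

```python
import os
import torch

def unified_obb_loss(box_loss, cls_loss, angle_loss,
                     box_gain=1.0, cls_gain=1.0, angle_gain=1.0):
    """Weighted sum of the three OBB loss components.

    The three gains are the tunable hyperparameters; the defaults here
    are placeholders, not the values used in training.
    """
    return box_gain * box_loss + cls_gain * cls_loss + angle_gain * angle_loss

# Track the best (lowest) value seen so far for each loss type.
best = {"unified": float("inf"), "box": float("inf"),
        "cls": float("inf"), "angle": float("inf")}

def maybe_checkpoint(model, losses, save_dir="checkpoints"):
    """Save a separate 'best' checkpoint whenever any loss type improves."""
    os.makedirs(save_dir, exist_ok=True)
    for name, value in losses.items():
        if value < best[name]:
            best[name] = value
            torch.save(model.state_dict(),
                       os.path.join(save_dir, f"best_{name}.pt"))
```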
Next week, I will work on visualization code that plots the predicted bounding box and class on static images in a clean manner (the core logic will later be scaled out for the Jetson).
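A minimal sketch of what that visualization logic could look like, assuming OpenCV and predictions in a (cx, cy, w, h, angle) layout; the function name and prediction format are assumptions on my part:

```python
import cv2
import numpy as np

def draw_obb(image, cx, cy, w, h, angle_deg, label, color=(0, 255, 0)):
    """Draw one oriented bounding box and its class label on an image.

    The (cx, cy, w, h, angle) prediction layout is assumed here.
    """
    # Convert the rotated rectangle into its four corner points.
    corners = cv2.boxPoints(((cx, cy), (w, h), angle_deg)).astype(np.int32)
    cv2.polylines(image, [corners], isClosed=True, color=color, thickness=2)
    # Place the class label near the first corner of the box.
    x, y = int(corners[0][0]), int(corners[0][1])
    cv2.putText(image, label, (x, y), cv2.FONT_HERSHEY_SIMPLEX, 0.6, color, 2)
    return image
```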
Currently, I am on schedule.