Team Status Report for 3.26.2022

We attended the ethics discussion on Wednesday. It was nice hearing what teams outside of section D are doing and getting their feedback on our project. The discussion pushed us to think about the ethical implications of our project, which we had previously assumed were minor. To be fair to our users, we will be clearer about what data we are collecting and how we are using it.

This week, we worked on a swipe detection demo and on improving our gesture recognition algorithm. The algorithm is giving us trouble: since we are trying to avoid machine learning, we are testing different curve-fitting equations and changing how we collect our data. We are limited by how fast our sensors can sample, which hurts accuracy when a user performs a gesture quickly. We are recording accuracy as we tune hyperparameters and will map out these results over the next week to determine which parameters offer the highest accuracy.
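To give a sense of what this tuning loop looks like, here is a minimal sketch of one way a curve-fitting gesture classifier and its accuracy sweep could be structured. This is not our actual implementation: the fixed-length resampling, nearest-template matching on polynomial coefficients, and the function names are all assumptions for illustration.

```python
import numpy as np

def resample(samples, n):
    """Linearly resample a variable-length gesture trace to n points,
    partially compensating for the sensor's limited sampling rate."""
    t_old = np.linspace(0.0, 1.0, len(samples))
    return np.interp(np.linspace(0.0, 1.0, n), t_old, samples)

def fit_coeffs(samples, degree, n=32):
    """Fit a polynomial to the resampled trace; the coefficient vector
    acts as a small feature descriptor for the gesture."""
    t = np.linspace(0.0, 1.0, n)
    return np.polyfit(t, resample(samples, n), degree)

def classify(samples, templates, degree):
    """Nearest-template matching: pick the labeled template whose
    fitted coefficients are closest to this gesture's coefficients."""
    coeffs = fit_coeffs(samples, degree)
    dists = {label: np.linalg.norm(coeffs - fit_coeffs(tmpl, degree))
             for label, tmpl in templates.items()}
    return min(dists, key=dists.get)

def accuracy_map(dataset, templates, degrees):
    """Map each candidate polynomial degree (one hyperparameter) to
    classification accuracy over labeled (samples, label) pairs."""
    return {d: float(np.mean([classify(s, templates, d) == lbl
                              for s, lbl in dataset]))
            for d in degrees}
```

The same sweep structure extends to other parameters (window length, sampling rate, number of templates), producing the accuracy map described above.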

The most significant risk at the moment is the algorithm. If we are able to get the Jetson working by next Wednesday, the faster external computer may let us afford more computationally expensive parameters for higher accuracy, even if each gesture takes longer to process. We may switch from the Jetson to a Raspberry Pi if we can't get the Jetson working, but we aren't making that decision yet. The schedule has slipped, since we weren't expecting this much trouble with the Jetson; however, this is exactly why we allocated a lot of time for it.

We have started integrating the basic gesture detection from live sensor readings with the web application portion of the project. We streamed data to the Django app via the MQTT protocol and confirmed that the data reaches Unity and can rotate the model. A demo of the rotation on live data is shown in the video at the Google Drive link: https://drive.google.com/file/d/1h3q5Os-ycagcWNNDkVVVS57qLM31H1Ls/view?usp=sharing.
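For context, the sensor side of this pipeline can be as small as a paho-mqtt publisher loop like the sketch below. The broker address, topic name, and dummy sample contents are placeholders, not our actual configuration.

```python
import json
import time

import paho.mqtt.client as mqtt

client = mqtt.Client()
client.connect("localhost", 1883)  # hypothetical broker address
client.loop_start()

def publish_reading(reading):
    """Serialize one sensor sample as JSON and publish it so the Django
    subscriber (and, downstream, Unity) can consume it."""
    client.publish("gestures/raw", json.dumps(reading))  # placeholder topic

while True:
    # Dummy sample; a real sensor driver read would go here.
    publish_reading({"t": time.time(), "x": 0.0, "y": 0.0, "z": 0.0})
    time.sleep(0.02)  # ~50 Hz, bounded by the sensor's actual sample rate
```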

We are continuing to work on smoothing the rotation based on sensor data, since it doesn't rotate as smoothly as it should (although it does rotate in the correct direction). We also plan to test zoom in/out on live data this week.
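One common smoothing approach we could try here is an exponential moving average over the incoming rotation values. This is a sketch of that idea, not our current code; the alpha value is a tunable we would have to pick empirically.

```python
class EmaSmoother:
    """Exponential moving average filter for noisy rotation readings."""

    def __init__(self, alpha=0.2):
        self.alpha = alpha  # smaller alpha = smoother but laggier response
        self.state = None   # last smoothed value

    def update(self, raw):
        """Blend the new raw reading into the running smoothed estimate."""
        if self.state is None:
            self.state = raw
        else:
            self.state = self.alpha * raw + (1.0 - self.alpha) * self.state
        return self.state

smoother = EmaSmoother(alpha=0.15)
# For each incoming rotation sample: smoothed = smoother.update(sample)
```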
