Joanne’s Status Report for 3.26.2022

This week I continued refining the Unity–web application portion of the project. Last week, I had gotten as far as testing fake data on just the Rotate function via the MQTT protocol. At the beginning of the week, I talked with my group members, and we decided that, for now, the rotation gestures will be limited to rotation about the x-axis (Up). I modified the code to account for this change, and the model rotated well about the x-axis in tests with my dummy data.
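For context on what those dummy-data tests look like: a small script publishes fake swipe messages to the broker, and the web application reacts to them. The sketch below shows the general shape in JavaScript with the mqtt.js client; the broker URL, topic name, and payload fields are placeholders I chose for illustration, not our exact format.

```javascript
// A test publisher for dummy swipe data (Node.js + the mqtt package).
// Broker URL, topic, and payload fields are illustrative placeholders.
const mqtt = require("mqtt");

const client = mqtt.connect("mqtt://test.mosquitto.org");

client.on("connect", () => {
  // Publish a fake swipe every 100 ms with a small rotation delta
  // about the x-axis.
  setInterval(() => {
    const message = { gesture: "swipe", deltaX: 2.0 };
    client.publish("capstone/gestures", JSON.stringify(message));
  }, 100);
});
```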

I wanted to see how the Rotate function would react to real-life data, so we decided to integrate the live sensor readings, processed by the basic gesture recognition algorithm, into my Unity web application. Right now, I have a Python script, run from my local terminal, that publishes to the public MQTT broker. A JavaScript script on the Django web application then reads the published data values and sends them over to Unity, where they are parsed and used to rotate the model. I created another parsing function within the JavaScript script so that I can call the appropriate function (i.e., Rotate or Zoom In/Out) and also send the data in the correct format for the Unity parsing function I wrote.
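As a rough sketch (not our exact code), the browser-side parsing/dispatch function looks something like the following. The topic, payload fields, GameObject name, and Unity method names are assumptions for illustration; the one fixed point is that a Unity WebGL build exposes SendMessage(objectName, methodName, value) on the Unity instance, which is how the JavaScript side hands a single string or number over to C#.

```javascript
// Browser-side sketch: subscribe to gesture messages and route them to
// the right Unity function. Assumes mqtt.js is loaded on the page and
// `unityInstance` is the object returned by Unity's WebGL loader.
// Topic, payload shape, GameObject name, and method names are
// illustrative assumptions.
const client = mqtt.connect("wss://test.mosquitto.org:8081");

client.on("connect", () => {
  client.subscribe("capstone/gestures");
});

client.on("message", (topic, payload) => {
  const data = JSON.parse(payload.toString());

  if (data.gesture === "swipe") {
    // SendMessage takes a single string or number, so the rotation
    // delta is packed into a string for the C# side to parse.
    unityInstance.SendMessage("Model", "Rotate", String(data.deltaX));
  } else if (data.gesture === "pinch") {
    unityInstance.SendMessage("Model", "ZoomBy", String(data.scale));
  }
  // Gestures the classifier has not labeled yet are ignored for now.
});
```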

After testing with live data, the model rotated well about the x-axis. Since the gesture algorithm is not complete yet, we tested by ignoring all gesture data that was not classified as a swipe. You can see a demo of the working prototype at the Google Drive link below; it will also be in our main team status report. There is another problem we are in the midst of figuring out: during the transition between two swipe gestures, the model dips slightly away from the correct direction, then moves in the right direction immediately afterwards. We are currently testing whether it is a problem with the data being sent over or with how I am calculating the rotation values. We will also work on making the rotation smoother this week.

https://drive.google.com/file/d/1h3q5Os-ycagcWNNDkVVVS57qLM31H1Ls/view?usp=sharing
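One fix we may try for the dipping is to low-pass filter the rotation deltas on the JavaScript side before they reach Unity, so a single stray value at a gesture boundary gets damped instead of visibly yanking the model. A minimal sketch of the idea, assuming the same `unityInstance` handle as above; we have not implemented this yet, and the smoothing factor is an arbitrary illustrative value.

```javascript
// Sketch: exponentially smooth rotation deltas before forwarding them,
// so a stray sample at a swipe boundary is damped rather than producing
// a visible dip. The smoothing factor alpha is an arbitrary choice.
let smoothedDelta = 0;
const alpha = 0.3; // in (0, 1]; lower = smoother but laggier

function forwardRotation(rawDelta) {
  smoothedDelta = alpha * rawDelta + (1 - alpha) * smoothedDelta;
  unityInstance.SendMessage("Model", "Rotate", String(smoothedDelta));
}
```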

Lastly, I finished a basic zoom function call, so the model can now zoom in and out on calls to the Unity function from the Django web application. I am still figuring out how to zoom based on how fast the user is pinching in and out; right now, the model zooms in and out by a constant factor every time the function is called. Refining these two functions is something I will be working on this week. Since we are still trying to figure out zoom-out gesture detection, we have not been able to test the zoom functionality with live data, but that is also a goal for this week.
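For the speed-based zoom, one direction is to scale the zoom step by the pinch speed reported with the gesture, clamped to a safe range, instead of using a constant factor. A sketch of the idea, with made-up field names and limits:

```javascript
// Sketch: derive the zoom step from pinch speed instead of a constant.
// The `pinchSpeed` input, the base factor, and the clamp limits are
// all assumptions for illustration.
function zoomStepFromPinch(pinchSpeed) {
  const base = 1.05; // stand-in for the constant factor used today
  const scaled = 1 + (base - 1) * pinchSpeed;
  // Clamp so one noisy speed reading cannot zoom too far per call.
  return Math.min(Math.max(scaled, 1.01), 1.25);
}

function forwardZoom(pinchSpeed, zoomingIn) {
  const step = zoomStepFromPinch(pinchSpeed);
  // A factor below 1 zooms out, so invert the step for pinch-out.
  unityInstance.SendMessage("Model", "ZoomBy", String(zoomingIn ? step : 1 / step));
}
```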

 
