Isabel’s Status Report for 4/16

This week, I couldn’t be there in person for the demo due to COVID, but I was still able to work on the project remotely thanks to my Arduino at home. I could also join Zoom during lab time, so I was able to help my teammates with running the full pipeline. I’ve mostly been debugging with our team and helping come up with new ideas to improve our pipeline. I was considering setting up some sort of sound system to guide the user through the locking and pairing procedure. Also, I was able to solder and build the lidar circuit when I got out of quarantine on Friday, and got it running with some test Arduino code from this example sketch: https://github.com/TFmini/TFmini-Arduino/blob/master/TFmini_Arduino_SoftwareSerial/TFmini_Arduino_SoftwareSerial.ino which fixed a checksum issue I was having with the basic reading code that shipped with the TFmini library.
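
Since most of our host-side code is Python, here is a minimal sketch of that same checksum check from the laptop side, assuming the TFmini is wired through a USB-serial adapter; the port name is a placeholder, and the real fix lives in the Arduino sketch linked above.

    # Parse one TFmini frame and verify its checksum before trusting it.
    # A frame is 9 bytes: 0x59 0x59, dist low/high, strength low/high,
    # mode, reserved, checksum = low 8 bits of the sum of the first 8 bytes.
    import serial  # pyserial

    PORT = "/dev/ttyUSB0"  # assumption: the adapter's port name

    def read_distance(ser):
        """Return one checksum-valid distance in cm, or None."""
        for _ in range(64):  # bounded resync attempts
            if ser.read(1) != b'\x59' or ser.read(1) != b'\x59':
                continue  # hunt for the two-byte header
            rest = ser.read(7)
            if len(rest) < 7:
                return None
            frame = b'\x59\x59' + rest
            if (sum(frame[:8]) & 0xFF) != frame[8]:
                return None  # corrupted frame: discard instead of using it
            return frame[2] | (frame[3] << 8)  # distance in cm
        return None

    with serial.Serial(PORT, 115200, timeout=1) as ser:
        print(read_distance(ser))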

With this lidar setup, I’m almost back on schedule. I hope to finish integrating the code by Monday, so that I’ll be completely caught up and we’ll be thoroughly in the refinement stage for this last week.

Over the next week, I will complete any debugging or improvement patches to our calibration code, finish integrating the lidar, and help out with the user study we will be conducting in the second half of the week. I will also help with the completion of the final presentation slides.

Olivia’s Status Report for 4/16

The past week, I have been working on testing the overall system. This includes running the calibration system and then running the program that follows the head movement. Testing the overall system helped us find some bugs and gave us intuition as to how to fix them. In particular, we found the “user offset” and “user-to-wall distance” values to be incorrect at times. We are currently working on resolving this issue.

When we were testing the overall system, we found that the motor moved in an annoying fashion when we moved our heads only slightly. This makes sense, since we have not yet added the ‘threshold’ to prevent movement on small head motions. I am currently working on fixing the CV program to handle these small movements by averaging the past 5 head pose estimations. I also got a new camera to add to the system that should work in dim lighting. My goal over the next week is to continue to test the overall system and tweak it to ensure it offers a clean user experience. We also plan to edit and run the user survey next weekend, when the new motor gets delivered.
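
A rough sketch of that smoothing plus the dead-band idea; the window size and threshold here are placeholders I’d tune during testing, not our final values.

    # Average the last 5 head-pose estimates and only act on the result
    # when it moves past a small dead-band, so jitter never reaches the motor.
    from collections import deque

    WINDOW = 5
    DEADBAND_DEG = 3.0  # assumption: tune via the user study

    recent_yaw = deque(maxlen=WINDOW)
    last_sent_yaw = 0.0

    def smoothed_yaw(raw_yaw_deg):
        """Return a new yaw to act on, or None if the change is too small."""
        global last_sent_yaw
        recent_yaw.append(raw_yaw_deg)
        avg = sum(recent_yaw) / len(recent_yaw)
        if abs(avg - last_sent_yaw) < DEADBAND_DEG:
            return None  # small head movement: leave the motor alone
        last_sent_yaw = avg
        return avg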

Rama’s Status Report for 4/10

This week I have been waiting on parts orders to come in so I can start putting things together. In the meantime, I have helped figure out what we should test in the user study, cleaned up my code, and added documentation. I think that I am on schedule, but if the parts take any longer I will fall behind, because we won’t be able to properly get feedback on the user experience. This upcoming week I hope to start building so that we have a full setup for the demo on Wednesday. I also want to help with the refinement of values for calibration and motion as we continue to improve and speed up the full pipeline.

Team Status Report for 4/10

This past week we continued to test the full end-to-end pipeline as the calibration code was completed. The system responded to hard-coded information about where the user is in the room to set where the projector should start. We also tested a possible lock/unlock feature for calibration using blinks: if the user blinks 3 times, the system locks and the motor will not respond to head movement. The screws/mounts for attaching the projector have been ordered, but in the meantime we will use heavy-duty foam tape to test out motion with a picture.

The next step is to run a user study on at least 10 participants. What we are looking to learn is which easing type is most suitable for the user (linear, quadratic, or cubic motion). We would also like to see which speed is most comfortable (30, 60, or 90 degrees per second). Since the projection only needs to move for significant motion, we will need to figure out the minimum displacement of the projection that should trigger a movement in the motor. Lastly, we would like to see which lock/unlock feature is best and whether we need to add other lock/unlock features to improve the experience.

The main risk is that the motion of the projection ends up too shaky or jerky, in which case we will need to slow down the movement and use a different method of securing the projector. We will continue to refine threshold values and gather feedback as needed.
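
For reference on the study settings, a minimal sketch of the three easing candidates and the speed options; the helper names are illustrative, not our actual motor code.

    # The three easing curves under study; t is normalized progress in [0, 1].
    def ease_linear(t):
        return t

    def ease_quadratic(t):
        return t * t

    def ease_cubic(t):
        return t * t * t

    def angle_at(t, start_deg, end_deg, ease):
        """Motor angle partway through a move, along the chosen curve."""
        return start_deg + (end_deg - start_deg) * ease(t)

    def sweep_duration_s(start_deg, end_deg, speed_deg_per_s):
        """How long a move takes at 30, 60, or 90 degrees per second."""
        return abs(end_deg - start_deg) / speed_deg_per_s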

Olivia’s Status Report for 4/10

Earlier this week, the Wyze v3 camera was finally delivered. I spent time trying to connect the camera to my laptop, which required flashing the Wyze camera firmware. However, once this was done, the camera’s settings could not be changed to the specific night-vision settings that are normally accessible in the Wyze app. Since we need a camera that works in dim lighting, this camera will not be sufficient. I put in a new order request for the Arducam 1080P Day and Night Vision camera, which is supposed to connect easily to an Arduino or laptop and should be delivered this week.

I have been editing my head pose estimation program to make it more robust to natural human movement. For example, I have been handling the case where the person’s head goes out of frame and their head pose can no longer be detected. I still need to handle the case where a person makes an unintentional movement (such as sneezing); in this case, we do not want the system to follow the movement of the person’s head. I have also spent time creating our user experience survey, which we hope to conduct this Friday. We will be testing specific settings such as motor speed, lock gesture commands, movement thresholds, and overall user experience. I may add a few more items to the user survey as I continue to edit my program to make it more accommodating to natural body movement.
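
One way I’m considering handling the unintentional-movement case, sketched below with placeholder thresholds: only follow a large pose change once it has persisted for several frames, so a brief spike like a sneeze never reaches the motor.

    # Follow a big pose change only after it persists for several frames,
    # so short spikes (a sneeze, a cough) never move the motor.
    HOLD_FRAMES = 10   # assumption: about a third of a second at 30 fps
    JUMP_DEG = 15.0    # assumption: a yaw change this sudden is suspect

    stable_yaw = 0.0
    _candidate = None
    _count = 0

    def filtered_yaw(raw_yaw):
        """Return the yaw the motor should track, suppressing short spikes."""
        global stable_yaw, _candidate, _count
        if abs(raw_yaw - stable_yaw) < JUMP_DEG:
            stable_yaw = raw_yaw             # small change: follow immediately
            _candidate, _count = None, 0
        elif _candidate is not None and abs(raw_yaw - _candidate) < JUMP_DEG:
            _count += 1                      # the large change is persisting
            if _count >= HOLD_FRAMES:
                stable_yaw = raw_yaw         # a real turn, not a sneeze
                _candidate, _count = None, 0
        else:
            _candidate, _count = raw_yaw, 1  # start watching this new pose
        return stable_yaw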

Isabel’s Status Report for 4/10

Earlier this week, before Wednesday, I completed the full new version of the calibration state machine and tested it to make sure the stages were switching correctly. I also made a Python testing suite for all the mathematical functions I was using within the Arduino code; in Python it’s much easier and faster to print out debugging material, and debugging trig and geometric-conversion code can be a headache at the best of times, so I needed as much feedback as possible.
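
A check in that suite looks roughly like the sketch below; the wall_offset helper is a stand-in for one of the real conversion functions, not its actual name or formula.

    # Unit-test a geometric conversion in Python, where printing and
    # asserting are cheap, before trusting the same math on the Arduino.
    import math
    import unittest

    def wall_offset(distance_to_wall_cm, yaw_deg):
        """Horizontal offset on the wall for a given head yaw (stand-in)."""
        return distance_to_wall_cm * math.tan(math.radians(yaw_deg))

    class TestConversions(unittest.TestCase):
        def test_zero_yaw_means_zero_offset(self):
            self.assertAlmostEqual(wall_offset(200, 0), 0.0)

        def test_45_degrees_equals_distance(self):
            self.assertAlmostEqual(wall_offset(200, 45), 200.0, places=6)

    if __name__ == "__main__":
        unittest.main()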

On Wednesday during our practice demo, I noticed the locking mechanism happens a bit too fast for a user to properly interact with it, so I designed and implemented a mechanism within the calibration phases to provide some delay between the user locking the projector and the program actually pairing the user/projector angles with each other. I made a diagram of this process: the “L” still stands for lock and stops the motors, the “S” means the Python program is ‘sleeping’, and the “P” signals the Arduino program to pair the angles. The reason I did this in stages is to leave room for a possible Python program addition that counts down the seconds until pairing, so the user knows they should look at the projector.
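
From the Python side, the handoff could look like this sketch; the single-character codes match the diagram, while the serial details and delay length are assumptions.

    # Lock, sleep with a countdown, then tell the arduino to pair angles.
    import time
    import serial  # pyserial

    def lock_and_pair(ser, delay_s=3):
        ser.write(b'L')  # lock: stop the motors
        for remaining in range(delay_s, 0, -1):
            print(f"Pairing in {remaining}...")  # countdown for the user
            time.sleep(1)  # the 'S' stage: python is sleeping
        ser.write(b'P')  # pair the user/projector angles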

Next week, I need to explain to my teammates how to set up the physical system and work the calibration program. I can’t leave my home until Friday because I’ve got COVID, and I’ve been pretty sick but am slowly recovering. The good thing is that the bulk of this is built and debugged; we just need to get the physical setup going now. Our goal is to get this ready by the Wednesday demo, and I will support this remotely in whatever way I can.

I’m still on schedule for everything except the lidar. Since I had to isolate this weekend, I couldn’t go into the lab and make the lidar circuit. I’ll have to do this next weekend.

Isabel’s Status Report for 4/2

In addition to redrawing the state machine for the calibration (shown Wednesday), I’ve since been working on refining the linear motor movement program. Rama has been giving me her interface notes and the functions she wrote for motor movement, so I’ve been incorporating those into the Arduino state machine to move the motor vertically as well as horizontally. I’ve also added a fix for the motor not stopping between head turns: a ‘center’ signal between ‘left’ and ‘right’ that sends a stop. Finally, I’ve integrated Olivia’s code for the locking signal and tested that it makes it over to the Arduino and registers correctly (‘locked’ means locked, and ‘U’ means unlocked).
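
The left/center/right classification has roughly this shape on the Python side; the signal bytes and the width of the center band are placeholders, not our exact protocol.

    # Classify yaw into three signals so the arduino gets an explicit
    # stop ('center') between 'left' and 'right'.
    CENTER_BAND_DEG = 5.0  # assumption: half-width of the stop zone

    def direction_signal(yaw_deg):
        if yaw_deg < -CENTER_BAND_DEG:
            return b'l'  # turn motor left
        if yaw_deg > CENTER_BAND_DEG:
            return b'r'  # turn motor right
        return b'c'      # center: the arduino treats this as stop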

Something I’ve been considering with Olivia is how to improve the accuracy of our locking gesture. Today I had a friend with thick glasses test the program, and it had trouble detecting the three blinks. We might consider a hand gesture or head nod as an alternative to this.

Over the next week I will continue testing each stage of the calibration process and continue making refinement notes (which I am tracking in a document), so we can consider them when we do end-to-end user testing after this next week. Some examples of these notes are:

- Send floats over the serial port rather than ints (sketched below, after these notes)

- Consider the thresholding for the first head-movement phase

- Decrease the threshold for blink detection

And so on. My goal is to finish the testing by the end of the week, and hopefully clean up the code a little so it’ll be more readable to my teammates.
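
To illustrate the first note, a minimal sketch of sending floats as fixed-precision strings; the precision and newline terminator are assumptions, and on the Arduino side Serial.parseFloat() can read the value back.

    # Send a float angle over the serial port instead of a truncated int.
    import serial  # pyserial

    def send_angle(ser, angle_deg):
        ser.write(f"{angle_deg:.2f}\n".encode())  # e.g. b"12.50\n"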

Currently I’m on schedule for the calibration, although the testing and refinement phases have been merging into each other more than expected. The lidar just arrived, and my plan as of now is to integrate it next weekend, after I can walk my teammates through the current calibration code. This is mainly so any roadblocks with the lidar don’t jeopardize the demo, since we’ll do all the preparation for that this week.

Olivia’s Status Report for 4/2

This week I implemented the “lock” gesture, which occurs when someone blinks 3 times in a span of 2 seconds. This lock gesture works in conjunction with the calibration process: a user “locks” the projection in place to confirm its location is correctly attuned to the person’s gaze. I implemented this gesture using four Mediapipe landmarks around each of the left and right eyes. For each frame the program processes, I calculate the “eye aspect ratio” and determine if it is below a set threshold; if so, a blink is detected. I also spent time cleaning up my code and making it work smoothly with the overall system. Sadly, I got the flu this past week, which made it very difficult for me to get any other work done or go to class. I personally feel behind schedule because I have not met with my team in person for a bit. I hope that changes this next week, when I am finally able to pick up the camera, connect it to the overall system, and see the progress we have recently made with my own eyes.
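
In rough sketch form, the per-frame blink check looks like this; the threshold and the landmark points are placeholders rather than our tuned values and exact Mediapipe indices.

    # Detect 3 blinks within 2 seconds using the eye aspect ratio (EAR):
    # the eye's vertical opening divided by its horizontal width.
    import math
    import time

    EAR_THRESHOLD = 0.2  # assumption: below this, the eye reads as closed
    BLINKS_NEEDED = 3
    WINDOW_S = 2.0

    blink_times = []
    _was_closed = False

    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    def eye_aspect_ratio(top, bottom, left, right):
        """EAR for one eye from its four landmark (x, y) points."""
        return dist(top, bottom) / dist(left, right)

    def lock_triggered(left_ear, right_ear):
        """Call once per frame; True once 3 blinks land within 2 s."""
        global _was_closed
        closed = (left_ear + right_ear) / 2 < EAR_THRESHOLD
        if closed and not _was_closed:  # open -> closed edge = one blink
            blink_times.append(time.time())
        _was_closed = closed
        now = time.time()
        while blink_times and now - blink_times[0] > WINDOW_S:
            blink_times.pop(0)  # forget blinks older than the window
        return len(blink_times) >= BLINKS_NEEDED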

Team Status Report for 4/2

The past week, we have successfully integrated the entire pipeline. This means the system can convert the movement of a user’s head into the movement of a motor. The next step is to add the projector onto the motor and conduct a user study to determine how we can tune our system to provide the best user experience. This will involve gathering 10 people and asking them a range of questions after they test our system. This will be our biggest task for the rest of the month; it is essential that our system earns positive user satisfaction by the end of this project. We will be taking our peers’ advice very seriously and altering our implementation as needed. We have also created a Trello board to organize all of our tasks for the rest of the semester, since from now on we’ll be working closely with the full system implementation and need to maintain good communication within the team.
The main risk is that we may not be able to meet all of the usability requests from our participants. If this happens, we will triage the ones we expect to make the largest impact.
Additional changes were made to the calibration program, with the state machine being redrawn and much of the code rewritten to cope with the data flow. We’re now building the system and refining it bit by bit with preliminary user tests, keeping refinement notes as we adapt to these changes.

Rama’s Status Report for 4/2

This week I continued to fine-tune the motor functions and finalize orders for parts. We were able to do some end-to-end testing where a person moving their head to the left or the right during calibration would make the motor move all the way to the left or the right. The motor moves all the way in either direction because the boolean variable it needs was not finished yet, but we were happy to see the full flow of the system. I did feel behind schedule at the beginning of the week, but after seeing the full end-to-end run, I think the base of the functions is done for my part; it is only a matter of adjusting the values to best match head movement. What’s to come is a “user study” on at least 10 people for angle, speed, and easing types. I also need to secure the pan-and-tilt system to a projector stand and the projector to the mount, so I will make some more orders for parts as well.