Team Status Update for 04/25 (Week 11)

Progress

This week, we worked on the final demo, creating videos to show clearly what we have accomplished so far. We are almost done with the engineering portion of the project and are collecting performance metrics. We also created the final presentation and planned out what we will use for the final demo video.

Deliverables next week

Next week, we will work on the final video and report.

Schedule

On schedule.

Jerry’s Status Update for 04/25 (Week 11)

Progress

This week, I worked on the final version of the point model. To get the best performance, I added noise to the pointing dataset, used fewer keypoints to focus on the arms, and added the user's location as a feature for point classification.
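The feature construction described above might look like the following sketch; the keypoint indices and the 17-point pose layout are assumptions for illustration, not the project's actual data format:

```python
# Sketch: keep only arm-related keypoints and append the person's (x, y)
# location as extra features. The indices below assume a generic 17-point
# pose layout (shoulders, elbows, wrists); the real indices may differ.

ARM_IDX = [5, 6, 7, 8, 9, 10]  # assumed: shoulders, elbows, wrists

def build_features(keypoints, location):
    """Flatten the selected keypoints and append location -> 1D feature list."""
    feats = []
    for i in ARM_IDX:
        feats.extend(keypoints[i])   # (x, y) per keypoint
    feats.extend(location)           # (x, y) of the person in the room
    return feats

kps = [(i * 0.1, i * 0.1) for i in range(17)]  # toy 17-keypoint pose
f = build_features(kps, (0.3, 0.7))
print(len(f))  # 14: 6 keypoints * 2 coordinates + 2 location values
```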

The best performing model on the newest dataset had a validation accuracy of 96%.

I also built a point trigger to let users choose when to activate the point in the final application. With the trigger, the point model operates smoothly alongside the gesture recognition system at ~27 FPS; running the point model alone on every frame runs at 28 FPS.
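A minimal sketch of how such trigger gating might work, where a cheap per-frame check decides whether to invoke the expensive point model (all function names here are illustrative placeholders, not the project's actual API):

```python
# Sketch of a trigger-gated inference loop: the cheap trigger check runs on
# every frame, while the expensive point model runs only when it fires.

def make_gated_pipeline(detect_trigger, run_point_model):
    """Return a per-frame handler that only invokes the heavy model
    when the trigger fires."""
    def handle_frame(frame):
        if detect_trigger(frame):          # cheap check, every frame
            return run_point_model(frame)  # expensive model, on demand
        return None                        # skip heavy inference
    return handle_frame

# Toy usage: trigger on frames flagged as "raised hand".
handler = make_gated_pipeline(
    detect_trigger=lambda f: f.get("raised_hand", False),
    run_point_model=lambda f: ("bin", f["x"], f["y"]),
)

print(handler({"raised_hand": False}))                 # None: model skipped
print(handler({"raised_hand": True, "x": 2, "y": 1}))  # ('bin', 2, 1)
```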

We also planned what we needed for our demo video.

Deliverables next week

Next week, we will give our final presentation and put together our video.

Schedule

On schedule.

Sean’s Status Update for 04/25 (Week 11)

Progress

This week, I focused mainly on producing presentation/demo material. I recorded a couple of videos of the robot performing path-finding and driving to the desired location. The project is more or less complete, and I am cleaning up the code I wrote so it is more readable to anyone seeing it for the first time.

Deliverables next week

Next week, I plan to record more video for the final presentation. In addition, I will work on the final report.

Schedule

On schedule.

Team Status Update for 04/18 (Week 10)

Progress

Path-finding

The algorithm is complete and ready for testing. It uses A* as mentioned before, and the robot is able to arrive at the grid cell containing the goal point. Further testing is needed to check its robustness.

Pointing

Pointing within the room is almost done. It works well on one side of the room using a multitask regression model, but it needs additional data to cover more of the room, along with further hyperparameter tuning.

2D to 3D Mapping

Mapping now runs at 17 fps and data is formatted properly. The map needs to be integrated with point data.

Schedule

On schedule. Next week, we will put together a video for our demo.

Rama’s Status Update for 04/18 (Week 10)

Progress

I got the speed up to 17 fps by not drawing the map; tkinter was a huge bottleneck. The data is formatted, and I have started planning the integration to display point data on the map.
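The fix above removed map drawing entirely; a related middle ground is to decouple drawing from data processing so a slow rendering backend no longer caps the processing rate. A toy sketch of that throttling idea (counts and class names are illustrative):

```python
# Sketch: ingest every frame of data, but redraw at most every
# `redraw_every` frames so slow drawing (tkinter in this project)
# does not bottleneck the processing loop.

class ThrottledRenderer:
    def __init__(self, redraw_every=10):
        self.redraw_every = redraw_every
        self.frames = 0
        self.draw_calls = 0

    def update(self, state):
        self.frames += 1                          # always ingest new data
        if self.frames % self.redraw_every == 0:
            self.draw(state)                      # draw only occasionally

    def draw(self, state):
        self.draw_calls += 1                      # stand-in for real drawing

r = ThrottledRenderer(redraw_every=10)
for i in range(100):
    r.update({"frame": i})
print(r.draw_calls)  # 10: one redraw per 10 frames
```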

Deliverables next week

I will finish integrating the point information into the map before working on the overall system integration.

Schedule

On schedule.

Jerry’s Status Update for 04/18 (Week 10)

Progress

This week, I kept working on getting the point to work in the larger room with a 3×5 grid of bins. With the new dataset, the multitask model achieved 0.94 validation x accuracy and 0.95 validation y accuracy, but the results were not smooth when running it on a test video and panning the point around the room.

So, I tried using a regression output rather than a categorical output, so that the model could exploit how close bins are to each other instead of treating them as independent classes. This gave a much smoother result, but it still has issues in the panning video.
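One way to see why regression relates neighboring bins: the model predicts continuous room coordinates, which are then snapped to the nearest bin, so the loss penalizes a prediction in proportion to how far off it is. A sketch of the snapping step, using the 3×5 grid mentioned above (everything else is illustrative):

```python
# Sketch: snap a continuous (x, y) prediction in [0, 1) room coordinates
# to a bin index on a 3x5 grid. With a regression head, nearby predictions
# land in the same or adjacent bins, unlike independent categorical classes.

def snap_to_bin(x, y, n_cols=5, n_rows=3):
    """Clamp continuous (x, y) room coordinates to a (row, col) bin index."""
    col = min(int(x * n_cols), n_cols - 1)
    row = min(int(y * n_rows), n_rows - 1)
    return row, col

# A regression output of (0.62, 0.40) lands in a specific bin...
print(snap_to_bin(0.62, 0.40))  # (1, 3)
# ...and a slightly perturbed prediction lands in an adjacent bin,
# which is what makes the panned test video look smoother.
print(snap_to_bin(0.58, 0.45))  # (1, 2)
```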

To address the smoothness issues in the test videos, I added data of myself moving my arm around while pointing at a bin. This adds noise to the data and teaches the model that even though my arm is moving, I am still pointing at the same bin. This worked well for one side of the room, and I will continue applying this method to cover the rest of the room.
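A related augmentation can also be done synthetically by jittering recorded keypoints while keeping the bin label fixed. A minimal sketch, where the noise scale (0.02 in normalized coordinates) is an assumed value rather than the project's:

```python
import random

# Sketch: add small random offsets to each keypoint while keeping the bin
# label unchanged, so the model learns that arm motion within a point
# still maps to the same bin.

def jitter_keypoints(keypoints, scale=0.02, rng=None):
    """Return a noisy copy of [(x, y), ...] keypoints."""
    rng = rng or random.Random(0)
    return [(x + rng.uniform(-scale, scale),
             y + rng.uniform(-scale, scale)) for x, y in keypoints]

original = [(0.5, 0.5), (0.6, 0.4)]
augmented = jitter_keypoints(original)
# Each augmented point stays within `scale` of the original.
assert all(abs(ax - ox) <= 0.02 and abs(ay - oy) <= 0.02
           for (ax, ay), (ox, oy) in zip(augmented, original))
```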

I am currently evaluating the test videos manually; if I have time, I will add labels to them so I can evaluate the results quantitatively.

Deliverables next week

Next week, I will get the point working across the entire room.

Schedule

On schedule.

Sean’s Status Update for 04/18 (Week 10)

Progress

Path-finding

The path-finding algorithm, along with the driving algorithm, is essentially complete. The robot is able to drive to the center of the cell containing the goal (x, y) position. More testing is needed to catch edge cases, but the robot can now drive to the user or the pointed location given an (x, y) coordinate.
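The goal-to-cell-center step can be sketched as below; the cell size is an assumed parameter, not the project's actual grid resolution:

```python
# Sketch: convert a goal (x, y) in world coordinates into the center of
# its occupancy-grid cell, which is the point the robot actually drives to.

def cell_center(x, y, cell_size=0.5):
    """Return the world-coordinate center of the grid cell containing (x, y)."""
    col = int(x // cell_size)
    row = int(y // cell_size)
    return ((col + 0.5) * cell_size, (row + 0.5) * cell_size)

print(cell_center(1.7, 0.3))  # (1.75, 0.25)
```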

Deliverables next week

Next week, I plan to finish integrating the object/gesture detection work done by Jerry and Rama and test the combined functionality. Hopefully, we will also have time to work on the video presentation and the report.

Schedule

On schedule.

Team Status Update for 04/11 (Week 9)

Progress

Point recognition

The multitask model works best in the small 3×3 point environment. Next week, we will collect a dataset for the larger 5×5 point environment.

Path finding

The path-finding algorithm is in development. It will be a variation of the A* algorithm, using an 8-connectivity grid representation of the room. With a robust path-finding implementation, driving to the user or a pointed location will be fairly straightforward. The implementation will be complete by next week.
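A minimal A* sketch on an 8-connected grid, matching the approach described above: diagonal steps cost √2 and the heuristic is octile distance. The grid layout and cell costs below are illustrative (1 marks an obstacle), not the project's actual map:

```python
import heapq
import math

# Minimal A* on an 8-connected grid. Straight moves cost 1, diagonal
# moves cost sqrt(2), and the heuristic is octile distance, which is
# admissible and consistent for this movement model.

def astar(grid, start, goal):
    rows, cols = len(grid), len(grid[0])
    moves = [(dr, dc) for dr in (-1, 0, 1) for dc in (-1, 0, 1)
             if (dr, dc) != (0, 0)]

    def h(cell):
        dr, dc = abs(cell[0] - goal[0]), abs(cell[1] - goal[1])
        return max(dr, dc) + (math.sqrt(2) - 1) * min(dr, dc)

    open_set = [(h(start), 0.0, start, [start])]
    best = {start: 0.0}
    while open_set:
        _, g, cell, path = heapq.heappop(open_set)
        if cell == goal:
            return path
        for dr, dc in moves:
            nr, nc = cell[0] + dr, cell[1] + dc
            if 0 <= nr < rows and 0 <= nc < cols and not grid[nr][nc]:
                ng = g + math.hypot(dr, dc)
                if ng < best.get((nr, nc), float("inf")):
                    best[(nr, nc)] = ng
                    heapq.heappush(open_set, (ng + h((nr, nc)), ng,
                                              (nr, nc), path + [(nr, nc)]))
    return None  # no path exists

# Toy map: the middle row is blocked except the rightmost column.
grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
print(astar(grid, (0, 0), (2, 0)))  # path routed around the obstacles
```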

2D to 3D Mapping

The mapping has been optimized as far as possible, and the inaccuracies were cleaned up.

Deliverables next week

Next week, we will continue working on our individual systems in preparation for integration.

Schedule

On schedule.

Rama’s Status Update for 04/11 (Week 9)

Progress

I finished the map visualization. Rendering map updates in real time is very slow, around 1.5 fps, even after all the optimizations I could apply. The bottleneck is tkinter drawing the dots for the user and robot each frame, and I have exhausted the ways I know to speed that up. The inaccuracies were also fixed; I mostly needed to tighten the acceptable color range for detecting the robot.

Deliverables next week

I will finalize the 2D to 3D mapping system and its outputs, and start work on the webserver integration in preparation for the final demo.

Schedule

On schedule.