Team Status Report for 03/18/23

What are the most significant risks that could jeopardize the success of the project? How are these risks being managed? What contingency plans are ready?

The most significant risk right now remains the same: the windowed time series analysis turning out to be inaccurate. As described last week, if this doesn’t work, we plan to incorporate more rigorous calibration into the system or possibly restructure the gestures so that they are more distinct from each other. Additionally, we could fall back to the neural network approach, although this is the less ideal case because it will likely carry a performance penalty and may not necessarily have better accuracy.

Were any changes made to the existing design of the system (requirements, block diagram, system spec, etc)? Why was this change necessary, what costs does the change incur, and how will these costs be mitigated going forward?

None since last week.

Provide an updated schedule if changes have occurred

Team work adjustments

No teamwork adjustments have been made. We are currently on track to accomplish our tasks, and we are all working on them.

David’s Status Report for 3/18/2023

What did you personally accomplish this week on the project? Give files or photos that demonstrate your progress. Prove to the reader that you put sufficient effort into the project over the course of the week (12+ hours).

This week, I spent my time writing the peripheral HID controller code and getting it to work on the Raspberry Pi. I also wrote a script in Python that tests every data type in the HID report. After writing the code, I tested this system on the Pi with the install script. I had to test two things: (1) whether the Pi connection through the USB-C port and the power splitter was working as intended, and (2) whether the Pi was recognized as a peripheral device.

When I installed the drivers and the kernel configuration with configfs using the install script, I found that the connection through the USB-C port was working as intended. Plugging the Pi into my laptop caused it to show up in Windows (in Parallels) as a USB HID device. However, it initially failed to start due to a setup error (error 10), as seen in Windows Device Manager. I was able to trace the problem to the install script. In order to easily debug the HID report descriptor, I had written it in a Python file that is relatively easy to read. Unfortunately, I did not notice that the setup script was calling this Python file without it being present in the same folder. To fix this, I hard-coded the report descriptor value produced by the Python script into the setup script. After I made this change, the Pi was successfully recognized as an HID gamepad.
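
For reference, here is a minimal sketch of the kind of test write the Python script performs, assuming a hypothetical 3-byte report with three signed 8-bit axes; our actual report descriptor defines more fields, so the layout below is illustrative only.

```python
# Minimal sketch of writing a test report to the HID gadget endpoint.
# Assumes a hypothetical 3-byte report (X, Y, Z as signed 8-bit values);
# the real report descriptor defines more data types than this.
import struct
import time

HID_DEVICE = "/dev/hidg0"  # created by the configfs gadget setup

def send_report(x, y, z):
    """Pack the axis values into a report and write it to the gadget endpoint."""
    with open(HID_DEVICE, "wb") as fd:
        fd.write(struct.pack("<bbb", x, y, z))

if __name__ == "__main__":
    # Exercise positive and negative values on each axis, as in the test script.
    for value in (127, -127, 0):
        send_report(value, 0, 0)
        time.sleep(0.5)
        send_report(0, value, 0)
        time.sleep(0.5)
        send_report(0, 0, value)
        time.sleep(0.5)
```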

Here is a demo of Windows receiving the test inputs from my test script. The test script exercises some positive and negative values for each data axis.

https://drive.google.com/file/d/1EWyLNByB8uvyoV85uuTF6MlHb0v3GAD7/view?usp=sharing

Next week, I will work on getting the macros working on Windows from a Python program. I should also integrate the setup module with the HID controls.

Is your progress on schedule or behind? If you are behind, what actions will be taken to catch up to the project schedule?

I am on schedule. See the Gantt chart below.

As you’ve now established a set of sub-systems necessary to implement your project, what new tools have your team determined will be necessary for you to learn to be able to accomplish these tasks?

Since the next part is figuring out how to control the cursor and inputs on Windows, I have to learn how to do that from Python. In addition, I have to learn how to use HIDAPI on Windows to parse the HID input.
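
As a starting point, I expect the Windows-side read loop to look something like the sketch below, using the hidapi Python bindings; the vendor/product IDs and report byte offsets are placeholders rather than the values we will actually assign.

```python
# Sketch of reading the gamepad reports on Windows with the hidapi bindings
# (pip install hidapi). The vendor/product IDs and report byte offsets are
# placeholders, not the values we will actually use.
import hid

VENDOR_ID = 0x1D6B   # placeholder
PRODUCT_ID = 0x0104  # placeholder

def read_reports():
    device = hid.device()
    device.open(VENDOR_ID, PRODUCT_ID)
    try:
        while True:
            report = device.read(64)  # raw report bytes from the Pi
            if report:
                # Which bytes hold which axes depends on the final descriptor.
                x, y = report[0], report[1]
                print(f"x={x} y={y}")
    finally:
        device.close()

if __name__ == "__main__":
    read_reports()
```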

Gram’s Status Report for 03/18/23

What did you personally accomplish this week on the project? Give files or photos that demonstrate your progress. Prove to the reader that you put sufficient effort into the project over the course of the week (12+ hours).

This past week, I wrote code for the controller to collect and aggregate the data published by the glove over MQTT. I also worked on the gesture classification model, which took more time than planned. I built a decision tree that analyzes the trends of the incoming data in each window to classify what gesture is being performed. Although I was not able to implement smoothing of the data and more seamless transitions between windows of sensor data, I was able to write tests for the various gestures to ensure that, as the models change, we can quickly verify the correctness of the underlying logic.
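
To give an idea of the structure, below is a heavily simplified sketch of the trend-based classification; the sensor names, thresholds, and gesture labels are placeholders, since the real tree uses more features and branches and its thresholds are still being tuned.

```python
# Heavily simplified sketch of the windowed, trend-based gesture classification.
# Sensor names, thresholds, and gesture labels are placeholders; the real
# decision tree has more features and more branches.

FLEX_THRESHOLD = 50    # placeholder threshold for a finger-flex trend
SWIPE_THRESHOLD = 0.8  # placeholder threshold for a lateral acceleration trend

def trend(samples):
    """Net change of a sensor across one window of samples."""
    return samples[-1] - samples[0]

def classify_window(window):
    """window: dict mapping sensor name -> list of readings in this window."""
    flex = trend(window["index_flex"])
    accel_x = trend(window["accel_x"])

    if flex > FLEX_THRESHOLD:
        return "grab"
    if abs(accel_x) > SWIPE_THRESHOLD:
        return "swipe_right" if accel_x > 0 else "swipe_left"
    return "idle"
```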

Is your progress on schedule or behind? If you are behind, what actions will be taken to catch up to the project schedule?

Progress is slightly behind schedule on my part because the decision tree took longer than planned, so I will try to finish it before the middle of next week so that I can complete all of next week’s tasks on time.

What deliverables do you hope to complete in the next week?

Next week, I plan to fully complete the code for the decision tree, in particular smoothing out recognition over adjacent windows of data. Moreover, I will establish bounds (min/max) for the gesture intensity output so that the intensity of the input data can be mapped onto them.
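
For the intensity mapping, the idea is a simple min/max normalization per gesture, roughly like the sketch below; the bounds themselves would come from calibration or recorded data.

```python
# Rough sketch of mapping a raw gesture measurement onto a normalized intensity.
# The per-gesture (lo, hi) bounds would come from calibration or recorded data.

def intensity(value, lo, hi):
    """Clamp value to [lo, hi] and map it linearly onto [0.0, 1.0]."""
    value = max(lo, min(hi, value))
    return (value - lo) / (hi - lo)

# Example: a wrist-tilt reading of 30 degrees with bounds (0, 45) maps to ~0.67.
print(intensity(30, 0, 45))
```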

I will also work on establishing a rapid data collection script so we can create a dataset that we can use to analyze future models without doing live testing.

Xuan’s Status Report for 03/18/23

What did you personally accomplish this week on the project? Give files or photos that demonstrate your progress. Prove to the reader that you put sufficient effort into the project over the course of the week (12+ hours).

I finished the MQTT portion of the microcontroller code. We decided to hold 10 data packets on the glove side before transmission in order to reduce network traffic, since the minimum time interval for reading the IMU data is around 10 ms. This might change with further testing, though. I also started on the desktop application that we’re using for configuration and calibration, which will be written using the Qt library.

Currently, it has the option to switch between left- and right-hand mode, but the calibration process isn’t written yet, as I think the compute module needs to be done before that. I’m also finishing up breadboarding the glove circuit.
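
Conceptually, the packet batching described above looks like the sketch below; read_imu() and publish() are placeholders standing in for the microcontroller’s actual IMU driver and MQTT client.

```python
# Conceptual sketch of batching IMU readings on the glove before publishing.
# read_imu() and publish() are placeholders for the microcontroller's actual
# IMU driver and MQTT client.
import json
import time

BATCH_SIZE = 10         # packets held on the glove before transmission
SAMPLE_PERIOD_S = 0.01  # ~10 ms, roughly the minimum IMU read interval

def read_imu():
    """Placeholder for one IMU/flex-sensor sample."""
    return {"accel_x": 0.0, "accel_y": 0.0, "accel_z": 0.0}

def publish(topic, payload):
    """Placeholder for the MQTT publish call on the microcontroller."""
    print(topic, payload)

buffer = []
while True:
    buffer.append(read_imu())
    if len(buffer) == BATCH_SIZE:
        publish("glove/data", json.dumps(buffer))  # one message per 10 samples
        buffer.clear()
    time.sleep(SAMPLE_PERIOD_S)
```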

Is your progress on schedule or behind? If you are behind, what actions will be taken to catch up to the project schedule?

I think we’re on track.

What deliverables do you hope to complete in the next week?

I hope to further test using the breadboard circuit and integrate it with the MQTT portion of the compute module to see if there is a need to further optimize the network latency.

As you’ve now established a set of sub-systems necessary to implement your project, what new tools have your team determined will be necessary for you to learn to be able to accomplish these tasks?

I’m not necessarily learning anything new, but I probably need to sharpen up my soldering skills. I probably also need to sharpen my Qt skills.

Xuan’s Status Report for 03/11/23

What did you personally accomplish this week on the project? Give files or photos that demonstrate your progress. Prove to the reader that you put sufficient effort into the project over the course of the week (12+ hours).

A large portion of the week was spent working on the design report. It was actually easier than I initially expected because of the preparation we did throughout the design iteration process; we had most of our ideas down when we did the slides.

I also started writing the system code for the microcontroller of the glove module. Specifically, I started writing code that reads sensor data from the various pins and began implementing an MQTT client that packs this data and transmits it over the network.

Is your progress on schedule or behind? If you are behind, what actions will be taken to catch up to the project schedule?

I think we’re on track as we adjusted our schedule last week.

What deliverables do you hope to complete in the next week?

I hope to finish the main code for the glove microcontroller and assemble a basic breadboard representation of the glove module. This is to ensure everything works before soldering any components onto the glove.

As you’ve now established a set of sub-systems necessary to implement your project, what new tools have your team determined will be necessary for you to learn to be able to accomplish these tasks?

I’m not necessarily learning anything new, but I probably need to sharpen up my soldering skills.

Team Status Report for 03/11/23

What are the most significant risks that could jeopardize the success of the project? How are these risks being managed? What contingency plans are ready?

The most significant risk right now is that the windowed time series analysis turns out to be inaccurate. If this doesn’t work, we plan to incorporate more rigorous calibration into the system or possibly restructure the gestures so that they are more distinct from each other. Additionally, we could fall back to the neural network approach, although this is the less ideal case because it will likely carry a performance penalty and may not necessarily have better accuracy.

Were any changes made to the existing design of the system (requirements, block diagram, system spec, etc)? Why was this change necessary, what costs does the change incur, and how will these costs be mitigated going forward?

None since last week.

Provide an updated schedule if changes have occurred

None

Team work adjustments

No teamwork adjustments have been made. We are currently on track to accomplish our tasks, and we are all working on them.

Gram’s Status Report for 03/11/23

What did you personally accomplish this week on the project? Give files or photos that demonstrate your progress. Prove to the reader that you put sufficient effort into the project over the course of the week (12+ hours).

This past week, I worked on writing the design report. Specifically, I worked on the Introduction, Use Case Requirements, Related Work, and Summary sections, as well as parts of the Design Trade Studies, System Implementation, and Project Management sections.

I also began writing the controller code for the compute module. I set up the Mosquitto (MQTT) broker on the Raspberry Pi so that it can listen for incoming messages from a connected glove, and I wrote a Python program that subscribes to the data packets that will be broadcast by the glove. Since the glove MQTT client is still in progress, I tested this functionality by writing a dummy MQTT client in Python to publish messages.
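
The subscriber side is roughly the sketch below, using paho-mqtt; the topic name is illustrative. The dummy publisher uses the same setup, calling client.publish in a loop instead of subscribing.

```python
# Rough sketch of the compute-module subscriber using paho-mqtt. The Mosquitto
# broker runs locally on the Pi; the topic name here is illustrative.
import paho.mqtt.client as mqtt

TOPIC = "glove/data"  # illustrative topic name

def on_connect(client, userdata, flags, rc):
    client.subscribe(TOPIC)

def on_message(client, userdata, msg):
    print(f"received {len(msg.payload)} bytes on {msg.topic}")

client = mqtt.Client()
client.on_connect = on_connect
client.on_message = on_message
client.connect("localhost", 1883)
client.loop_forever()
```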

Is your progress on schedule or behind? If you are behind, what actions will be taken to catch up to the project schedule?

Progress is still on schedule.

What deliverables do you hope to complete in the next week?

I plan to finish writing the controller code this week. I will need to write the logic for aggregating received data packets into windows that will be fed into the decision tree. Depending on how quickly the system can process incoming packets, I may also need to implement some sort of buffer queue to handle the case of data packets arriving faster than the system can process them.
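
If that buffering turns out to be needed, the likely shape is a small producer/consumer queue between the MQTT callback and the classification loop; a sketch under that assumption (the window size and the processing step are placeholders) is below.

```python
# Sketch of a buffer queue between the MQTT callback (producer) and the
# classification loop (consumer), for the case where packets arrive faster
# than they can be processed. Window size and processing are placeholders.
import queue
import threading

WINDOW_SIZE = 20  # placeholder window size
packet_queue = queue.Queue()

def on_message(client, userdata, msg):
    # Runs on the MQTT network thread; enqueue the packet and return quickly.
    packet_queue.put(msg.payload)

def process_window(window):
    """Placeholder for feeding a window of packets into the decision tree."""
    pass

def classification_loop():
    window = []
    while True:
        window.append(packet_queue.get())  # blocks until a packet is available
        if len(window) >= WINDOW_SIZE:
            process_window(window)
            window = []

threading.Thread(target=classification_loop, daemon=True).start()
```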

Furthermore, I will also finish implementing the decision tree. I will try to implement the logic for classifying what gesture a user is doing without necessarily correctly identifying the degree/intensity of the action yet. If the decision tree logic is finished quickly, I hope to implement the degree/intensity determination logic as well.

As you’ve now established a set of sub-systems necessary to implement your project, what new tools have your team determined will be necessary for you to learn to be able to accomplish these tasks?

I will need to learn more about windowed analysis algorithms over time series data because I need to identify the optimal window size to use. Moreover, I will likely need to learn more about data smoothing algorithms to prevent jittering. Furthermore, I will need to learn how to effectively remove the gravity vector from the accelerometer data. This will be challenging since the gravity vector may be split across the three axes, depending on the orientation of the hand.
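
One approach I am looking at for the gravity problem is to estimate the gravity component on each axis with a low-pass filter and subtract it out; a sketch of that idea is below, with the filter constant as a guess rather than a tuned value (a fuller solution would use the IMU’s orientation estimate instead).

```python
# Sketch of gravity removal via a low-pass filter: the slowly varying component
# of each accelerometer axis is treated as gravity and subtracted out.
# ALPHA is a guess, not a tuned value.
ALPHA = 0.9  # closer to 1.0 = smoother (slower-moving) gravity estimate

gravity = [0.0, 0.0, 0.0]

def remove_gravity(accel):
    """accel: (x, y, z) raw accelerometer reading -> linear acceleration."""
    global gravity
    gravity = [ALPHA * g + (1 - ALPHA) * a for g, a in zip(gravity, accel)]
    return [a - g for a, g in zip(accel, gravity)]
```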

David’s Status Report for 3/11/2023

What did you personally accomplish this week on the project? Give files or photos that demonstrate your progress. Prove to the reader that you put sufficient effort into the project over the course of the week (12+ hours).

In the past two weeks, I worked on solidifying the plan for the overall project integration and for the HID section specifically. I thought about alternative ways to communicate with the host PC while keeping setup easy.

For instance, I considered turning the controller into a mouse/keyboard hybrid device that would automatically control the mouse and keyboard inputs on the host PC, but the problem is that the cursor always has to be in the center of the screen. One way to solve this issue is to send the cursor location as well as the screen area to the device so it can perform the necessary calculations. However, I thought this method would be a bit convoluted and would make many assumptions.

The solution I ended up choosing is to send only the motion data to the computer, as if the device were a video game controller. A Python application running on the PC would then translate this data into the appropriate commands. This way, we can build in the application-specific controls as well as mouse cursor and display tracking.
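
As a rough illustration of that translation layer, the sketch below maps two controller axes onto relative cursor movement with pyautogui; the dead zone and scaling values are placeholders.

```python
# Rough sketch of the host-side translation layer: map two controller axes
# onto relative cursor movement. Dead zone and scaling values are placeholders.
import pyautogui

DEAD_ZONE = 5  # ignore small axis values to avoid cursor jitter
SCALE = 0.2    # pixels of cursor movement per unit of axis value

def handle_axes(x_axis, y_axis):
    """x_axis, y_axis: signed axis values parsed from one HID report."""
    dx = x_axis * SCALE if abs(x_axis) > DEAD_ZONE else 0
    dy = y_axis * SCALE if abs(y_axis) > DEAD_ZONE else 0
    if dx or dy:
        pyautogui.moveRel(dx, dy)
```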

I also did more research into the process to implement the HID controller. It’s quite poorly documented, but thanks to some engineers and their blogs, I was able to figure out how it works.

First, I would have to configure the RPi to act as a USB peripheral device through the HID device configuration with configfs. Then, I would write the software that sends the appropriate data to the HID file descriptor. In parallel, I would also have to test whether the RPi is correctly detected as a gamepad on a Windows PC and verify the data it sends to the PC. Once this is done, I can start on the host PC software to parse and translate the data into commands.

blog post about HID and how it should be set up

configfs documentation

reference implementation for RPi HID (GitHub)
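
Based on those references, the configfs setup boils down to creating a gadget directory tree and writing a handful of attribute files; the sketch below shows the rough shape in Python, with placeholder vendor/product IDs, a placeholder gadget name, and a placeholder report descriptor.

```python
# Rough sketch of the configfs gadget setup, shown in Python for readability.
# Vendor/product IDs, the gadget name, and REPORT_DESC are placeholders; the
# real setup uses our full HID report descriptor.
import os

G = "/sys/kernel/config/usb_gadget/gestureglove"  # hypothetical gadget name
REPORT_DESC = b"\x00"  # placeholder; replaced by the real descriptor bytes

def write(path, value):
    with open(os.path.join(G, path), "wb") as f:
        f.write(value if isinstance(value, bytes) else str(value).encode())

os.makedirs(os.path.join(G, "strings/0x409"), exist_ok=True)
os.makedirs(os.path.join(G, "configs/c.1"), exist_ok=True)
os.makedirs(os.path.join(G, "functions/hid.usb0"), exist_ok=True)

write("idVendor", "0x1d6b")   # placeholder IDs
write("idProduct", "0x0104")
write("strings/0x409/product", "Gesture Glove Controller")  # illustrative name
write("functions/hid.usb0/protocol", 0)
write("functions/hid.usb0/subclass", 0)
write("functions/hid.usb0/report_length", 3)
write("functions/hid.usb0/report_desc", REPORT_DESC)

# Bind the HID function into the configuration and attach the gadget to the UDC.
os.symlink(os.path.join(G, "functions/hid.usb0"),
           os.path.join(G, "configs/c.1/hid.usb0"))
write("UDC", os.listdir("/sys/class/udc")[0])
```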

Is your progress on schedule or behind? If you are behind, what actions will be taken to catch up to the project schedule?

The project is currently on schedule. See the Gantt chart below.

As you’ve now established a set of sub-systems necessary to implement your project, what new tools have your team determined will be necessary for you to learn to be able to accomplish these tasks?

We determined that I have to learn how to use the HID device specification and driver provided by the Linux kernel as well as the HIDAPI on Windows to accomplish the tasks.

Xuan’s Status Report for 2/25/2023

What did you personally accomplish this week on the project? Give files or photos that demonstrate your progress. Prove to the reader that you put sufficient effort into the project over the course of the week (12+ hours).

We prepared the slides for the design presentation. We also started on the design report, although we haven’t made significant progress yet because the feedback only came yesterday.

Is your progress on schedule or behind? If you are behind, what actions will be taken to catch up to the project schedule?

We are slightly behind because our Gantt chart didn’t take into account the significance of the design review process and spring break, so we updated the Gantt chart in response. Also, one of our parts was missing because the wrong part was shipped, so we put in a request for a replacement. The original chart also had extra time that could be shortened, so we edited that as well.

What deliverables do you hope to complete in the next week?

I hope to finish the design report and start assembling the hardware glove module when the missing part arrives.

Team Status Report for 02/25/23

What are the most significant risks that could jeopardize the success of the project? How are these risks being managed? What contingency plans are ready?

The most significant risk that we have identified right now that could jeopardize the success of the project is that the machine learning turns out to be highly inaccurate or insufficiently performant. To manage this risk, we identified ways to perform the gesture detection using basic time series analysis and a decision tree, which will be much faster because it doesn’t depend on a deep neural network and can even run directly on the edge. This gesture detection would simply examine the trends (increasing/decreasing) of the different sensor data and match them with the corresponding gestures we identified.

Were any changes made to the existing design of the system (requirements, block diagram, system spec, etc)? Why was this change necessary, what costs does the change incur, and how will these costs be mitigated going forward?

None since last week.

Provide an updated schedule if changes have occurred

The schedule was updated to account for the design report and to shorten the amount of time allocated for smaller tasks such as soldering the circuit.

Team work adjustments

We have not made any reassignments to our list of tasks yet, since the current split seems well-balanced. However, we adjusted our schedule to run more of our tasks in parallel when they weren’t dependent on each other, and to give us more time this week to focus on the design report, which we did not originally account for in our schedule.