Noah’s Status Report for 12/7

Given that integration into our entire system went well, I spent the last week tidying things up and getting some user reviews. The following details some of the tasks I did this past week and over the break:

    • Added averaging of the detected emotions over the past 3 seconds
      • Decreases frequent buzzing in the bracelet
      • Emotions are much more regulated and only change when the model is sufficiently confident.
    • Coordinated time with some friends to use our system and got user feedback
      • Feedback tended to be pretty positive, recognizing that this is only a first iteration
    • Started working on the poster and taking pictures of our final system.
      • Waiting on the bracelet’s completion, but it’s not a blocker.
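A minimal sketch of the 3-second averaging described above, assuming roughly 10 predictions per second and a majority-vote rule; the window size, confidence threshold, and label names are illustrative, not our exact implementation:

```python
from collections import Counter, deque

# Hypothetical parameters: ~10 predictions/second over a 3 s window.
WINDOW = 30
CONF_THRESHOLD = 0.6

class EmotionSmoother:
    """Majority-vote smoothing over a sliding window of predictions."""

    def __init__(self, window=WINDOW, threshold=CONF_THRESHOLD):
        self.history = deque(maxlen=window)
        self.threshold = threshold
        self.current = "neutral"

    def update(self, label, confidence):
        # Ignore low-confidence predictions so the output only
        # changes when the model is sufficiently confident.
        if confidence >= self.threshold:
            self.history.append(label)
        if self.history:
            top_label, count = Counter(self.history).most_common(1)[0]
            # Require a clear majority before switching emotions.
            if count > len(self.history) // 2:
                self.current = top_label
        return self.current
```

A single noisy frame can no longer flip the output, which is what cuts down on spurious buzzing in the bracelet.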

List all unit tests and overall system test carried out for experimentation of the system. List any findings and design changes made from your analysis of test results and other data obtained from the experimentation.


I am done with my part of the project, and will mainly be working on the reports and such going forward.

Goals for next week:

  • Finish the poster, final report, and presentation!

Kapil’s Status Report for 12/7

1. What did you personally accomplish this week on the project?

  • I coded the entire Bluetooth Low Energy (BLE) functionality on the Adafruit Feather, ensuring it can communicate effectively as part of the system.
  • Mason and I began integrating BLE with the Jetson. We discovered that the Jetson does not natively support Bluetooth, but I resolved this issue by using a USB Bluetooth adapter. After configuring it, I successfully got BLE to work between the Jetson and the Adafruit Feather.
  • I obtained straps for the bracelet and 3D printed a prototype of the enclosure design. This prototype will house all the components and provide the structure needed for the final version.

2. Is your progress on schedule or behind?

  • My schedule is on track. BLE integration and the 3D printed prototype are significant milestones, and I am progressing as planned toward final assembly and testing.

3. What deliverables do you hope to complete in the next week?

  • Integrate all bracelet components inside the 3D printed enclosure. This will involve resoldering the connections to fit the prototype and ensuring everything is securely housed.
  • Glue the straps onto the 3D printed enclosure to finalize the wearable structure.

Kapil’s Status Report for 11/30

1. What did you personally accomplish this week on the project?

  • For the demo, I integrated the bracelet with the NVIDIA Jetson, ensuring that the entire setup functions as intended.
  • Inquired at TechSpark to gather more information about the 3D printing process and requirements
    • The base of the bracelet can be printed at TechSpark; however, the flexible straps require external printing.
  • Downloaded the new Bluetooth Low Energy libraries onto the Adafruit Feather
    • CircuitPython is not compatible with these libraries and requires an alternative method for pushing code. I am exploring the best approach to resolve this issue.
  • Tested and validated UART communication latency.
    • Measured latency was approximately 30ms, which is significantly below the 150ms requirement
  • Worked on the final presentation for the project
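As a sketch of how the round-trip timing above could also be captured in software (the actual measurement used an oscilloscope per the design report; the `send`/`recv` callables here are stand-ins for a serial library's write/read):

```python
import time

def measure_roundtrip_ms(send, recv, payload=b"\x01", trials=20):
    """Average round-trip latency, in ms, through an echo link.

    `send` and `recv` are injected callables wrapping the transport
    (in the real setup, UART write/read), so the timing logic can be
    exercised without hardware attached.
    """
    latencies = []
    for _ in range(trials):
        start = time.perf_counter()
        send(payload)          # transmit one marker byte
        recv(len(payload))     # block until it is echoed back
        latencies.append((time.perf_counter() - start) * 1000.0)
    return sum(latencies) / len(latencies)
```

Against the 150ms requirement, a ~30ms mean leaves a wide margin.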

2. Is your progress on schedule or behind?

  • My progress is on schedule. The successful demo and latency validation ensure that the system is performing as required. I’ve also laid the groundwork for the remaining tasks, including Bluetooth integration and 3D printing.

3. What deliverables do you hope to complete in the next week?

  • Implement Bluetooth Low Energy (BLE) integration to replace UART communication, ensuring seamless wireless connectivity.
  • Begin printing the 3D enclosure, ensuring it securely houses all components while maintaining usability

Team’s Status Report 11/16


1. What are the most significant risks that could jeopardize the success of the project?

  • Jetson Latency: Once the model hit our accuracy threshold of 70%, we had planned to continue improving it as time allowed. However, upon system integration, we recognized that the Jetson cannot handle a larger model due to latency concerns. This is acceptable, but it does limit the potential of our project while running on this Jetson model.
  • Campus network connectivity: We identified that the Adafruit Feather connects reliably on most WiFi networks but encounters issues on CMU’s network due to its security requirements. We still need to tackle this issue, as we are currently relying on a wired connection between the Jetson and bracelet.

2. Were any changes made to the existing design of the system?

  • We have decided to remove surprise and disgust from the emotion detection model, as replicating a disgusted or surprised face has proven to be too much of a challenge. We had considered this at the beginning of the project, given that these emotions were the two least important; however, it was only with user testing that we recognized that they are too inaccurate.

3. Provide an updated schedule if changes have occurred:

We remain on schedule and overcame the challenges we encountered, leaving us ready for a successful demo.

Documentation:

  • We don’t have pictures of our full system running; however, this can be seen in our demos this week. We will include pictures next week when we have cleaned up the system and integration.

 

Model output coming from the Jetson to both the Adafruit and the web server:

Model output running on Jetson (shown via SSH):

Noah’s Status Report for 11/16

This week, I focused on getting the computer vision model running on the Jetson and integrating the webcam instead of my local computer’s native components. It went pretty well, and here is where I am at now.

  • Used SSH to load my model and the facial recognition model onto the Jetson
  • Configured the Jetson with a virtual environment that would allow my model to run. This was the complicated part of integration.
    • The Jetson’s native software is slightly old, so finding compatible packages is quite the challenge.
    • Sourced compatible PyTorch, torchvision, and OpenCV packages
  • Transitioned the model to run on the Jetson’s GPU
    • This required a lot of configuration on the Jetson, including downloading the correct CUDA drivers and ensuring compatibility with our chosen version of PyTorch
  • Worked on the output of the model so that it would send requests to both the web server and bracelet with proper formatting.
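A sketch of what the dual-output formatting in the last bullet might look like; the JSON fields and the one-byte UART encoding are assumptions for illustration, not the actual protocol:

```python
import json

# Hypothetical mapping from emotion label to the single byte
# written over the serial link to the bracelet.
EMOTION_CODES = {"neutral": 0, "happy": 1, "sad": 2, "angry": 3, "fearful": 4}

def format_outputs(label, confidence):
    """Build both payloads from one model prediction."""
    # JSON body for the web server's API.
    web_payload = json.dumps(
        {"emotion": label, "confidence": round(confidence, 3)}
    )
    # Compact one-byte message for the bracelet.
    uart_byte = bytes([EMOTION_CODES[label]])
    return web_payload, uart_byte
```

The real system would POST the JSON payload to the website and write the byte to the serial link, so both consumers see the same prediction.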

I am ahead of schedule, and my part of the project is done for the most part. I’ve hit MVP and will be making slight improvements where I see fit.

Goals for next week:

  • Calibrate the model such that it assumes a neutral state when it is not confident in the detected emotion.
  • Add averaging of the detected emotions over the past 3 seconds which should increase the confidence in our predictions.
  • Add an additional output to indicate if a face is even present.
  • Look into exporting the CV model to ONNX – an interchange format that lets the model run through optimized runtimes on the Jetson – so that there will be lower latency.

Kapil’s Status Report for 11/16

1. What did you personally accomplish this week on the project?

  • Received the NeoPixel and soldered header pins on.
  • Connected the NeoPixel to the Adafruit Feather.
    • Initially, it did not work, but after debugging a physical connection issue, I successfully resolved the problem and got it operational.
  • Attempted to connect Adafruit Feather to the WiFi
    • Unfortunately, due to persistent connectivity issues, I was unable to establish a connection. To work around this, I identified a program that allows me to push code to the Feather via a wired cable, enabling continued progress without relying on WiFi connectivity.
  • Attended two group meetings focusing on integrating the Jetson.
    • First meeting: encountered issues with securing an SSH connection
    • Second meeting: Successfully established the entire pipeline. The Jetson now communicates with both the iPad display and the embedded bracelet via UART, achieving the intended functionality for the demo.

2. Is your progress on schedule or behind?

  • My progress is on schedule. Despite the WiFi connectivity setbacks, I was able to get the NeoPixel working and successfully integrated the Jetson pipeline. The system is now demo-ready with UART communication functioning as intended.

3. What deliverables do you hope to complete in the next week?

  • Post-demo, my focus will shift to implementing Bluetooth Low Energy (BLE) to replace UART for wireless communication.
  • Conduct timing analysis to ensure the system meets its real-time performance requirements
  • Begin finalizing the 3D-printed enclosure and organizing the circuit for a more polished appearance.

4. Tests conducted and planned for verification and validation:

  • Verification Tests (Subsystem-level):
    • The embedded bracelet must achieve a latency under 150ms. As outlined in the Design Report, I will be using an oscilloscope for this. I will measure two latencies: one for UART (wired communication) and one for BLE (wireless).
      • Testing BLE latency and comparing it to the UART baseline will ensure that BLE does not compromise system responsiveness
    • For the user testing portion, I have two goals
      • Feedback Recognition Accuracy: At least 90% of participants should correctly identify the type of feedback (haptic or visual) associated with specific emotions within 3 seconds of actuation.
      • Error Rate: The system must maintain an error rate of less than 10% for incorrectly signaling emotions, ensuring reliability.
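The two user-testing goals reduce to simple rates over recorded trials; a sketch of how they could be computed, with a hypothetical trial-record format:

```python
def evaluate_trials(trials):
    """Compute feedback-recognition accuracy and signaling error rate.

    Each trial is assumed to be a dict like:
      {"identified_correctly": bool,  # participant named the feedback type
       "response_s": float,           # time to identify, in seconds
       "signal_correct": bool}        # bracelet signaled the right emotion
    """
    n = len(trials)
    # Goal 1: correct identification within 3 seconds of actuation.
    recognized = sum(
        1 for t in trials
        if t["identified_correctly"] and t["response_s"] <= 3.0
    )
    # Goal 2: rate of incorrectly signaled emotions.
    wrong_signals = sum(1 for t in trials if not t["signal_correct"])
    return {
        "recognition_rate": recognized / n,
        "error_rate": wrong_signals / n,
        "passes": recognized / n >= 0.90 and wrong_signals / n < 0.10,
    }
```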

Noah’s Status Report for 11/9

This week was mostly focused on getting prepared for integration of my CV component into the larger system as a whole. Here are some of the tasks I completed this week:

  • Spent some more time doing a randomized grid search to determine the best hyperparameters for our model.
    • Improved the model slightly, reaching our 70% accuracy goal; however, that accuracy doesn’t translate well to real-time facial recognition.
    • Might need to work on some calibration or use a different dataset
  • Conducted research on the capabilities of the Jetson we chose and how to load our model onto it so that it properly utilizes the GPU
    • I have a pretty good idea of how this is done and will work on it this week once we are done configuring the SSH on the Jetson.
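The randomized grid search mentioned above can be sketched as below; the search space and scoring hook are placeholders, not our actual hyperparameters:

```python
import random

# Hypothetical search space for the emotion-detection CNN.
SEARCH_SPACE = {
    "learning_rate": [1e-4, 3e-4, 1e-3],
    "batch_size": [32, 64, 128],
    "dropout": [0.3, 0.4, 0.5],
}

def random_search(score_fn, space, trials=20, seed=0):
    """Sample random configurations and keep the best-scoring one."""
    rng = random.Random(seed)
    best_cfg, best_score = None, float("-inf")
    for _ in range(trials):
        # Draw one value per hyperparameter from its candidate list.
        cfg = {name: rng.choice(values) for name, values in space.items()}
        score = score_fn(cfg)  # e.g. validation accuracy after training
        if score > best_score:
            best_cfg, best_score = cfg, score
    return best_cfg, best_score
```

In practice `score_fn` would train the model with the sampled configuration and return validation accuracy; sampling rather than exhaustively enumerating the grid keeps the total training time bounded.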

I am on schedule and ready to continue working on system integration this upcoming week!

Goals for next week:

  • Start testing my model using a different dataset which more closely mimics the resolution we can get from our webcam.
    • If this isn’t promising, we might consider adding some calibration for each individual person’s face.
  • Download the model to the Jetson and use the webcam / Jetson to run the model instead of my computer
    • This will fulfill my portion of the system integration
    • I would like to transmit the output from my model to Mason’s website in order to ensure that we are getting reasonable metrics.

 

Kapil’s Status Report for 11/9

1. What did you personally accomplish this week on the project?

  • This week, I confirmed that the NeoPixel part I ordered last week is on its way. Once it arrives, I plan to finalize and complete the circuit assembly.
  • In the meantime, I shifted my focus to integrating the components with the Jetson, which is an essential step for the full system setup. To facilitate this, I participated in a group meeting where we worked on setting up the Jetson and coordinating its communication with other system components.
  • During integration, I identified an issue with the Adafruit Feather’s connectivity. While it works smoothly on my local WiFi, it faces challenges when trying to connect to CMU’s network. This is a critical insight that will need addressing to ensure reliable operation in our intended environment.

2. Is your progress on schedule or behind?

  • My progress is slightly behind schedule. I wanted to have finished my circuit and started streamlining the design, but I am still waiting for the NeoPixel to arrive.

3. What deliverables do you hope to complete in the next week?

  • Next week, I hope to have the NeoPixel part in hand and complete the circuit assembly.
  • I aim to continue working on and ideally finalize the integration between the Adafruit Feather and the Jetson, addressing the WiFi connectivity issue to ensure seamless communication.
  • If the circuit is completed, I will run initial tests to confirm that the system operates as expected and adjust any configurations as necessary.

Team’s Status Report 11/2

1. What are the most significant risks that could jeopardize the success of the project?

  • Power and Component Integration: For the embedded bracelet, we identified a power issue with the NeoPixel which caused unexpected behavior. Although we ordered a part to address this, any further power discrepancies could delay testing.
  • Integration with Jetson: We have not received our NVIDIA Jetson yet and still need to complete integration between the three subsystems.

2. Were any changes made to the existing design of the system?

  • Power Adjustment for NeoPixel: To resolve the NeoPixel’s voltage requirements, we ordered a new component, modifying the power design for stable operation.
  • Real-time API and UI Enhancements: We designed a REST API for real-time data transmission and updates, which adds a dynamic element to the website interface. Upcoming UI enhancements, including a timer and confidence bar, will improve the usability and clarity of the feedback display.
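A minimal sketch of the kind of REST endpoint described above, using only the standard library; the route, payload shape, and polling approach are assumptions, not the website's actual implementation:

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

# Latest prediction shared with the UI. A production server would
# likely push updates instead of relying purely on polling.
latest = {"emotion": "neutral", "confidence": 0.0}

class EmotionHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # The Jetson POSTs each new prediction as a small JSON body.
        length = int(self.headers["Content-Length"])
        latest.update(json.loads(self.rfile.read(length)))
        self.send_response(204)
        self.end_headers()

    def do_GET(self):
        # The web page polls this to display the current emotion.
        body = json.dumps(latest).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep the console quiet during tests
```

Running `HTTPServer(("0.0.0.0", 8000), EmotionHandler).serve_forever()` would let the Jetson POST predictions while the UI polls GET for the latest one.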

3. Provide an updated schedule if changes have occurred:

  • The team remains on schedule with component integration. Kapil will continue with the 3D printed enclosure and communication setup while awaiting the NeoPixel component. Noah will transition to Jetson-based model testing and optimization. Mason’s API testing and UI improvements will align with the next integration phase, ensuring that system components function cohesively.

Photos and Documentation:

  • Initial bracelet circuit, highlighting successful haptic motor tests and NeoPixel power issues.

  • Images of the real-time labeling function, which demonstrates the model’s current performance and areas needing improvement.

Noah’s Status Report for 11/2

I wanted to fully integrate all the computer vision components this week, which was mostly a success! Here are some of the tasks I completed:

  • Continued to create and test my new facial recognition model
    • This has been kind of a side project to increase the accuracy of the emotion detector in the long run.
    • It shows significant benefits, but I’ve decided to hold off for now until we can start with system integration
  • Trained a new model with over 200 epochs and 5 convolution layers
    • This model reaches an accuracy of 68.8%, which is just shy of the 70% threshold we set out in the design report.
    • I believe we can make it to 70%, but I’ll likely need to dedicate an instance to training for over 6 hours.
      • Also need to reduce overfitting again
  • Integrated the model to produce labels in real-time
    • I’ve included some pictures below to show that it is working!
    • It has a noticeable bias toward saying that you are sad, which is a bit of an issue
      • I’m still working to discover why this is.

Being mostly done with the model, I am on schedule and ready to work on system integration this upcoming week!

Goals for next week:

  • Continue to update the hyperparameters of the model to obtain the best accuracy possible.
  • Download the model to the Jetson and use the webcam / Jetson to run the model instead of my computer
    • This will fulfill my portion of the system integration
    • I would like to transmit the output from my model to Mason’s website in order to ensure that we are getting reasonable metrics.