Mason’s Status Report for 12/7

I spent the majority of this week working on the Kalman filter for our model output, conducting tests, and working on our upcoming deadlines. Here is what I did this past week:

  • Made and integrated the Kalman filter
    • Worked independently on writing Kalman filter code for the model output
    • The goal of the filter is to reduce the noise associated with the model output
    • The filter achieves the desired effect: it reduces noise and cuts unnecessary state transitions
    • Still needs to be tuned for the video and final demo
  • Set up and tested the system in the demo room
    • Made sure that the Jetson will be able to connect over Ethernet in our demo room
  • Began work on the poster and video
    • Added poster components related to the web app subsystem
  • Conducted user tests with 3 people
    • Had them use the system and give feedback on the UI and the system’s usability/practicality
  • Worked with Kapil on Jetson BLE integration
    • Researched a few options and ultimately found a suitable USB dongle for Bluetooth (a minimal connection sketch follows below)
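To sketch what that integration could look like once the dongle is in place, here is a minimal example of pushing the model output to the Adafruit over BLE using the bleak Python library. The device address and characteristic UUID are placeholders (the UUID shown is the standard Nordic UART Service RX characteristic, which may or may not match what our Adafruit firmware exposes).

```python
import asyncio

from bleak import BleakClient

# Placeholders: substitute the Adafruit's actual BLE address and the
# writable characteristic its firmware exposes.
ADAFRUIT_ADDRESS = "AA:BB:CC:DD:EE:FF"
WRITE_CHAR_UUID = "6e400002-b5a3-f393-e0a9-e50e24dcca9e"  # Nordic UART RX

async def send_emotion(label: str) -> None:
    # Connect, write the detected emotion as UTF-8, then disconnect.
    async with BleakClient(ADAFRUIT_ADDRESS) as client:
        await client.write_gatt_char(WRITE_CHAR_UUID, label.encode("utf-8"))

asyncio.run(send_emotion("happiness"))
```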

Goals for next week:

  • Tune Kalman filter
  • Further prepare system for video and demo
  • Finish final deadlines and objectives

Mason’s Status Report for 11/30

This week, I focused on preparation for the final presentation, testing, and output control ideation. Here’s what I accomplished:

This week’s progress:

  • Testing
    • Ran tests on API latency using Python test automation.
    • Found that the API speed is well within our latency requirement (in good network conditions).
    • Ran latency tests on the integrated system, capturing the time between image capture and system output.
    • Found that our overall system is also well within our latency requirement of 350 ms, in fact operating consistently under 200 ms.
    • Wrote the feedback form for next week’s user testing.
  • Final presentation
    • Worked on our final presentation, particularly the testing, tradeoff, and system solution sections.
    • Wrote notes and rehearsed the presentation in lead up to class.
  • Control / confidence score ideation
    • In an effort to enhance the efficacy of our system’s confidence score, I decided to integrate a control system.
    • I plan to use a Kalman filter to regulate the display of the system output in order to account for the noise present in it.
    • Using the model’s probabilistic output weights, I will analyze the output and produce a noise-adjusted likelihood estimate.
    • I will implement this with NumPy on the Jetson side of the system and update the API in conjunction with this (a sketch of the planned filter follows below).
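As a concrete starting point, here is a minimal sketch of the planned filter: one scalar Kalman filter per emotion class under a constant-state assumption, implemented with NumPy. The noise variances q and r are illustrative placeholders that the tuning work would replace.

```python
import numpy as np

class ScalarKalman:
    """Minimal 1-D Kalman filter for smoothing one noisy probability."""

    def __init__(self, q=1e-3, r=5e-2, x0=0.0, p0=1.0):
        self.q = q   # process noise variance: how fast the true value drifts
        self.r = r   # measurement noise variance: how noisy the model output is
        self.x = x0  # current state estimate
        self.p = p0  # variance of the current estimate

    def update(self, z):
        # Predict: the state is modeled as constant, so only variance grows.
        self.p += self.q
        # Update: blend the prediction with the new measurement z.
        k = self.p / (self.p + self.r)  # Kalman gain
        self.x += k * (z - self.x)
        self.p *= 1.0 - k
        return self.x

# One filter per emotion class; smooth each frame's raw model weights.
LABELS = ["happiness", "sadness", "surprise", "anger", "fear", "neutral"]
FILTERS = {label: ScalarKalman() for label in LABELS}

def smooth(raw_probs):
    """raw_probs: length-6 array of model output weights for one frame."""
    smoothed = np.array([FILTERS[l].update(p) for l, p in zip(LABELS, raw_probs)])
    return smoothed / smoothed.sum()  # renormalize to a valid distribution
```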

Goals for Next Week:

  • Implement kalman filter on Jetson.
  • Assist Kapil with BLE Jetson integration for the Adafruit bracelet.
  • Continue user testing and integrate changes into UI in conjunction with feedback. 
  • Work on poster and finalize system presentation for final demo.

I’d say we’re on track for final demo and final assignments.

As you’ve designed, implemented and debugged your project, what new tools or new knowledge did you find it necessary to learn to be able to accomplish these tasks? What learning strategies did you use to acquire this new knowledge?

For the web app development, I had to learn how to better optimize UIs for low latency, reading the jQuery docs to figure out how best to integrate AJAX to dynamically update the system output. I had to make a lightweight interface and build an API fast enough to give real-time feedback, for which I read the Requests (“HTTP for Humans”) docs. I also read some forum posts on fast APIs in Python, and watched a couple of development videos of people building fast APIs for their own Python applications. For the system integration and Jetson configuration, I watched multiple videos on flashing the OS, and read forum posts and docs from NVIDIA. I also consulted LLMs on how to handle bugs with our system integrations and communication protocols. The main strategies and resources I used were forums (Stack Overflow, NVIDIA), docs (Requests, jQuery, NVIDIA), and YouTube videos (independent content creators).

Team Status Report 11/30

1. What are the most significant risks that could jeopardize the success of the project?

  • BLE Integration: Transitioning from UART to BLE has been challenging. While UART meets latency requirements and works well for now, our BLE implementation has revealed issues with maintaining a stable connection. Finishing this in time for the demo is critical to reaching our fully fledged system.

  • Limited User Testing: Due to time constraints, the scope of user testing has been narrower than anticipated. This limits the amount of user feedback we can incorporate before the final demo. However, we are fairly satisfied with the system given the feedback we have received so far. The bracelet will be a major component of this, though.

2. Were any changes made to the existing design of the system?

  • We planned to add a Kalman filter to smooth the emotion detection output and improve the reliability of the confidence score. This will reduce the impact of noise and fluctuations in the model’s predictions. If this does not work, we will instead use a rolling average.
  • Updated the web interface to indicate when no face is detected, improving user experience and the accuracy of the output.

3. Updated schedule if changes have occurred:

We remain on schedule outside of bracelet delays and have addressed challenges effectively to ensure a successful demo. Final user testing and BLE integration are prioritized for the upcoming week.

Testing results:

Model output from Jetson to both Adafruit and Webserver:

Mason’s Status Report for 11/16

This week, I focused on integration tasks to prepare the system for the demo. Here’s what I accomplished:

This week’s progress:

  • Jetson and Adafruit UART Integration
    • Worked with Kapil to implement UART communication between the Jetson and the Adafruit.
    • Helped write, run, and debug the code required to communicate via the Jetson’s UART pins.
    • Created and executed test scripts for this communication, eventually achieving functionality with the output of the model (a minimal sender sketch follows after this list).
  • Model Deployment on Jetson
    • Resolved compatibility challenges related to leveraging the Jetson’s GPU for efficient model execution. 
    • Successfully installed the necessary packages and verified the model running in real-time using a webcam.
  • System Integration
    • Made changes to the model to integrate it with the API and UART communication, ensuring smooth output transmission.
    • Finalized the Jetson setup for the demo: the model now runs locally on the Jetson and transmits outputs as specified in the project write-up.
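For reference, here is a minimal sketch of what the Jetson-side sender in those test scripts could look like, assuming the pyserial package; the device path and baud rate are assumptions (the Jetson’s 40-pin header UART commonly appears as /dev/ttyTHS0), not the exact values from our setup.

```python
import serial  # pyserial

# Assumed device path and baud rate; the Jetson's 40-pin header UART
# commonly appears as /dev/ttyTHS0, but verify against the actual wiring.
PORT = "/dev/ttyTHS0"
BAUD = 115200

def send_emotion(label: str) -> None:
    """Write one newline-terminated emotion label to the Adafruit."""
    with serial.Serial(PORT, BAUD, timeout=1) as ser:
        ser.write((label + "\n").encode("utf-8"))

send_emotion("neutral")
```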

Goals for Next Week:

  • Collaborate with Kapil on Bluetooth integration between the Jetson and the bracelet/Adafruit.
  • Work with Noah to improve the model’s efficiency and reduce latency.
  • Conduct tests for API latency to ensure real-time responsiveness.
  • Begin user and user experience testing to evaluate the system’s performance and usability.

I’d say we’re on track and I feel good about the state of our project going into the demo on Monday.

UART (bracelet) working alongside API (iPad):

I forgot to capture photos of the model on Jetson/whole system in action.

Team’s Status Report 11/9

1. What are the most significant risks that could jeopardize the success of the project?

  • Model Accuracy: While the model achieved 70% accuracy, meeting our threshold, real-time performance with facial recognition is inconsistent. The model may require additional training data to improve reliability under variable lighting conditions.
  • Campus Network Connectivity: We identified that the Adafruit Feather connects reliably on most WiFi networks but encounters issues on CMU’s network due to its security requirements. We will need to tackle this in order to get the Jetson and Adafruit communicating.

2. Were any changes made to the existing design of the system?

  • AJAX Polling and API inclusion for Real-time Updates: We implemented AJAX polling on the website, allowing for continuous updates from the Jetson API. This feature significantly enhances user experience by dynamically displaying real-time data on the website interface.
  • Jetson Ethernet: We have decided to use Ethernet to connect the Jetson to the CMU network.

3. Provide an updated schedule if changes have occurred:

The team is close to being on schedule, though some areas require focused attention this week to stay on track for the upcoming demo. We need to get everything up and running in order to transition fully to testing and enhancement following the demo.

Photos and Documentation:

  • Jetson-to-website integration showcasing successful data transmission and dynamic updates. emotisense@ubuntu is the SSH session into the Jetson. The top-right terminal is the Jetson; the left and bottom-right are the website UI and request history, respectively. I also have a video of running a test file on the Jetson, but embedded video doesn’t work for this post.

Mason’s Status Report for 11/9

This week, I aimed to complete various components of our project, focusing on API integration and Jetson setup. Here’s a summary of the tasks I successfully completed:

  • Implemented and tested the website API
    • Finalized the API for real-time data transmission from the Jetson to the website, allowing dynamic updates.
  • Downloaded the NVIDIA SDK and flashed the OS on the Jetson
    • This setup is now complete, enabling us to move forward with device integration and testing.
  • Collaborated with Kapil on Bracelet and Jetson integration
    • We worked together to connect the Bracelet with the Jetson, and we were able to get the Adafruit on the network so that the Jetson can communicate with it.
  • Tested Jetson-to-Website integration
    • Ran scripts on the Jetson that called the API with various test values to validate the website’s response and observed successful updates reflecting test inputs.
  • Enhanced website with AJAX polling
    • Added AJAX polling to enable continuous updates from the API, so the website now dynamically reflects real-time changes in the data (a sketch of the polled endpoints follows below).
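To illustrate the shape of this, here is a minimal sketch of the two Django views such an API could consist of: one the Jetson POSTs model output to, and one the AJAX poll reads. The field names and the in-memory store are hypothetical simplifications, not our exact implementation.

```python
# views.py -- hypothetical simplified endpoints, not the exact implementation
import json

from django.http import JsonResponse
from django.views.decorators.csrf import csrf_exempt

# Simplest possible store: the most recent model output, kept in memory.
latest_output = {"emotion": "neutral", "confidence": 0.0}

@csrf_exempt
def update_emotion(request):
    """POST endpoint the Jetson calls with each new model output."""
    data = json.loads(request.body)
    latest_output["emotion"] = data["emotion"]
    latest_output["confidence"] = float(data["confidence"])
    return JsonResponse({"status": "ok"})

def current_emotion(request):
    """GET endpoint the AJAX poll hits to refresh the display."""
    return JsonResponse(latest_output)
```

Note that in a deployed setup with Gunicorn, the store would need to be shared across workers (e.g., a database row or cache), since module-level state is per-process.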

Overall, we are close to being on schedule, though there is still considerable work to do this week to stay on track for the upcoming demo.

Goals for Next Week:

  • Assist Noah with running the model on the Jetson
    • Work together to optimize the model setup and ensure it operates effectively on the Jetson platform.
  • Finalize the Jetson-Bracelet integration
    • Complete the necessary configurations and testing to enable seamless communication between the Jetson and the Bracelet.
  • Design and initiate latency tests for the system
    • Develop latency tests to measure and optimize response times across the integrated system, ensuring it meets performance requirements for the demo.

Mason’s Status Report for 11/2

Accomplishments: This week, I completed deployment using Gunicorn and Nginx, which is now set up to handle concurrent requests effectively and provides a stable environment for our application. The website is ready to be viewed on the iPad and used in the system. Additionally, I designed the REST API that will connect to the model, facilitating data transmission from the Nvidia Jetson to the website. The Jetson side calls this API using the Python requests package, and the endpoint is integrated into the Django web app. Through AJAX, the interface can make real-time updates to the display. Note that while the API is designed and integrated into the web app, it hasn’t been tested yet.

Next Steps: I am on schedule. My goal next week is to integrate and run both the API and the facial recognition model directly on the Jetson. This milestone will be essential to finalize the whole system’s integration and confirm that all components work cohesively to deliver consistent, real-time data flow and processing. We will also begin testing the latency of the API to ensure a fluid user experience and consistent timing between the iPad and bracelet (a sketch of the planned test follows below). From there, I will move on to adding a watch/timer display to the interface to show time elapsed, and a bar to show confidence. These UI enhancements will aid the usability of the display.
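As a starting point for that latency testing, here is a minimal sketch of the kind of test automation I have in mind, using the requests package against a placeholder URL and payload (both hypothetical):

```python
import time

import requests

# Placeholder URL and payload; substitute the deployed server's real endpoint.
URL = "http://example.com/api/emotion/"
PAYLOAD = {"emotion": "happiness", "confidence": 0.92}

def measure_latency(n: int = 100) -> None:
    """POST n test requests and report round-trip latency in milliseconds."""
    samples = []
    for _ in range(n):
        start = time.perf_counter()
        requests.post(URL, json=PAYLOAD, timeout=5)
        samples.append((time.perf_counter() - start) * 1000.0)
    samples.sort()
    print(f"median: {samples[n // 2]:.1f} ms, worst: {samples[-1]:.1f} ms")

measure_latency()
```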

Mason’s Status Report for 10/26

1. What did you personally accomplish this week on the project?

This week, I made substantial progress on our application by setting up an EC2 instance and deploying the application on it. This involved handling critical tasks like configuring the server, running application migrations, and integrating various components to ensure the system functions cohesively.

I also refined the application mocks, which are now available for review in the team report, and consolidated the app to be more lightweight and efficient to run. The application is almost fully operational, and I am close to making it accessible via a URL, enabling team members and future users to access it directly.

2. Is your progress on schedule or behind?

My progress is on schedule. Deploying the application to EC2 and reaching this stage without major issues is a significant milestone, and I am pleased with the current pace.

3. What deliverables do you hope to complete in the next week?

Next week, my primary focus will be (a) getting the application fully accessible via a web domain, and (b) setting up the API call from the NVIDIA Jetson to the server, which will control the server’s output based on input from the Jetson device. Additionally, I plan to establish the Jetson’s TCP connection and enhance the web interface by incorporating features like bar charts to display confidence levels and a stopwatch to track time elapsed per detected emotion.

Mason’s Status Report for 10/20

This week, I focused on significant front-end enhancements for the EmotiSense web app. These updates improve user interaction and provide better feedback on emotion recognition results. Specific updates are:

  • Added individual views for each of the six emotions (happiness, sadness, surprise, anger, fear, and neutral), allowing users to see a dynamic visual representation for each detected emotion.
  • Implemented a display for the confidence score alongside the emotion, which helps users gauge the accuracy of the detection.
  • Created logic to track and display the time elapsed from the last emotion transition, offering insight into how quickly emotions change during a session.

Additionally, I spent time researching how to make API calls directly from the Nvidia Jetson Xavier AGX. This research focused on how the Jetson communicates with cloud services and how it handles real-time emotion data processing efficiently. Key learnings included optimizing the use of TCP/IP protocols and managing data transmission with minimal latency.

Is your progress on schedule or behind?

My progress is on schedule. The front-end enhancements have been successfully implemented, and I have made significant progress in understanding how to integrate the Nvidia Jetson for real-time data transmission.

What deliverables do you hope to complete in the next week?

  • Finalize API integration design to enable real-time data processing between the Nvidia Jetson Xavier AGX and the cloud-based web app.
  • Conduct latency testing to ensure the system can handle real-time emotion data efficiently.
  • Further enhance the user interface with responsive design elements and user feedback on system performance.

Mason’s Status Report for 10/5

Before the start of break I hope to have the website deployed so I can focus on the UI design and Jetson configuration afterward. This week I spent time working on the web app’s front page and user system. I honestly didn’t accomplish a ton this week, as I had a very busy schedule with other classes and assignments. I’m on track for finishing the deployment this week and being able to switch over my work to focus on user experience after break.

Specific tasks completed:

  • Worked on the user model for the database.
  • Added basic UI elements for the output display.
  • Worked on REST API format integration with AJAX.

Tasks for this week/after break:

  • TCP Jetson Communication
  • EC2 Deployment
  • Frontend Enhancement
  • Latency testing and user experience testing.