Mason’s Status Report for 12/7

I spent the majority of this week working on the Kalman filter for our model output, conducting tests, and preparing for our upcoming deadlines. Here is what I did this past week:

  • Made and integrated the Kalman filter
    • Wrote the Kalman filter code for the model output independently
    • The goal of the filter is to reduce noise in the model output
    • The filter achieves the desired effect, cutting down on noise and unnecessary state transitions (a sketch of the approach follows this list)
    • It still needs to be tuned for the video and final demo
  • Set up and tested the system in the demo room
    • Confirmed that the Jetson can be set up on Ethernet in our demo room
  • Began work on the poster and video
    • Added poster components related to the web app subsystem
  • Conducted user tests with 3 people
    • Had them use the system and give feedback on the UI and on system usability/practicality
  • Worked with Kapil on Jetson BLE integration
    • Researched a few options and ultimately found a suitable USB dongle for Bluetooth
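
Roughly, the filtering approach looks like the sketch below: each class probability from the model is treated as an independent scalar Kalman state, and the displayed emotion only switches when the filtered winner clears the current one by a margin. The class count, noise constants, and switching margin here are illustrative placeholders, not the tuned values.

```python
import numpy as np

class EmotionKalman:
    """Per-class scalar Kalman filter over the model's probability vector.

    Each emotion probability is treated as an independent 1-D state with a
    constant-value process model, so prediction and update reduce to the
    standard scalar Kalman equations.
    """

    def __init__(self, n_classes=6, process_var=1e-3, meas_var=5e-2, switch_margin=0.1):
        self.x = np.full(n_classes, 1.0 / n_classes)  # filtered probabilities
        self.p = np.ones(n_classes)                   # per-class estimate variance
        self.q = process_var                          # process noise: how fast emotions drift
        self.r = meas_var                             # measurement noise: frame-to-frame model jitter
        self.margin = switch_margin                   # hysteresis to suppress spurious transitions
        self.current = None                           # index of the currently displayed emotion

    def update(self, probs):
        """Feed one frame of raw model probabilities; return the smoothed class index."""
        probs = np.asarray(probs, dtype=float)

        # Predict: the state is modeled as constant, so only the variance grows.
        self.p = self.p + self.q

        # Update: scalar Kalman gain per class.
        k = self.p / (self.p + self.r)
        self.x = self.x + k * (probs - self.x)
        self.p = (1.0 - k) * self.p

        # Only switch the displayed emotion when the best filtered class
        # beats the currently displayed one by a clear margin.
        best = int(np.argmax(self.x))
        if self.current is None or self.x[best] > self.x[self.current] + self.margin:
            self.current = best
        return self.current
```

On the Jetson, this sits between the model’s softmax output and the UART/API calls; the process and measurement variances are the knobs that still need tuning before the demo.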

Goals for next week:

  • Tune Kalman filter
  • Further prepare system for video and demo
  • Finish final deadlines and objectives

Mason’s Status Report for 11/30

This week, I focused on preparation for the final presentation, testing, and output control ideation. Here’s what I accomplished:

This week’s progress:

  • Testing
    • Ran API latency tests using automated Python scripts (a sketch of the timing harness appears after this list).
    • Found that the API speed is well within our latency requirement (in good network conditions).
    • Ran latency tests on the integrated system, capturing the time between image capture and system output.
    • Found that the overall system is also well within our 350 ms latency requirement, in fact operating consistently under 200 ms.
    • Wrote out the user testing feedback form for next week’s user tests.
  • Final presentation
    • Worked on our final presentation, particularly the testing, tradeoff, and system solution sections.
    • Wrote notes and rehearsed the presentation in lead up to class.
  • Control and confidence score ideation
    • In an effort to enhance our system’s confidence score efficacy, I decided to integrate a control system.
    • I plan to use a Kalman filter to regulate the display of the system output in order to account for the noise present in the model output.
    • Using the model’s output probabilities, I will analyze the output and produce a noise-adjusted likelihood estimate.
    • I will implement this with NumPy on the Jetson side of the system and update the API in conjunction with it.
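
For context, the latency automation amounts to something like the sketch below: repeatedly POST a test payload, record round-trip times, and compare against our 350 ms requirement. The endpoint URL and payload fields are placeholders, not our deployed values.

```python
import statistics
import time

import requests

API_URL = "http://example.com/api/emotion/"  # placeholder, not our deployed URL
REQUIREMENT_MS = 350                         # our latency requirement

def measure_api_latency(n_trials=50):
    """POST a test payload repeatedly and report round-trip latency statistics."""
    samples = []
    for _ in range(n_trials):
        payload = {"emotion": "neutral", "confidence": 0.9}  # illustrative fields
        start = time.perf_counter()
        response = requests.post(API_URL, json=payload, timeout=5)
        elapsed_ms = (time.perf_counter() - start) * 1000.0
        response.raise_for_status()
        samples.append(elapsed_ms)

    print(f"mean:   {statistics.mean(samples):.1f} ms")
    print(f"median: {statistics.median(samples):.1f} ms")
    print(f"max:    {max(samples):.1f} ms")
    print(f"within {REQUIREMENT_MS} ms requirement: {max(samples) < REQUIREMENT_MS}")
    return samples

if __name__ == "__main__":
    measure_api_latency()
```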

Goals for Next Week:

  • Implement the Kalman filter on the Jetson.
  • Assist Kapil with BLE Jetson integration for the Adafruit bracelet.
  • Continue user testing and integrate feedback-driven changes into the UI.
  • Work on poster and finalize system presentation for final demo.

I’d say we’re on track for final demo and final assignments.

As you’ve designed, implemented and debugged your project, what new tools or new knowledge did you find it necessary to learn to be able to accomplish these tasks? What learning strategies did you use to acquire this new knowledge?

For the web app development, I had to learn how to better optimize UIs for low latency, reading the jQuery docs to figure out how best to integrate AJAX to dynamically update the system output. I had to build a lightweight interface and an API fast enough for real-time feedback, for which I read the Requests (“HTTP for Humans”) docs. I also read some forum posts on fast APIs in Python, and watched a couple of development videos of people building fast APIs for their own Python applications. For the system integration and Jetson configuration, I watched multiple videos on flashing the OS and read forum posts and docs from NVIDIA. I also consulted LLMs on how to handle bugs with our system integration and communication protocols. The main strategies and resources I used were forums (Stack Overflow, NVIDIA), docs (Requests, jQuery, NVIDIA), and YouTube videos from independent content creators.

Images:

Mason’s Status Report for 11/16

This week, I focused on integration tasks to prepare the system for the demo. Here’s what I accomplished:

This week’s progress:

  • Jetson and Adafruit UART Integration
    • Worked with Kapil to implement UART communication between the Jetson and the Adafruit
    • Helped write, run, and debug the code required to communicate via the Jetson’s UART pins.
    • Created and executed test scripts for this communication, eventually achieving functionality with the output of the model (a sketch of the sender side appears after this list).
  • Model Deployment on Jetson
    • Resolved compatibility challenges related to leveraging the Jetson’s GPU for efficient model execution. 
    • Successfully installed the necessary packages and verified the model running in real-time using a webcam.
  • System Integration
    • Made changes to the model to integrate it with the API and UART communication, ensuring smooth output transmission.
    • Finalized the Jetson setup for the demo: the model now runs locally on the Jetson and transmits outputs as specified in the project write-up.
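
For reference, the Jetson-side sender reduces to something like the sketch below, assuming pyserial; the device path, baud rate, and message format shown are assumptions for illustration rather than the exact configuration we shipped.

```python
import serial  # pyserial

# Device path and baud rate are assumptions for illustration; the Jetson exposes
# its UART header as a /dev/ttyTHS* device, but the exact one and settings vary.
UART_DEVICE = "/dev/ttyTHS0"
BAUD_RATE = 115200

def send_emotion(port, emotion, confidence):
    """Send one newline-terminated 'emotion,confidence' message to the Adafruit."""
    message = f"{emotion},{confidence:.2f}\n"
    port.write(message.encode("ascii"))
    port.flush()

if __name__ == "__main__":
    with serial.Serial(UART_DEVICE, BAUD_RATE, timeout=1) as port:
        # Fixed test values, similar in spirit to our UART test scripts.
        for emotion in ["happiness", "sadness", "neutral"]:
            send_emotion(port, emotion, 0.87)
```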

Goals for Next Week:

  • Collaborate with Kapil on Bluetooth integration between the Jetson and the bracelet/Adafruit.
  • Work with Noah to improve the model’s efficiency and reduce latency.
  • Conduct tests for API latency to ensure real-time responsiveness.
  • Begin user and user experience testing to evaluate the system’s performance and usability.

I’d say we’re on track and I feel good about the state of our project going into the demo on Monday.

UART (bracelet) working alongside API (iPad):

I forgot to capture photos of the model on Jetson/whole system in action.

Mason’s Status Report for 11/9

This week, I aimed to complete various components of our project, focusing on API integration and Jetson setup. Here’s a summary of the tasks I successfully completed:

  • Implemented and tested the website API
    • Finalized the API for real-time data transmission from the Jetson to the website, allowing dynamic updates.
  • Downloaded the NVIDIA SDK and flashed the OS on the Jetson
    • This setup is now complete, enabling us to move forward with device integration and testing.
  • Collaborated with Kapil on Bracelet and Jetson integration
    • We worked together to connect the Bracelet with the Jetson. We were able to get the Adafruit on the network so that the Jetson will be able to communicate with it.
  • Tested Jetson-to-Website integration
    • Ran scripts on the Jetson that called the API with various test values to validate the website’s response, and observed successful updates reflecting the test inputs (a sketch of such a test script appears after this list).
  • Enhanced website with AJAX polling
    • Added AJAX polling to enable continuous updates from the API, so the website now dynamically reflects real-time changes in the data.
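
The test scripts were essentially of the following shape: push a handful of fake detections at the API and confirm each one is accepted while watching the page update. The URL and payload fields below are placeholders, not our deployed values.

```python
import time

import requests

API_URL = "http://example.com/api/update/"  # placeholder, not our deployed URL

# A handful of fake detections to push through the pipeline.
TEST_VALUES = [
    {"emotion": "happiness", "confidence": 0.92},
    {"emotion": "anger", "confidence": 0.40},
    {"emotion": "neutral", "confidence": 0.75},
]

def push_test_values():
    """POST each test value and confirm the API accepts it."""
    for payload in TEST_VALUES:
        response = requests.post(API_URL, json=payload, timeout=5)
        response.raise_for_status()
        print(f"sent {payload} -> HTTP {response.status_code}")
        time.sleep(2)  # give the AJAX-polled page time to visibly update

if __name__ == "__main__":
    push_test_values()
```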

Overall, we are close to being on schedule, though there is still considerable work to do this week to stay on track for the upcoming demo.

Goals for Next Week:

  • Assist Noah with running the model on the Jetson
    • Work together to optimize the model setup and ensure it operates effectively on the Jetson platform.
  • Finalize the Jetson-Bracelet integration
    • Complete the necessary configurations and testing to enable seamless communication between the Jetson and the Bracelet.
  • Design and initiate latency tests for the system
    • Develop latency tests to measure and optimize response times across the integrated system, ensuring it meets performance requirements for the demo.

Mason’s Status Report for 11/2

Accomplishments: This week, I completed deployment using Gunicorn and Nginx, which is now set up to handle concurrent requests effectively and provides a stable environment for our application. The website is ready to be viewed on the iPad and used in the system. Additionally, I designed the REST API that will connect to the model, facilitating data transmission from the Nvidia Jetson to the website. Calls to this API are made using the Python Requests package, and the endpoint is integrated into the Django web app. Through AJAX, the page can update the display in real time. Note that while the API is designed and integrated into the web app, it hasn’t been tested yet.
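
For a sense of the receiving side, a minimal Django endpoint for this kind of API could look like the sketch below; the view names, fields, and in-memory storage are illustrative assumptions, not the actual EmotiSense code.

```python
# views.py -- illustrative only; names, fields, and storage are assumptions
import json

from django.http import JsonResponse
from django.views.decorators.csrf import csrf_exempt
from django.views.decorators.http import require_POST

# Kept in memory for simplicity; under Gunicorn with multiple workers the real
# app would need to persist this in the database instead.
latest_reading = {"emotion": "neutral", "confidence": 0.0}

@csrf_exempt          # the Jetson posts without a browser session
@require_POST
def update_emotion(request):
    """Endpoint the Jetson POSTs to with the current emotion and confidence."""
    data = json.loads(request.body)
    latest_reading["emotion"] = data["emotion"]
    latest_reading["confidence"] = float(data["confidence"])
    return JsonResponse({"status": "ok"})

def current_emotion(request):
    """Endpoint the AJAX polling loop GETs to refresh the iPad display."""
    return JsonResponse(latest_reading)
```

The AJAX loop on the iPad would then poll the second endpoint every few hundred milliseconds to refresh the display.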

Next Steps: We are on schedule; my goal next week is to integrate and run both the API and the facial recognition model directly on the Jetson. This milestone will be essential to finalize the whole system’s integration and confirm that all components work cohesively to deliver consistent, real-time data flow and processing. We will also begin testing the latency of the API to ensure a fluid user experience and consistent timing between the iPad and bracelet. From there, I will move on to adding a watch/timer display to the interface to show time elapsed, and a bar to show confidence. These UI enhancements will aid the usability of the display.

Mason’s Status Report for 10/26

1. What did you personally accomplish this week on the project?

This week, I made substantial progress on our application by setting up an EC2 instance and deploying the application on it. This involved handling critical tasks like configuring the server, running application migrations, and integrating various components to ensure the system functions cohesively.

I also refined the application mocks, which are now available for review in the team report, and consolidated the app to be more lightweight and efficient to run. The application is almost fully operational, and I am close to making it accessible via a URL, enabling team members and future users to access it directly.

2. Is your progress on schedule or behind?

My progress is on schedule. Deploying the application to EC2 and reaching this stage without major issues is a significant milestone, and I am pleased with the current pace.

3. What deliverables do you hope to complete in the next week?

Next week, my primary focus will be A. getting the application fully accessible via web domain, and B. setting up the API call from the NVIDIA Jetson to the server, which will control the server’s output based on input from the Jetson device. Additionally, I plan to establish the Jetson’s TCP connection and enhance the web interface by incorporating features like bar charts to display confidence levels and a stopwatch to track time elapsed per detected emotion.


Mason’s Status Report for 10/20

This week, I focused on significant front-end enhancements for the EmotiSense web app. These updates improve user interaction and provide better feedback on emotion recognition results. Specific updates are:

  • Added individual views for each of the six emotions (happiness, sadness, surprise, anger, fear, and neutral), allowing users to see a dynamic visual representation for each detected emotion.
  • Implemented a display for the confidence score alongside the emotion, which helps users gauge the accuracy of the detection.
  • Created logic to track and display the time elapsed since the last emotion transition, offering insight into how quickly emotions change during a session.
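
The elapsed-time bookkeeping is conceptually simple; a minimal Python sketch of the idea follows (in the actual app this tracking could just as easily live in the browser-side script).

```python
import time

class EmotionTimer:
    """Tracks how long the currently displayed emotion has been active."""

    def __init__(self):
        self.current_emotion = None
        self.transition_time = None

    def observe(self, emotion):
        """Record a detection; the clock resets only when the emotion changes."""
        if emotion != self.current_emotion:
            self.current_emotion = emotion
            self.transition_time = time.monotonic()

    def elapsed_seconds(self):
        """Seconds since the last emotion transition (0 if nothing observed yet)."""
        if self.transition_time is None:
            return 0.0
        return time.monotonic() - self.transition_time
```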

Additionally, I spent time researching how to make API calls directly from the Nvidia Jetson Xavier AGX. This research focused on how the Jetson communicates with cloud services and how it handles real-time emotion data processing efficiently. Key learnings included optimizing the use of TCP/IP protocols and managing data transmission with minimal latency.

Is your progress on schedule or behind?

My progress is on schedule. The front-end enhancements have been successfully implemented, and I have made significant progress in understanding how to integrate the Nvidia Jetson for real-time data transmission.

What deliverables do you hope to complete in the next week?

  • Finalize API integration design to enable real-time data processing between the Nvidia Jetson Xavier AGX and the cloud-based web app.
  • Conduct latency testing to ensure the system can handle real-time emotion data efficiently.
  • Further enhance the user interface with responsive design elements and user feedback on system performance.

Mason’s Status Report for 09/28

I spent this week working on the development and construction of our web application. This involved migrating an existing application I had built in a previous class. I spent a lot of my time figuring out how to transform the application from my old version and getting it to work again. Here are some of the specific items I finished:

  • Coordinated the database migration with SQLite
    • Ran into errors associated with the database coordination and the creation of new model structures
    • Removed old models and forms from the application and worked on the new post form
      • The post form will be used in TCP communication with the Jetson (a sketch of the model and form appears after this list)
  • Included AJAX in the application
  • Added to the design presentation
    • Diagrammed the website and AJAX coordination with the Jetson
    • Outlined user testing for the website experience
  • Gave input on model design considerations
  • Followed up on the two team meetings I missed while sick
  • Looked into TCP communication with the Jetson
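
For reference, the post form and its backing model amount to something like the sketch below; the class and field names are placeholders rather than the final schema.

```python
# models.py / forms.py -- illustrative sketch; field names are placeholders
from django import forms
from django.db import models

class EmotionPost(models.Model):
    """One reading posted by the Jetson."""
    emotion = models.CharField(max_length=32)
    confidence = models.FloatField()
    created_at = models.DateTimeField(auto_now_add=True)

class EmotionPostForm(forms.ModelForm):
    """Validates incoming readings before they are saved."""
    class Meta:
        model = EmotionPost
        fields = ["emotion", "confidence"]
```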

Goals for next week:

  • Complete POST request model and AJAX coordination
  • Deploy website on AWS and purchase a domain
  • Begin prototyping UI and get feedback on UI from teammates

Mason’s Status Report for 9/21

The team’s focus this week was primarily delivering the proposal presentation. We also made significant decisions about deadlines and goals for the coming weeks. Here’s some of what I worked on:

  • Making the proposal presentation and preparing the delivery
    • Researched industry standards for latency and speed with REST APIs
    • Researched wired communication speeds
    • Identified model accuracy guidance
  • Met up with and brainstormed together with teammates
  • Built on TA and instructor feedback
  • Gave detailed feedback on other presentations
  • Researched AJAX implementations

Upcoming goals for the week:

  • Finalize and order any hardware needed
  • Work on website, get early version running locally
  • Get more TA and professor feedback during the week
  • Identify our wired communication protocol from camera to Jetson