This week I worked with Rohan on building the data labeling platform for Professor Dueck and on designing the system for collecting and filtering the data. The Python program is designed for Professor Dueck to annotate students’ focus states as ‘Focused,’ ‘Distracted,’ or ‘Neutral’ during music practice sessions. The platform records each label alongside timestamps in both epoch and human-readable formats, so the labels can be aligned with EEG headset data and analyzed consistently across sessions. We also outlined the framework for integrating this labeled data with our machine learning model, focusing on how EEG inputs will be processed to predict focus states. This preparation sets up our next steps: refining the model to accurately interpret EEG signals and provide meaningful insights into enhancing focus and productivity.
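To make the labeling flow concrete, below is a minimal Python sketch of how a label can be recorded with both timestamp formats. The key bindings, the session_labels.csv file name, and the CSV layout are illustrative assumptions rather than the exact implementation.

```python
# Minimal sketch of the labeling flow, assuming a simple CSV output;
# key bindings and file layout are illustrative, not the real tool.
import csv
import time
from datetime import datetime

LABELS = {"f": "Focused", "d": "Distracted", "n": "Neutral"}
OUTPUT_FILE = "session_labels.csv"  # hypothetical output path

def record_label(key: str) -> None:
    """Append one label with epoch and human-readable timestamps."""
    epoch = time.time()  # epoch seconds, used to align with EEG samples
    readable = datetime.fromtimestamp(epoch).strftime("%Y-%m-%d %H:%M:%S")
    with open(OUTPUT_FILE, "a", newline="") as f:
        csv.writer(f).writerow([epoch, readable, LABELS[key]])

if __name__ == "__main__":
    print("Press f/d/n then Enter to label, or q to quit.")
    while True:
        key = input("> ").strip().lower()
        if key == "q":
            break
        if key in LABELS:
            record_label(key)
```

Storing the raw epoch value alongside the formatted string keeps alignment with the EEG stream trivial while leaving the file readable during review.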
Additionally, I worked on integrating a webcam feed into our application. I developed a component named WebcamStream.js, which first tries to connect to an external camera device, if one is available, and otherwise falls back to the computer’s built-in camera. Users can now view a real-time video feed of themselves directly within the app’s interface. Below is a screenshot of the live feed in the application. I will move this component to the Calibration page this week.
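As a reference for how the external-camera preference can work, here is a short JavaScript sketch built on the browser’s MediaDevices API. The label-based heuristic for spotting an external camera is my assumption; the actual WebcamStream.js logic may differ.

```javascript
// Sketch of camera selection: prefer an external device, else the
// built-in camera. Note that enumerateDevices() only exposes labels
// after the user has granted camera permission.
async function getPreferredStream() {
  const devices = await navigator.mediaDevices.enumerateDevices();
  const cameras = devices.filter((d) => d.kind === "videoinput");

  // Heuristic (assumption): a camera whose label does not mention
  // "built-in" or "integrated" is treated as external.
  const external = cameras.find(
    (d) => d.label && !/built-in|integrated/i.test(d.label)
  );

  return navigator.mediaDevices.getUserMedia(
    external
      ? { video: { deviceId: { exact: external.deviceId } } }
      : { video: true } // default (built-in) camera
  );
}

// Usage: attach the stream to a <video> element for the live feed.
getPreferredStream().then((stream) => {
  document.querySelector("video").srcObject = stream;
});
```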
My progress is on schedule. During the next week, I plan to reimplement the webcam feed using MediaPipe so that we can extract the data we need directly within the application. I will also continue working with Rohan on the machine learning model for the EEG headset, with the goal of having a working version by the end of the week. In addition, I will continue writing code for the remaining pages of the application.
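As a rough sense of what the MediaPipe integration could look like, here is a sketch using the FaceMesh solution from MediaPipe’s JavaScript packages; the choice of FaceMesh, the CDN source, and the option values are assumptions about how we might wire it up, not settled decisions.

```javascript
// Sketch (assumed setup): run MediaPipe FaceMesh on the webcam feed
// so landmark data is extracted directly in the application.
import { FaceMesh } from "@mediapipe/face_mesh";
import { Camera } from "@mediapipe/camera_utils";

const videoElement = document.querySelector("video");

const faceMesh = new FaceMesh({
  // FaceMesh fetches its model files from this location.
  locateFile: (file) =>
    `https://cdn.jsdelivr.net/npm/@mediapipe/face_mesh/${file}`,
});
faceMesh.setOptions({ maxNumFaces: 1 });

// Each result holds normalized {x, y, z} landmarks per detected face,
// which is the in-app data we would log alongside EEG timestamps.
faceMesh.onResults((results) => {
  const landmarks = results.multiFaceLandmarks?.[0];
  if (landmarks) {
    console.log(`Received ${landmarks.length} face landmarks`);
  }
});

// Feed each webcam frame to FaceMesh.
const camera = new Camera(videoElement, {
  onFrame: async () => {
    await faceMesh.send({ image: videoElement });
  },
  width: 640,
  height: 480,
});
camera.start();
```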