Team Status Report for 02/19/2022

We ran more trials with the Emotiv Insight and discovered that tongue movement shows up very distinctly in the EEG output. Our EmotivPRO Student license and developer license were approved by ECE and Emotiv, giving us access to more data and letting us explore the Emotiv applications and their capabilities further. We also finalized how user intentions and actions captured by EEG and EMG will be processed in our pipeline and transmitted to the user interface. Signal processing will be done in Python, since Emotiv streams data directly through its API; this removes the need for the third-party signal-processing application we had originally planned to use. Our current design uses machine learning to classify winking, tongue movement, and blinking from EEG sensing, and hard-coded thresholds to classify left and right shoulder movement from the EMG electrodes.

On the software side, we decided to use Flutter for app development and wireless sockets to connect the front-end and back-end applications. We also designed a preliminary user-interface layout.
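As a rough illustration of the hard-coded-threshold approach for shoulder movement, here is a minimal Python sketch. The threshold value, channel inputs, and function name are placeholders for illustration, not finalized values from our trials:

```python
# Hypothetical sketch of threshold-based EMG classification.
# EMG_THRESHOLD is a placeholder, not a value measured from our electrodes.
EMG_THRESHOLD = 40.0  # amplitude units are illustrative

def classify_shoulder(left_emg: float, right_emg: float) -> str:
    """Map a pair of EMG amplitudes to a shoulder-movement label."""
    if left_emg > EMG_THRESHOLD and left_emg >= right_emg:
        return "left_shoulder"
    if right_emg > EMG_THRESHOLD:
        return "right_shoulder"
    return "rest"
```

In the actual pipeline, the classifier's output label would be sent over the socket connection to the Flutter front end.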
