Yasser’s Status Report for 04/08/23

Personally, I started integrating the computer vision code onto the Jetson and began testing it with the Jetson's camera. Bugs have arisen during integration that still need to be fixed: mouth detection fires on certain frames where the user's mouth is actually closed, and eye tracking does not always flag the user as distracted when they look up.
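
One likely culprit for the closed-mouth false positives is the mouth-openness threshold. Below is a minimal sketch of the kind of check involved, assuming we compute a mouth aspect ratio from dlib-style 68-point facial landmarks; the landmark indexing and the threshold value here are assumptions to be tuned on the Jetson, not our final implementation.

```python
# Sketch of a mouth-aspect-ratio (MAR) check for yawn/mouth-open detection.
# Assumes dlib's 68-point landmarks, where the mouth is points 48-67.
from scipy.spatial import distance as dist

def mouth_aspect_ratio(mouth):
    """mouth: list of 20 (x, y) landmark points (dlib indices 48-67)."""
    # Vertical distances between inner-lip landmark pairs.
    a = dist.euclidean(mouth[13], mouth[19])  # points 61, 67
    b = dist.euclidean(mouth[14], mouth[18])  # points 62, 66
    c = dist.euclidean(mouth[15], mouth[17])  # points 63, 65
    # Horizontal distance between the inner mouth corners.
    d = dist.euclidean(mouth[12], mouth[16])  # points 60, 64
    return (a + b + c) / (3.0 * d)

# Assumed threshold: needs tuning so closed-mouth frames stay below it.
MAR_YAWN_THRESHOLD = 0.6

def is_mouth_open(mouth):
    return mouth_aspect_ratio(mouth) > MAR_YAWN_THRESHOLD
```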

According to the Gantt chart, I am not behind on progress.

In the next week, we hope to have the pose estimation classification done and working, as well as the computer vision integration bugs fixed, by our next team meeting on Wednesday. After Wednesday, we hope to begin testing our device in a vehicle.

Tests that I plan on running: for eye tracking, have the user look left, right, up, and down and verify that the pupil location is tracked correctly. For mouth detection, have the user yawn for at least 5 seconds and check that the system correctly classifies this as a sign of drowsiness. For eye detection, have the user close their eyelids for more than 2 seconds and check that the system correctly classifies this as drowsiness/distraction. For head pose, have the user turn their head far to the left, right, up, and down for more than 2 seconds and check that the system correctly classifies this as the driver being distracted. All of these tests will be done in a vehicle. A sketch of the time-threshold logic these tests exercise is below.
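
The sketch below shows the kind of time-threshold classification the tests above are checking, assuming the CV pipeline produces per-frame booleans (eyes closed, mouth open, head turned); the class, function names, and exact threshold constants are hypothetical stand-ins for illustration.

```python
# Sketch of time-threshold classification over per-frame detector outputs.
import time

EYES_CLOSED_SECONDS = 2.0   # eyelids closed longer than this -> drowsy
YAWN_SECONDS = 5.0          # yawn held at least this long -> drowsy
HEAD_TURNED_SECONDS = 2.0   # head turned away longer than this -> distracted

class ConditionTimer:
    """Tracks how long a per-frame boolean condition has been continuously true."""
    def __init__(self):
        self.start = None

    def update(self, active):
        now = time.monotonic()
        if active:
            if self.start is None:
                self.start = now
            return now - self.start
        self.start = None
        return 0.0

eye_timer, yawn_timer, head_timer = ConditionTimer(), ConditionTimer(), ConditionTimer()

def classify_frame(eyes_closed, mouth_open, head_turned):
    """Return the alerts triggered by the current frame's detector outputs."""
    alerts = []
    if eye_timer.update(eyes_closed) > EYES_CLOSED_SECONDS:
        alerts.append("drowsy: eyes closed")
    if yawn_timer.update(mouth_open) >= YAWN_SECONDS:
        alerts.append("drowsy: yawning")
    if head_timer.update(head_turned) > HEAD_TURNED_SECONDS:
        alerts.append("distracted: head turned")
    return alerts
```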
