Our final video for CVStudio is below. Thank you so much to Marios and Emily for a great semester despite the circumstances!
Chris’ Status Report 3/21
Progress Made
Over Spring break, I did not do any work on the project. After returning to Pittsburgh, most of my time went toward getting my life in order, so I made little progress beyond organizing the data we collected before break.
Work for this week
I hope to get the LSTM classifier working with reasonable accuracy on the Xavier Board by the end of the week. I will use OpenCV to do hand detection and then run the classifier on the cropped hand images for these gestures; a rough sketch of that pipeline is below.
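As a concrete starting point, here is a minimal sketch of what that pipeline could look like. Using PyTorch for the LSTM is an assumption on my part, the hand detector is abstracted into a `box` argument, and the crop size, hidden size, and number of gestures are placeholder values rather than our final design.

```python
# Hypothetical sketch: OpenCV hand crop -> flattened features -> LSTM classifier.
import cv2
import torch
import torch.nn as nn

class GestureLSTM(nn.Module):
    """Classify a sequence of per-frame hand feature vectors as one gesture."""
    def __init__(self, feat_dim=32 * 32, hidden=64, n_gestures=5):  # placeholder sizes
        super().__init__()
        self.lstm = nn.LSTM(feat_dim, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_gestures)

    def forward(self, x):             # x: (batch, frames, feat_dim)
        out, _ = self.lstm(x)
        return self.head(out[:, -1])  # classify from the last time step

def hand_feature(frame, box):
    """Crop a detected hand (box = x, y, w, h) and flatten it into a vector."""
    x, y, w, h = box
    crop = cv2.cvtColor(frame[y:y + h, x:x + w], cv2.COLOR_BGR2GRAY)
    crop = cv2.resize(crop, (32, 32))
    return torch.from_numpy(crop).float().flatten() / 255.0
```

Per-frame feature vectors would be stacked into a (1, frames, 1024) tensor before calling the model; training this on our ~50 samples per gesture is the part that still needs to happen.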
Schedule
Currently, I am about two weeks behind schedule. I hope to make this up since I am spending most of my time at home, and we also have a decent amount of slack built into our schedule.
Plan moving forward
I have uploaded my one-pager explaining what I plan to work on for the rest of the semester.
Team Status Report 3/21
Progress
We have collected all of the data we should need for training and classification of both the music-creation gestures and the editing gestures. We collected about 50 samples of each gesture, performed by multiple people.
Work for this week
We will work on our individual sections during class time, with a Zoom meeting running as our “face to face” time. Since travel plans for returning are currently uncertain for both Chris and Mark, we will try to do as much as we can in Pittsburgh. Chris will hand off the Xavier Board to Tony soon.
Chris and Tony will work on their respective classifiers and try to get them running on the Xavier Board with reasonable accuracy by the end of the week.
Mark will continue working on the glove and the Raspberry Pi.
Schedule
We are still behind schedule, both in figuring out how to coordinate our work and in building the classification for the gestures. We still have a decent amount of slack for the rest of the semester (hopefully integration will not take too long) that we can use to make up the lost time. However, it will be difficult to work as quickly as when we were all together, so we may lose a bit of time there as well.
Chris’ Status Report for 2/29
Progress:
I have set up OpenPose on the Xavier Board, and it seems to run at around 17 fps. I am still working on the action recognition algorithm. I also tested our current clap and stomp algorithms and worked on refining the thresholds we are using. Finally, I spent a lot of time on the design document, which took a bit longer than I had hoped.
Schedule:
I believe I have fallen about a week behind the original schedule: other classes took up more time than I anticipated this week, and the design document took a little more time than we had originally planned for in the schedule. I should be able to make this up during the first planned slack time at the end of Spring break, when I won’t have any other work.
Team Report for 2/22
Chris/Tony
- Created simple classifier for stomps and claps
- Working on refining the classifier: removing false negatives and improving the timing (a rough sketch of one way to debounce detections is after this list)
For next week: create an action classifier using our research to do start and stop recognition
Most promising so far: https://github.com/felixchenfy/Realtime-Action-Recognition
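To make the timing idea concrete, here is a hypothetical sketch of debouncing raw per-frame detections into discrete clap/stomp events; the frame counts are made-up placeholders, not measured values.

```python
# Hypothetical sketch: debounce noisy per-frame detections into discrete events.
def debounce(per_frame_hits, min_consecutive=3, refractory=15):
    """Convert a stream of per-frame booleans into event frame indices.

    A clap/stomp only fires after `min_consecutive` positive frames in a
    row, and the detector is then muted for `refractory` frames so a single
    gesture is not reported twice. Both counts are made-up placeholders.
    """
    events, run, mute = [], 0, 0
    for i, hit in enumerate(per_frame_hits):
        if mute > 0:
            mute -= 1
            continue
        run = run + 1 if hit else 0
        if run >= min_consecutive:
            events.append(i)
            run, mute = 0, refractory
    return events

# e.g. debounce([False, True, True, True, False]) -> [3]
```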
Mark
Finalized the glove design this week. Created a system diagram and a bill of materials to send in for ordering. We went through a couple of iterations during the design phase, which is why the parts order has not been placed yet.
All:
Currently on schedule; however, we may have to use more of our slack to do additional work on the action classifier.
Chris’ Status Report for 2/22
This week I implemented a way to detect stomps from our OpenPose-processed video of just a stomp.
However, when I processed a video of Tony just flailing his arms (which should produce no detection), the detector still reported a stomp whenever his legs moved at all, even though it was clearly not a stomp. I am working to fix this by bringing more parts of the body into the stomp classification; a sketch of that idea is below.
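As an illustration of what “more parts of the body” might mean, here is a hypothetical per-frame check that requires a fast downward ankle while the hips stay roughly still, so that pure arm motion cannot fire. The keypoint indices follow OpenPose’s BODY_25 layout, but the thresholds are invented and would need tuning on our recorded videos.

```python
# Hypothetical sketch: use ankles *and* hips so arm motion alone cannot fire.
import numpy as np

MID_HIP, R_ANKLE, L_ANKLE = 8, 11, 14  # OpenPose BODY_25 keypoint indices

def looks_like_stomp(prev, curr, ankle_thresh=0.04, hip_thresh=0.02):
    """Check one frame-to-frame pose change for a stomp.

    `prev` and `curr` are (25, 2) arrays of normalized (x, y) keypoints from
    consecutive frames. We require a fast downward ankle (image y grows
    downward) while the mid-hip stays roughly still; thresholds are invented.
    """
    d = curr - prev
    ankle_drop = max(d[R_ANKLE, 1], d[L_ANKLE, 1])
    hip_motion = float(np.linalg.norm(d[MID_HIP]))
    return ankle_drop > ankle_thresh and hip_motion < hip_thresh
```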
This week, we were given AWS credit that we want to use for video processing, so that we can process longer videos and test on them without tying up our own machines.
I got it partially working, but there were some compatibility issues with our EC2 instance, so I am continuing to work on that.
Chris’ Status Update 2/15
Progress
I was able to get OpenPose running on the computer, and I started working on gesture recognition using pose data from a video processed with OpenPose. I’m devising a way to use the positions of the hands to recognize a large clap; a rough sketch of the idea is below.
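To show the kind of distance-based rule I have in mind, here is a minimal sketch that flags a clap when the wrists come together after being wide apart. The BODY_25 wrist indices are from OpenPose, but both thresholds are placeholders I would still need to tune against our processed videos.

```python
# Hypothetical sketch: flag a clap when the wrists snap together from far apart.
import numpy as np

R_WRIST, L_WRIST = 4, 7  # OpenPose BODY_25 keypoint indices

def detect_claps(poses, touch_thresh=0.05, apart_thresh=0.4):
    """Return frame indices where a large clap appears to happen.

    `poses` is a (num_frames, 25, 2) array of normalized keypoints. A clap
    is flagged on the frame where the wrist distance first drops below
    `touch_thresh` after having exceeded `apart_thresh`; both thresholds
    are placeholders to tune on our processed videos.
    """
    dists = np.linalg.norm(poses[:, R_WRIST] - poses[:, L_WRIST], axis=1)
    claps, armed = [], False
    for i, d in enumerate(dists):
        if d > apart_thresh:
            armed = True             # hands were wide apart
        elif armed and d < touch_thresh:
            claps.append(i)          # hands just came together
            armed = False
    return claps
```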
I’m also looking into different action recognition “libraries” (other GitHub projects) to do action recognition on a play/pause wave.
I’ve also been able to get the Kinect working with OpenPose using libfreenect2.
Deliverables next week
Hopefully by the end of this week I’ll have some basic gesture recognition working using keypoint distances and frame numbers. I want to be able to recognize claps and stomps. This will still be on pre-processed video; once we get the Xavier Board, we will be able to see whether these work in real time, even with high latency.
Schedule
I think I’m still on schedule, although using machine learning classification to recognize claps and stomps will likely be a little more challenging than I initially thought.