Tony’s Status Report for 3/14

Progress

Prior to leaving for spring break, I took training samples with Chris and Mark. We took at least 50 samples of each gesture, along with some negative look-alike examples to test the robustness of our algorithms.

Deliverables next week

Chris will give me the Xavier sometime before he leaves Pittsburgh, and I want to be able to run the Kinect on the Xavier. We have also split up our responsibilities for the remainder of the semester; my task is to develop the music creation gestures (stomp, clap, hit). I want algorithms for those that achieve at least 70% accuracy on our training examples.
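Checking the 70% target is just a matter of comparing predictions against the labeled training samples; a minimal sketch of that check (the function and gesture names here are placeholders, not our actual code):

```python
def accuracy(predictions, labels):
    """Fraction of predicted gesture labels that match the ground truth."""
    if not labels:
        return 0.0
    correct = sum(p == t for p, t in zip(predictions, labels))
    return correct / len(labels)

# Illustrative only: 4 of these 5 predictions match the labels.
preds = ["clap", "stomp", "clap", "hit", "clap"]
truth = ["clap", "stomp", "hit", "hit", "clap"]
print(accuracy(preds, truth))  # 0.8
```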

Schedule

I am still behind schedule, since I need to start the algorithms for the music creation gestures.

Mark’s Status Update for 2/29

Progress: this week I finalized my bill of materials for the glove component of our project and placed the first couple of purchase orders.  I also started writing the code that the RPi will run to process button inputs.

My progress is on track so far.  I plan to help my other team members with algorithm development if I have some extra time.

This week I would like to finish the code that reads GPIO inputs from the buttons and place the remaining purchase orders for the glove (the glove itself and the buttons).

Tony’s Status Report 2/29

Progress

I got the Kinect to work! Using libfreenect2, I was able to get a video stream working. However, libfreenect2 does not have joint detection features, so we will still have to use OpenPose or another library for that.

Deliverables next week

By next week I want to figure out how to use the Kinect with the Xavier and do joint detection with either OpenPose or another library.

Schedule

I am falling further behind schedule. Figuring out how to use all the hardware is taking a while, which is preventing us from developing our action recognition algorithms.

Chris’ Status Report for 2/29

Progress:

I have been able to set up OpenPose on the Xavier board, and it seems to run at around 17 fps. I am still working on the action recognition algorithm. I also worked on testing our current clap and stomp algorithms and refining the thresholds we are using. Furthermore, I spent a lot of time on the design document, which took longer than I had hoped.

Schedule:

I believe that I have fallen about a week behind the original schedule, with other classes taking up more time than I anticipated this week and the design document taking a little longer than we had originally planned. I should be able to make this up using the first planned slack time at the end of spring break, when I won’t have any other work.

Tony’s Status Report for 2/22

Progress

I got a simple thresholding algorithm working to classify claps. I also looked at OpenNI, a potential library we can use to interface with the Kinect on macOS (the default library is Windows-only, and other libraries might require us to buy an Azure Kinect).
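As a rough illustration of the kind of thresholding I mean, a frame can be flagged as a clap when the two wrist joints come close together. This is a sketch only; the joint format and threshold value are invented for illustration, not our tuned parameters:

```python
def is_clap(left_wrist, right_wrist, threshold=0.15):
    """Classify a frame as a clap when the wrists are close together.

    Joints are (x, y) tuples in normalized image coordinates; the
    threshold here is illustrative, not our actual tuned value.
    """
    dx = left_wrist[0] - right_wrist[0]
    dy = left_wrist[1] - right_wrist[1]
    dist = (dx * dx + dy * dy) ** 0.5
    return dist < threshold
```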

Deliverables next week

By next week I want to have the Kinect working.

Schedule

The Kinect turned out to be a nontrivial part of our project, so I’m going to allocate a week to figuring out how to use it. In terms of algorithm development, I am a little behind since I have to study LSTMs in more detail.

Mark’s status report 2/22

Progress

This week I finalized the design for the glove component of our project.  Below is the system-level diagram that I created.

The design phase of my portion took longer than expected because I was extra careful to ensure that the system would work.  Since the buttons connect the GPIO pins directly to ground without a resistor, they will essentially be active low.  The code I write to register presses will just have to account for that fact.
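In code, accounting for active low just means inverting the raw pin reading: with a pull-up holding the pin high, a grounded (0) reading is a press. A sketch of what I have in mind, assuming the Pi's internal pull-ups and the RPi.GPIO library; the pin numbers are placeholders, not our final wiring:

```python
# Sketch of the active-low button logic (BCM pin numbers are placeholders).
# With no external resistor, a pull-up holds each pin HIGH (1) until the
# button shorts it to ground, so pressed == LOW (0).

BUTTON_PINS = [17, 27, 22]  # placeholder pins

def pressed(raw_level):
    """Invert the raw GPIO level: 0 (grounded) means the button is pressed."""
    return raw_level == 0

# On the Pi itself this logic would be driven by RPi.GPIO, roughly:
#   import RPi.GPIO as GPIO
#   GPIO.setmode(GPIO.BCM)
#   for pin in BUTTON_PINS:
#       GPIO.setup(pin, GPIO.IN, pull_up_down=GPIO.PUD_UP)
#   states = {pin: pressed(GPIO.input(pin)) for pin in BUTTON_PINS}
```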

Schedule/Deliverables

I have made some adjustments to my overall schedule.  Now that my design is done, I will be compiling a formal bill of materials and placing an order this week.  I have already looked ahead at lead times for some of the more important components, and parts should not take longer than one week to arrive.

 

Team Report For 2/22

Chris/Tony

  • Created simple classifier for stomps and claps
  • Working on refining the classifiers, removing false positives, and improving the timing

For next week: create an action classifier using our research to do start and stop recognition

Most promising so far: https://github.com/felixchenfy/Realtime-Action-Recognition

Mark

Finalized the glove design this week.  Created a system diagram and a bill of materials to send for ordering.  Went through a couple of iterations during the design phase, which is why the parts order has not been placed yet.

All:

Currently on schedule; however, we may have to use more of our slack time on the action classifier.

Chris’ Status Report for 2/22

This week I implemented a way to detect stomps from our OpenPose-processed video of a stomp.

However, I also processed a video of Tony just flailing his arms (which should trigger no detection), and the classifier still detected a stomp whenever his legs moved at all, even when it was clearly not a stomp. I am working to fix this by incorporating more parts of the body into the classification of a stomp.
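One way to fold more of the body into the stomp check is to veto frames where the arms are moving a lot, so flailing arms with incidental leg motion no longer trigger it. A hedged sketch of that idea; the joint-displacement inputs and thresholds are invented for illustration, not our actual classifier:

```python
def is_stomp(ankle_dy, wrist_dy, ankle_thresh=0.08, arm_veto=0.10):
    """Classify a frame window as a stomp.

    ankle_dy and wrist_dy are per-frame vertical displacements of the
    ankle and wrist joints (normalized coordinates); the thresholds are
    illustrative only. A stomp requires a sharp ankle drop AND
    relatively still arms, which vetoes the arm-flailing false positive.
    """
    return ankle_dy > ankle_thresh and wrist_dy < arm_veto
```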

This week we were given AWS credit that we want to use for video processing, so we can process longer videos and test on them without having to process on our own machines.

I got it partially working, but there were some compatibility issues with our EC2 instance, so I am continuing to work on that.

Introduction and Project Summary

CV Studio attempts to create a user-friendly music development platform for gesture-based music creation.  Using an Xbox Kinect camera system along with a buttoned glove with additional features, this project intends to allow users to generate instrumental sounds with basic movements such as a clap or a stomp.  Our project will be connected to existing music production software that allows users to save short segments of music and tune their work to their liking.  One central goal of the project is to capture five different gestures with at least 90% accuracy.

Mark’s Status Update for 2/15

Progress

Last week I began designing the glove component of our project.  I finalized a high-level system diagram for the component and made sure that the hardware components are sufficient for the intended use case.  I also assisted team members in obtaining camera hardware and connecting the Kinect equipment to our computers, and I started an outline and slides for our design review presentation.

Schedule

I am making good progress on the glove design, but the original schedule noted that I should have ordered parts for the glove last week.  I plan to submit part orders this week and to devote more time than I originally intended to the design phase of the project.

Next Week

Next week I plan on finalizing the design of the glove component of our project and putting together slides for the design review.