Team Status Report for 2/27/2021

For this week, our team presented our project proposal on Wednesday. We also adjusted the deadlines and pre-existing tasks on our Gantt chart and added some new tasks and subtasks. We created a project repo for Whiteboard Pal on GitHub, and we now have some scaffold code for our dataflow, i.e., how we will collect hand-gesture and tracking data from webcam frames using OpenCV. Our members also did more research into the two main areas of our requirements, hand tracking and gesture detection, and looked at pre-existing literature and code surrounding those tasks.
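As a rough illustration of what the dataflow scaffold covers, here is a minimal capture loop in Python (our final implementation language may differ) that pulls webcam frames with OpenCV and hands each one to placeholder gesture-detection and finger-tracking stages; the stage function names are hypothetical, not the ones in our repo.

import cv2

def detect_gesture(frame):
    # Placeholder: will eventually report whether the "draw" gesture is present.
    return False

def track_finger(frame):
    # Placeholder: will eventually return the (x, y) pixel position of the pointer finger.
    return None

def main():
    cap = cv2.VideoCapture(0)  # default webcam
    if not cap.isOpened():
        raise RuntimeError("could not open webcam")
    try:
        while True:
            ok, frame = cap.read()  # one BGR frame as a numpy array
            if not ok:
                break
            is_drawing = detect_gesture(frame)
            finger = track_finger(frame)
            # Downstream, (finger, is_drawing) becomes one sample of the drawing feed.
            cv2.imshow("Whiteboard Pal debug view", frame)
            if cv2.waitKey(1) & 0xFF == ord("q"):  # press 'q' to quit
                break
    finally:
        cap.release()
        cv2.destroyAllWindows()

if __name__ == "__main__":
    main()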

Jenny’s Status Report for 2/27/2021

For this week, I worked with my group to adjust and expand our Gantt chart: I moved some projected deadlines and added more tasks and subtasks. I also researched how OpenCV and hand detection work while playing around with some sample code to test different OpenCV features, and I wrote our team’s status report for the week. For next week, I intend to start implementing Whiteboard Pal’s detection of hands vs. no hands in a frame, and then of an open hand vs. a pinching gesture.
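As one example of the kind of experimenting this involves, below is a hedged sketch of a very naive hands-vs-no-hands check in Python using skin-color thresholding and contour area in OpenCV; the HSV bounds and the area cutoff are illustrative guesses rather than tuned values, and the real detector may end up being a learned model instead.

import cv2
import numpy as np

# Rough HSV skin-color range and blob-size cutoff; both are illustrative guesses.
SKIN_LOWER = np.array([0, 40, 60], dtype=np.uint8)
SKIN_UPPER = np.array([25, 180, 255], dtype=np.uint8)
MIN_HAND_AREA = 10000  # pixels

def hand_present(frame_bgr):
    """Naive check: is there a large skin-colored blob anywhere in the frame?"""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, SKIN_LOWER, SKIN_UPPER)
    # Remove speckle noise before looking for large connected regions.
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    return any(cv2.contourArea(c) > MIN_HAND_AREA for c in contours)

Distinguishing an open hand from a pinch will need something finer-grained than a single blob check, which is part of what I will be looking into next week.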

Zacchaeus’ Status Report for 2/20/2021

This week was spent refining our ideas for our project. We spent the week making the presentation and fleshing out design decisions, making plans for how the entire execution will go, and putting together a schedule that we will follow and update as things progress. We also assigned tasks to everyone, so I can now start reading up on object tracking. With our ideas fleshed out, we are preparing for the presentation we have next week.

Sebastien’s Status Report for 2/20/2021

This week the team spent a good deal of time going back and forth about narrowing down a particular idea for our project after receiving some feedback from Professor Kim. We weighed what could be usable against technical feasibility and decided to build an “air draw” whiteboard in the form of a “server” that consumes camera frames and produces a “drawing feed”: roughly a sequence of (x, y, isDrawing) tuples sent over an inter-process-communication channel. A generic module like this allows many applications to use it. We want to build an actual UI with it, but we also want to focus on the whiteboard server itself rather than on building a UI. Luckily, it’s possible to open a UNIX-domain socket in a browser via the WebAssembly System Interface (WASI), so right now I’m fairly sure we’re going to accomplish this by forking excalidraw and adding a WebAssembly stub that reads the “drawing feed” and calls the relevant JavaScript functions to draw on the canvas. I also set up a team Notion as a place to stay organized; it’s easier to use than Google Docs and has task management built in, including Kanban boards that are viewable as a Gantt chart.
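To make the “drawing feed” idea concrete, here is a hedged sketch of the producer side in Python (chosen for brevity; the real server’s language, socket path, and wire format are all still undecided), streaming fixed-size (x, y, isDrawing) records over a UNIX-domain socket.

import os
import socket
import struct

SOCKET_PATH = "/tmp/whiteboard_pal.sock"  # placeholder path, not finalized
RECORD = "=iiB"  # x: int32, y: int32, isDrawing: uint8 -- placeholder wire format

def serve_drawing_feed(samples):
    """Accept one consumer and stream (x, y, is_drawing) samples to it."""
    if os.path.exists(SOCKET_PATH):
        os.unlink(SOCKET_PATH)  # clean up a stale socket from a previous run
    server = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
    server.bind(SOCKET_PATH)
    server.listen(1)
    conn, _ = server.accept()
    try:
        for x, y, is_drawing in samples:
            conn.sendall(struct.pack(RECORD, x, y, 1 if is_drawing else 0))
    finally:
        conn.close()
        server.close()

The consumer (for example, the excalidraw stub) would read the same fixed-size records off the socket and translate each one into the appropriate canvas call.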

Team Status Report for 2/20/2021

For this week, our team finalized our project idea and expanded upon our plans. We had started the week with several iterations of different ideas but decided on the hand-tracking whiteboard idea. We met and spoke with our project TA, Uzair, and our instructors, Professor Kim and Professor Mukherjee, who gave us advice on both the project itself and the proposal.

For our project idea, we were advised to really narrow down our problem statement so that the project does one thing really well. After brainstorming, we began and finished the project proposal slides. We were given feedback that we should cut down on the words and include more images and diagrams, which we went back and addressed.

Jenny’s Status Report for 2/20/2021

For this week, I worked on finalizing our project idea and the proposal slides with my team. We oscillated between several iterations of the idea before settling on a final one. I created the sketches and diagrams for the slides and participated in work sessions with my teammates to write the content on each slide. In addition, I set plans and a timeline for what the team should accomplish this week regarding the proposal and blog posts, and I wrote the team status report for WordPress.

Introduction Post and Summary

Problem Statement

There are many times when students and instructors need to use the Zoom Whiteboard feature but do not have a tablet, iPad, or touchscreen to draw with easily. Using the mouse is both clunky and slow, turning people off from this useful feature of remote meetings.

Our Solution/Implementation

Our Whiteboard Pal uses machine learning to identify gestures and computer vision to track the user’s hand and fingers, allowing users to draw accurately on the screen.

We will use OpenCV, a computer vision library, to track the user’s pointer finger on camera and move the mouse cursor to match as the user moves their hand around.

To determine whether the mouse button should be pressed or held down, we give the user two options: performing a gesture that indicates it, or pressing and holding a keyboard key while moving their hand.
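As a hedged illustration of how these two pieces might fit together, here is a small Python sketch that maps a tracked fingertip position from camera coordinates to screen coordinates and presses or releases the mouse button based on either signal; pyautogui is just one library that can move the cursor, not a committed choice, and the gesture and key inputs are assumed to come from elsewhere.

import pyautogui  # one possible cursor-control library; not a final choice

def update_cursor(finger_xy, frame_size, gesture_active, key_held, was_drawing):
    """Move the cursor to match the fingertip and toggle the mouse button."""
    screen_w, screen_h = pyautogui.size()
    frame_w, frame_h = frame_size
    # Scale camera coordinates to screen coordinates; mirror x so the motion feels natural.
    x = int((1 - finger_xy[0] / frame_w) * screen_w)
    y = int((finger_xy[1] / frame_h) * screen_h)
    pyautogui.moveTo(x, y)

    # "Drawing" if either the gesture or the held key says so.
    is_drawing = gesture_active or key_held
    if is_drawing and not was_drawing:
        pyautogui.mouseDown()
    elif not is_drawing and was_drawing:
        pyautogui.mouseUp()
    return is_drawing  # caller keeps this as the next call's was_drawing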

ECE Areas

Software Systems + Signal Processing