Darwin Torres’ Status Report for 4/30

This week, I worked on making our taps and drags more responsive. Before working on double taps, we felt it was best to improve the performance of our single-tap interactions. Overall, things were fine for our MVP, but there were a few hiccups, mainly with taps sometimes being registered as small drags and with drags not appearing very smooth. Matt K and I came up with a solution where we ignore additional position updates that occur within a small radius around the initial touch. This makes it less likely that we detect a small drag when the user is really trying to tap, and we found it fixed our first problem. To make drags smoother, we averaged every few position updates before sending a request to the OS. This meant a slower refresh rate, but smoother interpolation of coordinates along a drag. We tinkered with how many position updates to average and found that 5-7 gave the best results. Overall, I feel I am on schedule, especially since the tasks we have remaining are in addition to our MVP. Although I wanted to have a simple double-tap test done this week, I felt the changes we made were more important to the user experience. Next week, I will work on implementing double taps so that we can hopefully showcase them during our poster session.
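To make the approach concrete, here is a minimal sketch of the two filters described above, assuming the backend receives one (x, y) position update at a time; the radius and window size are illustrative values rather than our exact tuning:

```python
import math

TAP_RADIUS = 8    # dead-zone radius in pixels (illustrative value)
AVG_WINDOW = 6    # updates averaged per report (we found 5-7 worked best)

class TouchFilter:
    """Suppresses tiny drags near the initial touch and smooths real drags."""

    def __init__(self):
        self.origin = None    # position where the finger first came down
        self.window = []      # buffered updates awaiting averaging

    def update(self, x, y):
        """Feed one raw position update. Returns a smoothed (x, y) to send
        to the OS, or None if this update should be ignored."""
        if self.origin is None:
            self.origin = (x, y)
            return (x, y)                     # always report the initial touch
        # Ignore updates within a small radius of the initial touch so a
        # slightly wobbly tap is not registered as a small drag.
        if math.dist(self.origin, (x, y)) < TAP_RADIUS:
            return None
        # Outside the dead zone: buffer updates and emit their average. This
        # lowers the refresh rate but smooths the drag's interpolated path.
        self.window.append((x, y))
        if len(self.window) < AVG_WINDOW:
            return None
        xs, ys = zip(*self.window)
        self.window.clear()
        return (sum(xs) / len(xs), sum(ys) / len(ys))

    def release(self):
        """Finger lifted; reset for the next touch."""
        self.origin = None
        self.window.clear()
```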

Team Status Report for 4/30

This week, we spent most of our time obtaining testing values for our presentation. We felt this went well, since all of our tests passed. Since we have achieved our MVP, we are not too concerned about obstacles ahead. Our focus now is to get multi-finger functionality working, beginning with two-finger zoom. Additionally, we may try to make a frame to improve the appearance of our project. All updates to our schedule were shown during the presentation.

Matthew Kuczynski’s Status Report for 4/30

Since our MVP was achieved last week, I spent the beginning of the week testing our use-case requirements and preparing for the final presentation, where I was the speaker. Later in the week, I worked with Darwin on creating a distance threshold for drags and on averaging the position across consecutive data collections while the finger is down, so that fewer update commands are sent and actions like drawing are smoother. Overall, I feel that I am on schedule since our MVP is working and we have completed testing. Next week, I plan to work on the algorithms for two-finger gestures such as zoom.

Team Status Report for 4/23

This week, we connected all edges of the frame and spent most of our time debugging the hardware and software, as well as collecting data to measure our performance against our use-case requirements. Beginning with the hardware, one issue we encountered was a misprint in the bottom PCBs. We were able to hack around this by soldering wires onto the board to restore the broken connection. Before we started testing our software with the frame, Matt S conducted some final debugging and checks to make sure there were no issues with the hardware.

Once the hardware was good to go, we attached the frame to Matt S’ laptop and began testing Matt K’s updated Arduino code, which controls the LED/photodiode pairs and sends data about which pairs are activated to the Python backend via USB serial. We encountered some bugs, but after some squashing we were able to confirm that the control signals and output data were correct.

Afterwards, once we confirmed the Arduino subsystem was well-integrated with the hardware, we began our first true test of the entire system through a Python script that used the data sent over by the Arduino to calculate coordinates and send commands to Darwin’s updated Touch Control subsystem. At first, a few bugs came about due to a small miscommunication about the updated interface, but we were able to fix them quickly. Once we were past that, we finally got our first glimpse of Touch TrackIR in action! Physical finger taps, holds, and drags on the screen were successfully translated to the expected touch inputs in the OS. We were able to click on links, close tabs, move windows, and scroll through webpages. The smoothness of the experience is acceptable, but we still feel there is room for improvement. For example, we can reduce latency by increasing the baud rate, removing print statements, and simplifying the structure of the data used in our coordinate calculations to eliminate unnecessary loops.
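For context, here is a rough sketch of the coordinate calculation, assuming the Arduino reports which beams are blocked on each axis; taking the centroid of the blocked run is a simplification of whatever interpolation the final code uses, and the display resolution is assumed:

```python
N_X, N_Y = 56, 31                # LED/photodiode pairs per axis (top, right)
SCREEN_W, SCREEN_H = 1920, 1080  # assumed display resolution

def beams_to_screen(blocked_x, blocked_y):
    """Map blocked-beam indices on each axis to a screen coordinate.

    blocked_x and blocked_y are lists of blocked pair indices. A finger
    blocks a contiguous run of beams on each axis, so its center is taken
    as the touch point; returns None when no touch is present.
    """
    if not blocked_x or not blocked_y:
        return None
    cx = sum(blocked_x) / len(blocked_x)    # centroid along each axis
    cy = sum(blocked_y) / len(blocked_y)
    # Scale beam index to pixels; +0.5 centers the point on the beam.
    return (int((cx + 0.5) / N_X * SCREEN_W),
            int((cy + 0.5) / N_Y * SCREEN_H))
```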

Overall, thanks to the progress this week, we are squarely on schedule. Next week, we plan on constructing a cover for our PCB frame and adding support for double taps, which will allow us to perform “right clicks”.

Matthew Kuczynski’s Status Report for 4/23

This week we worked on integrating our subsystems, so much of my work involved testing and debugging the Arduino and Python serial code that I had finished last week. Previously, I had not been able to test my code because the PCB wasn’t finished, so this week I was able to get it working properly. Then, I worked on gathering the testing metrics for our requirements so that they could be reported in the final presentation. Since we have our MVP working, I am definitely on track. Next week, I will work on adding additional features to our project, like multi-finger functionality, and on improving the speed of single-touch interactions.

Darwin Torres’ Status Report for 4/16

This week, with progress finally being made on our PCBs, I worked on updating our touch control interface and tests from the interim demo to be compatible with the final hardware. I also helped Matt S solder the final surface-mount components onto the remaining PCBs. The tests I developed have yet to be used with the hardware, as they depend on Matt K’s updated code, which he is also updating to match our new environment, being fully working so that we can check how physical inputs translate to touch controls on the OS. We are currently in the process of debugging our work so that we can proceed with physical testing on the final hardware, and we are aiming to have this done by Monday. As the week concludes, I am well on schedule, and overall things have been ramping up since we are finally back on track after receiving the decoders. Next week, I will work with Matt K on squashing our final bugs so that we can start testing, not only to check the robustness of the software and hardware, but also to record relevant data to compare against the requirements we set in our design review. We will start by testing physical input along one dimension, which we already have the hardware ready for; this is essentially a scaled-up version of our interim demo tests. Once the second axis is ready, we will test touch controls along both axes, a major milestone that will give us the ability to interact with any point in the 2-dimensional space of the screen.

Matthew Kuczynski’s Status Report for 4/16

This week was difficult for me because I had COVID, so I was unable to meet with my group and did not feel well from Monday through Wednesday. However, I was able to make significant progress on the software at the end of the week. My primary goal was to finish the Arduino script, which I accomplished; however, it has not yet been tested due to some issues with the PCB that existed until about an hour ago. Additionally, I wrote the new Python serial script so that the arrays of photodiode values can be communicated to the Python side. Previously, we had decided that the photodiode values should be communicated as a set of bit-encoded integers rather than a string to improve speed. Since the largest integer we can send is a 32-bit long, three were needed to cover our photodiode array (two longs holding 28 values each for the top, and one holding 31 values for the right). I also developed a protocol for knowing which value is being sent: a -1, -2, or -3 precedes each value, indicating which of the three longs it represents. Since we never use all 32 bits of our longs, these negative values can never appear as encoded photodiode values. Overall, I believe that I will be able to finish the testing in time, so I think I am on schedule as long as I put enough work in this week and there are no more major hardware issues. Next week, I will debug my Arduino script and the Python serial code, and then work on multi-finger functionality.
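For illustration, here is a sketch of the receiving side of this protocol; the newline-delimited ASCII framing and the port name are assumptions, but the marker scheme (-1, -2, or -3 preceding each bit-encoded long) follows the description above:

```python
import serial  # pyserial

# Marker -> (segment name, number of photodiode bits in the following long)
SEGMENTS = {-1: ("top_left", 28), -2: ("top_right", 28), -3: ("right", 31)}

def read_frame(port):
    """Read one full frame of photodiode states (three marker/value pairs).

    Returns a dict mapping each segment name to a list of booleans, True
    where that photodiode's beam is blocked.
    """
    frame = {}
    while len(frame) < 3:
        marker = int(port.readline())
        if marker not in SEGMENTS:
            continue                      # resync on anything unexpected
        name, nbits = SEGMENTS[marker]
        value = int(port.readline())      # bit-encoded photodiode states
        frame[name] = [bool((value >> i) & 1) for i in range(nbits)]
    return frame

# Usage (port name is hypothetical):
# port = serial.Serial("COM3", 115200)
# states = read_frame(port)
```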

Darwin Torres’ Status Report for 4/10

This past week, I helped the team prepare our interim demo. The integration of subsystems B and C has been going well so far, so Matt and I designed some visual tests to showcase in the demo. Unfortunately, our decoders have not arrived, so subsystem A remains incomplete, but we were able to perform some initial “physical” testing with a breadboarded circuit. This allowed us to test our code that sends signals to the LEDs and processes signals from the photodiodes. The tests were successful, and we were able to set up a simple demo showcasing how interacting with the LEDs results in onscreen manipulation through emulated touch inputs. It was good to know that my code had integrated well with Matt’s. There are still a few bugs and problems present, most notably with the precision of our setup, but it will be hard to measure this accurately until the PCB is completed. In terms of my software-related tasks, I am on schedule. Unfortunately, we have delays with the hardware, but we should be up to speed soon. Next week, Matt and I will work together on debugging our code. Once the decoders arrive, we will work together on completing the PCB so we can test subsystem A and get it ready for integration with our software. If I have extra time, I will start investigating multi-finger functionality within Windows.
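These reports don't spell out how the Touch Control subsystem emulates touches, but since we target Windows, the natural mechanism is the Win32 touch-injection API; below is a minimal ctypes sketch of a single injected tap, for context rather than our actual code:

```python
import ctypes
from ctypes import wintypes

# Win32 touch-injection constants (available on Windows 8 and later)
PT_TOUCH = 2
POINTER_FLAG_DOWN      = 0x00010000
POINTER_FLAG_UP        = 0x00040000
POINTER_FLAG_INRANGE   = 0x00000002
POINTER_FLAG_INCONTACT = 0x00000004
TOUCH_FEEDBACK_DEFAULT = 1

class POINTER_INFO(ctypes.Structure):
    _fields_ = [("pointerType", ctypes.c_uint32),
                ("pointerId", ctypes.c_uint32),
                ("frameId", ctypes.c_uint32),
                ("pointerFlags", ctypes.c_uint32),
                ("sourceDevice", wintypes.HANDLE),
                ("hwndTarget", wintypes.HWND),
                ("ptPixelLocation", wintypes.POINT),
                ("ptHimetricLocation", wintypes.POINT),
                ("ptPixelLocationRaw", wintypes.POINT),
                ("ptHimetricLocationRaw", wintypes.POINT),
                ("dwTime", wintypes.DWORD),
                ("historyCount", ctypes.c_uint32),
                ("InputData", ctypes.c_int32),
                ("dwKeyStates", wintypes.DWORD),
                ("PerformanceCount", ctypes.c_uint64),
                ("ButtonChangeType", ctypes.c_int32)]

class POINTER_TOUCH_INFO(ctypes.Structure):
    _fields_ = [("pointerInfo", POINTER_INFO),
                ("touchFlags", ctypes.c_uint32),
                ("touchMask", ctypes.c_uint32),
                ("rcContact", wintypes.RECT),
                ("rcContactRaw", wintypes.RECT),
                ("orientation", ctypes.c_uint32),
                ("pressure", ctypes.c_uint32)]

user32 = ctypes.windll.user32
user32.InitializeTouchInjection(1, TOUCH_FEEDBACK_DEFAULT)  # one contact max

def inject(x, y, flags):
    """Inject one touch event at pixel (x, y) with the given pointer flags."""
    info = POINTER_TOUCH_INFO()
    info.pointerInfo.pointerType = PT_TOUCH
    info.pointerInfo.pointerId = 0
    info.pointerInfo.ptPixelLocation.x = x
    info.pointerInfo.ptPixelLocation.y = y
    info.pointerInfo.pointerFlags = flags
    user32.InjectTouchInput(1, ctypes.byref(info))

# A tap is a touch-down followed by a touch-up at the same point.
inject(500, 300, POINTER_FLAG_DOWN | POINTER_FLAG_INRANGE | POINTER_FLAG_INCONTACT)
inject(500, 300, POINTER_FLAG_UP)
```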

Matthew Kuczynski’s Status Report for 4/10

This week I worked on preparing for our demo, so I worked with Darwin and finished writing some test scripts that could show that our two subsystems work together. Additionally, I worked with both Matt and Darwin to write a test that went through all three subsystems, which was demonstrated at our demo. These tests revealed flaws in my subsystem, so I also spent a significant amount of time debugging the code that I had already written. Once we had our demo working, I worked on the Arduino script for LED control and photodiode readings. Overall, I feel like I am on schedule since we made significant progress with the integration this week; however, we need the decoders to arrive or I will quickly fall behind. The biggest thing I have left involves physical testing, which requires the hardware subsystem to be fully functioning. Next week, I plan to finish the Arduino script for LED control and photodiode readings. If the decoders come in, I will also begin physical testing; otherwise, I will begin working on multi-finger functionality.

Team Status Report for 4/10

This week, much of our focus was on the project demo. While we are still awaiting the decoders for our PCBs, we were able to make use of the test breadboard from earlier in the semester. On the software side, we developed a script that translates touches on this breadboard into clicks on the screen. While this was a good way to give an idea of how our project works, this coming week we must try to finish the functionality of single-finger gestures by using state logic to distinguish between clicks and finger slides, as sketched below. We should also try to minimize glitches that occur when LEDs are blocked for only brief moments. On the hardware side for the coming week, all we can do is hope that the decoders finally come in. If they do not, we will likely need to start hacking through-hole versions of the decoders onto the board so we can begin debugging in earnest, especially since we have started eating into our slack time.
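Here is a sketch of the state logic we have in mind, assuming one decoded touch position (or None) per frame; the frame and pixel thresholds are illustrative. Requiring a change to persist for a few frames also filters the brief-occlusion glitches mentioned above:

```python
import math

DEBOUNCE_FRAMES = 3   # frames a change must persist before we act (illustrative)
SLIDE_THRESHOLD = 10  # pixels of travel that turns a click into a slide

class TouchStateMachine:
    """Distinguishes clicks from finger slides and filters brief occlusions."""

    def __init__(self):
        self.down = False     # debounced finger-down state
        self.pending = 0      # consecutive frames disagreeing with self.down
        self.start = None     # position where the touch began
        self.slid = False     # whether this touch ever exceeded the threshold

    def step(self, pos):
        """Feed one frame's decoded position (None when no beam is blocked).
        Returns 'click', 'slide', or None."""
        present = pos is not None
        if present != self.down:
            # Require the change to persist so an LED blocked (or unblocked)
            # for a single frame does not flip the state.
            self.pending += 1
            if self.pending < DEBOUNCE_FRAMES:
                return None
            self.pending = 0
            self.down = present
            if present:                             # a new touch begins
                self.start, self.slid = pos, False
                return None
            return None if self.slid else "click"   # lift with no slide = click
        self.pending = 0
        if self.down and math.dist(pos, self.start) > SLIDE_THRESHOLD:
            self.slid = True
            return "slide"                          # finger moved: drag update
        return None
```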