Darwin Torres’ Status Report for 4/30

This week, I worked on making our taps and drags more responsive. Before working on double taps, we felt it was best to improve the performance of our single-tap interactions. Overall, things were fine for our MVP; however, there were a few hiccups, mainly with taps sometimes being registered as small drags and with drags not appearing very smooth. Matt K and I came up with a solution where we ignore additional position updates that occur within a small radius around an initial touch. This makes it less likely that we detect a small drag when the user is really trying to tap, and we found that it fixed our first problem. To make drags smoother, we average every few position updates before sending a request to the OS. This means a slower refresh rate, but smoother interpolation of coordinates along a drag. We experimented with how many position updates to average, and we found that 5-7 gave the best results. Overall, I feel I am on schedule, especially since the tasks we have remaining are in addition to our MVP. Although I wanted to have a simple double-tap test done this week, I felt the changes we made were more important to the user experience. Next week, I will work on implementing double taps so that we can hopefully showcase them during our poster session.
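
The two fixes can be sketched in Python. This is a minimal illustration, not our actual code: the constant names (`TAP_RADIUS_PX`, `WINDOW`) and the exact radius value are assumptions, and the real logic runs inside our Arduino/OS pipeline rather than over plain lists.

```python
from collections import deque

TAP_RADIUS_PX = 8   # assumed dead-zone radius around the initial touch
WINDOW = 6          # averaging 5-7 updates gave the best results

def filter_jitter(touch_down, updates, radius=TAP_RADIUS_PX):
    """Drop position updates within a small radius of the initial touch,
    so a slightly wobbly tap is not registered as a small drag."""
    x0, y0 = touch_down
    return [(x, y) for (x, y) in updates
            if (x - x0) ** 2 + (y - y0) ** 2 > radius ** 2]

def smooth(updates, window=WINDOW):
    """Average every `window` consecutive updates before sending a request
    to the OS: a lower refresh rate, but smoother drag interpolation."""
    buf, out = deque(maxlen=window), []
    for x, y in updates:
        buf.append((x, y))
        if len(buf) == window:
            out.append((sum(p[0] for p in buf) / window,
                        sum(p[1] for p in buf) / window))
            buf.clear()
    return out
```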

Darwin Torres’ Status Report for 4/16

This week, with progress finally being made on our PCBs, I worked on updating our touch control interface and tests from the interim demo to be compatible with the final hardware. I also helped Matt S solder the final surface-mounted components onto the remaining PCBs. The tests I developed have yet to be used with the hardware, as they depend on Matt K’s code, which he is also updating to match our new environment, being fully working so that we can check how physical inputs translate to touch controls on the OS. We are currently debugging our work so that we can proceed with physical testing on the final hardware, and we are aiming to have this done by Monday. As of the end of the week, I am well on schedule, and overall things have been ramping up since we are finally back on track after receiving the decoders. Next week, I will work with Matt K on squashing our final bugs so that we can start testing, not only to check the robustness of the software and hardware but also to record relevant data to compare against the requirements we set in our design review. We will start by testing physical input along one dimension, which we already have the hardware ready for; this is essentially a scaled-up version of our interim demo tests. Once the second axis is ready, we will test touch controls along both axes, a major milestone that will give us the capability to interact with any point in the 2-dimensional space of the screen.

Darwin Torres’ Status Report for 4/10

This past week, I helped the team prepare for our interim demo. The integration of subsystems B and C has been going well so far, so Matt and I designed some visual tests to showcase in the demo. Unfortunately, our decoders have not arrived, so subsystem A has yet to be completed, but we were able to perform some initial “physical” testing with a breadboarded circuit. This allowed us to test our code that sends signals to the LEDs and processes signals from the photodiodes. Tests were successful, and we set up a simple demo showcasing how interacting with the LEDs results in onscreen manipulation through emulated touch inputs. It was good to know that my code had integrated well with Matt’s. There are still a few bugs and problems present, most notably with the precision of our setup, but it will be hard to measure this accurately until the PCB is completed. In terms of my software-related tasks, I am on schedule. Unfortunately, we have delays with the hardware, but we should be up to speed soon. Next week, Matt and I will work together on debugging our code. Once the decoders arrive, we will complete the PCB so we can test subsystem A and get it ready for integration with our software. If I have extra time, I will start investigating multi-finger functionality within Windows.

Darwin Torres’ Status Report for 4/2

With the PCB and components finally in our hands, I worked with the team on soldering components onto the PCB. Unfortunately, progress has been slowed by a mix-up with the decoders, but other than that, we are starting to look good on the hardware side. In addition to this, I continued working with Matt on the Arduino-Python integration. Our preliminary communications tests have been successful, and we are now focusing on creating a simulated test suite that mimics physical input. Once we have the hardware set up and integrated with the Arduino, we plan on updating this test suite to take in real inputs from the hardware. Other than the one setback due to having to order new decoders, I feel that I am on track. Any progress lost because of the decoders will be made up with the slack time we have. Next week, I will continue to help my team finish the PCBs and set up our interim demo. Matt and I will continue our “virtual” testing and bug squashing as we wait for the hardware to be ready.

Darwin Torres’ Status Report for 3/26

This past week, Matt K and I commenced the integration between the Arduino data processing and the Python screen control interface. We have spent time understanding how to communicate between the Arduino and our Python interface via USB, which would allow us to send touch control requests to the OS directly from the Arduino. We have a plan outlined and have commenced simple tests for communicating touch type and position. While we wait for the hardware subsystem to be completed, testing consists of creating dummy coordinates on the Arduino and sending commands to the OS through the Python interface. To confirm that the observed output is accurate, we compare the results against touch control requests sent from a local Python test script. Next week, I will get together with the others to solder the components onto our PCB and hopefully start testing our hardware subsystem. With the slack time we have and the progress I have made, I feel that I am on track with the schedule.
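
A sketch of how the Python side might decode touch type and position coming over USB serial. The wire format shown here (`"<type>,<x>,<y>\n"` with `D`/`U`/`M` for down/up/move) is an assumption for illustration; our actual Arduino protocol may differ, and the commented pyserial usage is likewise just one plausible setup.

```python
def parse_touch_message(line: bytes):
    """Parse one newline-terminated message from the Arduino over USB serial.
    Assumed format: b"<type>,<x>,<y>\n" where type is D (finger down),
    U (finger up), or M (move/update position)."""
    kind, x, y = line.decode("ascii").strip().split(",")
    if kind not in ("D", "U", "M"):
        raise ValueError(f"unknown touch type: {kind!r}")
    return kind, int(x), int(y)

# Reading from the real device would use pyserial, e.g.:
# import serial
# with serial.Serial("COM3", 115200, timeout=1) as port:
#     kind, x, y = parse_touch_message(port.readline())
```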

Darwin Torres’ Status Report for 3/19/2022

This week, I continued testing for the screen control interface. These tests have all been through software, emulating requests via Python scripts. Once we have an initial frame set up, we can start testing directly with physical user input. While we wait for parts to come in, software integration is starting to commence, as Matt’s Arduino subsystem starts to join with the Python touch-control interface. This will allow us to start testing the sending of requests from the Arduino via USB. In the coming weeks, I will help with the PCB once our parts arrive; until then, I will be able to allocate most of my time to further testing, which should result in a smoother integration experience and less debugging once we are able to conduct our physical tests. On the software side, I feel that I am on schedule. On the PCB, I am a bit behind, but only because we are still waiting for parts. Overall, this isn’t an issue, as it lets me put more time into tweaking my scripts as mentioned before.

Darwin Torres’ Status Report for 2/26/2022

This week, I made additional changes to the screen control interface. Previously, the main functionality added was the ability to send a request to the OS to emulate a touch tap. That is, given a pair of (x,y) coordinates, we emulate a finger pressing down and then immediately lifting up. However, these two steps are actually separate events sent to the OS (finger down, finger up); we just combined them into one quick event. If an extended period of time passes before “finger up”, Windows applications can register this as a finger hold. Thus, changes had to be made so that we can support these touch-specific gestures. This was done by updating the interface to support three requests to the OS: finger down, finger up, and update position. The “update position” request allows us to change the position of the finger while it is down, simulating a drag. These requests have been tested individually using a Python test script and succeed; however, the drag mechanism still needs to be ironed out. Overall, everything is going according to schedule, and I hope to conduct more testing in the coming weeks as we move on to integration between the hardware, Arduino, and Windows. Once integrated, we can conduct better tests, as we will be able to use our fingers to directly interact with the UI. Unfortunately, current tests do not tell us much about usability, since everything is simulated without physical interaction, so I am excited about these future steps.
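
The three-request design, and how a drag composes out of them, can be sketched as follows. This is an illustrative shape only: the `send(event, x, y)` backend is a hypothetical stand-in for the actual OS calls, and the step count and timing are arbitrary.

```python
import time

class TouchInterface:
    """Three OS requests -- finger down, finger up, update position --
    routed through an injected send(event, x, y) backend so the gesture
    logic can be exercised without Windows."""

    def __init__(self, send):
        self.send = send

    def finger_down(self, x, y):
        self.send("down", x, y)

    def finger_up(self, x, y):
        self.send("up", x, y)

    def update_position(self, x, y):
        self.send("update", x, y)

    def tap(self, x, y):
        # Down then immediately up, combined into one quick gesture.
        self.finger_down(x, y)
        self.finger_up(x, y)

    def drag(self, x0, y0, x1, y1, steps=10, delay=0.0):
        # Hold the finger down and walk it toward the target.
        self.finger_down(x0, y0)
        for i in range(1, steps + 1):
            self.update_position(x0 + (x1 - x0) * i // steps,
                                 y0 + (y1 - y0) * i // steps)
            if delay:
                time.sleep(delay)
        self.finger_up(x1, y1)
```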

Darwin Torres’ Status Report for 2/19/22

This week, I continued developing the screen control interface. Originally, we were going to register taps detected with our sensors as mouse clicks using one of the libraries mentioned last week. However, we found that we can instead send touch control requests using Microsoft’s Win32 C++ API. We are still using Python, but we take advantage of the built-in ctypes library to create wrappers that directly call the Win32 C++ functions. The provided methods for touch manipulation support not only single taps but also drag and multi-touch gestures, unlike the cursor-control libraries we were looking at before. This makes it more intuitive to add support beyond single-contact touches. I developed a simple interface for emulating a tap at a pair of (x,y) coordinates; testing found that the tap was successful and perceivably instantaneous. Libraries such as PyAutoGUI can take up to hundreds of milliseconds to complete a single mouse click, which would hurt the user experience. Using this new method, a touch was registered in less than a millisecond, giving us more room to work with on the hardware side as we work toward a goal of an overall response time under ~150 ms. Overall, I am ahead of schedule, giving me time to test out simulating drags and learn more about the additional capabilities of the API. Next week, I will help start the breadboard/Arduino integration, while also possibly being able to start the Windows integration early in our schedule.
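
A ctypes sketch of what this wrapper looks like, using the Win32 touch-injection entry points `InitializeTouchInjection` and `InjectTouchInput` with the `POINTER_TOUCH_INFO` structure from winuser.h. This is a minimal, untested outline rather than our actual interface; the helper names and the 4-pixel contact rectangle are assumptions, and the injection itself only works on Windows 8+.

```python
import ctypes
from ctypes import Structure, c_uint32, c_int32, c_long, c_void_p, c_uint64

# Win32 constants (from winuser.h)
PT_TOUCH = 0x00000002
POINTER_FLAG_INRANGE   = 0x00000002
POINTER_FLAG_INCONTACT = 0x00000004
POINTER_FLAG_DOWN      = 0x00010000
POINTER_FLAG_UPDATE    = 0x00020000
POINTER_FLAG_UP        = 0x00040000
TOUCH_MASK_CONTACTAREA = 0x00000001

class POINT(Structure):
    _fields_ = [("x", c_long), ("y", c_long)]

class RECT(Structure):
    _fields_ = [("left", c_long), ("top", c_long),
                ("right", c_long), ("bottom", c_long)]

class POINTER_INFO(Structure):
    _fields_ = [("pointerType", c_uint32), ("pointerId", c_uint32),
                ("frameId", c_uint32), ("pointerFlags", c_uint32),
                ("sourceDevice", c_void_p), ("hwndTarget", c_void_p),
                ("ptPixelLocation", POINT), ("ptHimetricLocation", POINT),
                ("ptPixelLocationRaw", POINT), ("ptHimetricLocationRaw", POINT),
                ("dwTime", c_uint32), ("historyCount", c_uint32),
                ("InputData", c_int32), ("dwKeyStates", c_uint32),
                ("PerformanceCount", c_uint64), ("ButtonChangeType", c_int32)]

class POINTER_TOUCH_INFO(Structure):
    _fields_ = [("pointerInfo", POINTER_INFO), ("touchFlags", c_uint32),
                ("touchMask", c_uint32), ("rcContact", RECT),
                ("rcContactRaw", RECT), ("orientation", c_uint32),
                ("pressure", c_uint32)]

def make_contact(x, y, flags):
    """Build a single touch contact at screen pixel (x, y)."""
    info = POINTER_TOUCH_INFO()
    info.pointerInfo.pointerType = PT_TOUCH
    info.pointerInfo.pointerId = 0
    info.pointerInfo.pointerFlags = flags
    info.pointerInfo.ptPixelLocation = POINT(x, y)
    info.touchMask = TOUCH_MASK_CONTACTAREA
    info.rcContact = RECT(x - 2, y - 2, x + 2, y + 2)  # assumed contact area
    return info

def inject_tap(x, y):
    """Tap = finger down immediately followed by finger up (Windows only)."""
    user32 = ctypes.windll.user32
    user32.InitializeTouchInjection(1, 1)  # 1 contact, TOUCH_FEEDBACK_DEFAULT
    down = make_contact(x, y, POINTER_FLAG_DOWN | POINTER_FLAG_INRANGE
                              | POINTER_FLAG_INCONTACT)
    up = make_contact(x, y, POINTER_FLAG_UP)
    user32.InjectTouchInput(1, ctypes.byref(down))
    user32.InjectTouchInput(1, ctypes.byref(up))
```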

Darwin Torres’ Status Report for 2/12/22

This week, I conducted research on how to interface with Windows to register our screen taps as mouse clicks. I have begun exploring different Python libraries and have narrowed down our choices to the following:

  • pywin32
  • PyAutoGUI
  • mouse

As of right now, the main focus is to create demos to test each library and compare their performance, ease of use, and compatibility. Originally, I was going to create a simulation to test different configurations of our original analog design, but as we have updated our approach, this was no longer necessary, which allowed me to get ahead of schedule and move on to this step. This weekend, in addition to developing the demos for each library, I plan to create an interface that would let us switch between libraries, making it easier to integrate with my team’s code as we test our screen control software and decide on which library is best.
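
One possible shape for that switchable interface is a thin wrapper with a common click-at-(x, y) call per backend. The library calls below follow each library's documented API (pyautogui.click, mouse.move/click, win32api.SetCursorPos/mouse_event), but this is an unverified sketch of the idea, not the final interface.

```python
class TapBackend:
    """Uniform click-at-(x, y) wrapper over the candidate libraries,
    so the rest of our code does not care which one is selected."""

    def __init__(self, name):
        self.name = name

    def click(self, x, y):
        if self.name == "pyautogui":
            import pyautogui
            pyautogui.click(x, y)
        elif self.name == "mouse":
            import mouse
            mouse.move(x, y, absolute=True)
            mouse.click()
        elif self.name == "pywin32":
            import win32api, win32con
            win32api.SetCursorPos((x, y))
            win32api.mouse_event(win32con.MOUSEEVENTF_LEFTDOWN, 0, 0)
            win32api.mouse_event(win32con.MOUSEEVENTF_LEFTUP, 0, 0)
        else:
            raise ValueError(f"unknown backend: {self.name}")
```

Because imports happen lazily inside `click`, only the selected library needs to be installed, which keeps the comparison demos independent of each other.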