Darwin Torres’ Status Report for 4/2

With the PCB and components finally in our hands, I worked with the team on soldering components onto the PCB. Unfortunately, progress has been slowed by a mix-up with the decoders, but other than that, we are starting to look good on the hardware side. In addition, I continued working with Matt on the Arduino-Python integration. Our preliminary communication tests have been successful, and we are now focusing on creating a simulated test suite that mimics physical input. Once the hardware is set up and integrated with the Arduino, we plan to update this test suite to take in real inputs from the hardware. Aside from the setback of having to order new decoders, I feel that I am on track, and any progress lost to the decoders will be made up with the slack time we have. Next week, I will continue to help my team finish the PCBs and set up our interim demo. Matt and I will continue our “virtual” testing and bug squashing as we wait for the hardware to be ready.
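To make the simulated suite concrete, here is a minimal sketch of the idea: scripted “physical” taps are replayed through the touch interface, and the request that actually reaches the OS is checked against the input. The interface.tap and interface.last_request names are placeholders for our wrappers, not the real method names.

    import random

    def run_simulated_taps(interface, trials=50, width=1920, height=1080):
        """Replay scripted 'physical' taps and verify what reaches the OS."""
        failures = []
        for _ in range(trials):
            x, y = random.randrange(width), random.randrange(height)
            interface.tap(x, y)                  # stands in for a real finger press
            observed = interface.last_request()  # placeholder: request sent to the OS
            if observed != ("tap", x, y):
                failures.append(((x, y), observed))
        return failures

Once the hardware is integrated, the random coordinate generator can be swapped for real sensor readings while the rest of the suite stays unchanged.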

Darwin Torres’ Status Report for 3/26

This past week, Matt K and I began integrating the Arduino data processing with the Python screen control interface. I spent time learning how to communicate between the Arduino and our Python interface via USB, which will allow us to send touch control requests to the OS directly from the Arduino. We have outlined a plan and started simple tests for communicating touch type and position. While we wait for the hardware subsystem to be completed, testing will consist of generating dummy coordinates on the Arduino and sending the corresponding commands to the OS through the Python interface. To confirm that the observed output is accurate, we compare it against touch control requests sent from a local Python test script. Next week, I will get together with the others to solder the components onto our PCB and hopefully start testing our hardware subsystem. With the slack time we have and the progress I have made, I feel that I am on track with the schedule.
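For concreteness, here is a minimal sketch of the Python side of that link, assuming the Arduino writes one event per line as type,x,y (e.g. DOWN,512,300) over its USB serial port. The port name, baud rate, and message format here are assumptions, not our finalized protocol.

    import serial  # pyserial

    PORT, BAUD = "COM3", 115200  # assumed; the Arduino shows up as a USB serial device

    def read_touch_events(handler):
        """Forward 'type,x,y' lines from the Arduino to a handler callback."""
        with serial.Serial(PORT, BAUD, timeout=1) as link:
            while True:
                line = link.readline().decode("ascii", errors="ignore").strip()
                if not line:
                    continue  # read timed out; poll again
                kind, x, y = line.split(",")
                handler(kind, int(x), int(y))

    # e.g. read_touch_events(lambda kind, x, y: print(kind, x, y))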

Team Status Report for 3/26

This past week, we continued integration between subsystems B (Arduino data processing) and C (Python Screen Control Interface). This will allow us to send touch control requests to the OS from the Arduino via USB. Integration between subsystems A and B will begin at a later date, as we still need to finish the PCB and frame. We were held back waiting for our components to arrive, but fortunately we finally have them in our hands and are eager to start getting our hardware subsystem ready. Next week, we plan to start assembling subsystem A, dedicating our time to PCB soldering and finalizing the frame design, and potentially starting frame construction. Given the slack time we gave ourselves, and our ability to shift focus to the software while we waited for parts, we feel we are currently on schedule with our tasks.

Darwin Torres’ Status Report for 3/19/2022

This week, I continued testing the screen control interface. These tests have all been in software, emulating requests via Python scripts. Once we have an initial frame set up, we can start testing directly with physical user input. While we wait for parts to come in, software integration has begun, as Matt’s Arduino subsystem starts to join with the Python touch-control interface. This will allow us to start testing the sending of requests from the Arduino via USB. In the coming weeks, I will help with the PCB once our parts arrive; until then, I will be able to allocate most of my time to further testing, which should result in a smoother integration experience and less debugging once we are able to conduct our physical tests. On the software side, I feel that I am on schedule. On the PCB, I am a bit behind, but only because we are still waiting for parts. Overall, this isn’t an issue, as I am able to put more time into tweaking my scripts, as mentioned before.

Darwin Torres’ Status Report for 2/26/2022

This week, I made additional changes to the screen control interface. Previously, the main functionality was the ability to send a request to the OS to emulate a touch tap: given a pair of (x,y) coordinates, we emulate a finger pressing down and then immediately lifting up. However, these are actually two separate events sent to the OS (finger down, then finger up); we just combined them into one quick event. If an extended period of time passes before “finger up”, Windows applications can register this as a finger hold. Thus, changes had to be made so that we can support these touch-specific gestures. This was done by updating the interface to support three requests to the OS: finger down, finger up, and update position. The “update position” request allows us to change the position of the finger while it is down, simulating a drag. These requests have been tested using a Python test script and succeed individually; however, the drag mechanism still needs to be ironed out. Overall, everything is going according to schedule, and I hope to conduct more testing in the coming weeks as we move on to integration between the hardware, Arduino, and Windows. Once integrated, we can conduct better tests, as we will be able to use our fingers to directly interact with the UI. Unfortunately, current tests do not tell us much about usability, since everything is simulated without physical interaction, so I am excited about these future steps.
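To show how a drag decomposes into the three requests, here is a rough sketch; the method names (finger_down, update_position, finger_up) mirror the requests described above but are stand-ins rather than our exact interface.

    import time

    def drag(interface, start, end, steps=20, delay=0.01):
        """Emulate a drag as: finger down, repeated position updates, finger up."""
        (x0, y0), (x1, y1) = start, end
        interface.finger_down(x0, y0)
        for i in range(1, steps + 1):
            t = i / steps  # linear interpolation from start to end
            interface.update_position(round(x0 + t * (x1 - x0)),
                                      round(y0 + t * (y1 - y0)))
            time.sleep(delay)  # pace the updates so the OS sees a smooth drag
        interface.finger_up(x1, y1)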

Darwin Torres’ Status Report for 2/19/22

This week, I continued developing the screen control interface. Originally, we were going to register taps detected by our sensors as mouse clicks using one of the libraries mentioned last week. However, we found that we can instead send touch control requests using Microsoft’s Win32 C++ API. We are still using Python, but we take advantage of the built-in ctypes library to create wrappers that directly call the Win32 C++ functions. The provided methods for touch manipulation support not only single taps but also drag and multi-touch gestures, unlike the cursor-control libraries we were looking at before. This makes it more intuitive to add support beyond single-contact touches. I developed a simple interface for emulating a tap at a pair of (x,y) coordinates, and testing found that the tap was registered successfully and appeared instantaneous. Libraries such as PyAutoGUI can take up to hundreds of milliseconds to complete a single mouse click, which would hurt the user experience. Using this new method, a touch was registered in less than a millisecond, giving us more room to work with on the hardware side as we work toward our goal of an overall response time under ~150ms. Overall, I am ahead of schedule, giving me time to test out simulating drags and to learn more about the additional capabilities of the API. Next week, I will help start the breadboard/Arduino integration, while also possibly starting the Windows integration early in our schedule.
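For reference, the Win32 functions in question are presumably the touch-injection entry points in user32.dll, InitializeTouchInjection and InjectTouchInput (available on Windows 8 and later). Below is a minimal ctypes sketch of a single tap; the structure layouts follow the Win32 headers, but treat this as an illustration of the approach rather than our exact interface code.

    import ctypes
    from ctypes import wintypes

    # Constants from winuser.h
    PT_TOUCH = 2
    TOUCH_FEEDBACK_DEFAULT = 1
    POINTER_FLAG_INRANGE = 0x0002
    POINTER_FLAG_INCONTACT = 0x0004
    POINTER_FLAG_DOWN = 0x00010000
    POINTER_FLAG_UP = 0x00040000
    TOUCH_MASK_CONTACTAREA = 0x0001

    class POINTER_INFO(ctypes.Structure):
        _fields_ = [("pointerType", ctypes.c_uint32),
                    ("pointerId", ctypes.c_uint32),
                    ("frameId", ctypes.c_uint32),
                    ("pointerFlags", ctypes.c_uint32),
                    ("sourceDevice", ctypes.c_void_p),
                    ("hwndTarget", wintypes.HWND),
                    ("ptPixelLocation", wintypes.POINT),
                    ("ptHimetricLocation", wintypes.POINT),
                    ("ptPixelLocationRaw", wintypes.POINT),
                    ("ptHimetricLocationRaw", wintypes.POINT),
                    ("dwTime", wintypes.DWORD),
                    ("historyCount", ctypes.c_uint32),
                    ("InputData", ctypes.c_int32),
                    ("dwKeyStates", wintypes.DWORD),
                    ("PerformanceCount", ctypes.c_uint64),
                    ("ButtonChangeType", ctypes.c_int32)]

    class POINTER_TOUCH_INFO(ctypes.Structure):
        _fields_ = [("pointerInfo", POINTER_INFO),
                    ("touchFlags", ctypes.c_uint32),
                    ("touchMask", ctypes.c_uint32),
                    ("rcContact", wintypes.RECT),
                    ("rcContactRaw", wintypes.RECT),
                    ("orientation", ctypes.c_uint32),
                    ("pressure", ctypes.c_uint32)]

    user32 = ctypes.windll.user32
    user32.InitializeTouchInjection(1, TOUCH_FEEDBACK_DEFAULT)  # at most one contact

    def _contact(x, y, flags):
        """Build a touch contact at screen pixel (x, y) with the given state flags."""
        c = POINTER_TOUCH_INFO()
        c.pointerInfo.pointerType = PT_TOUCH
        c.pointerInfo.pointerId = 0
        c.pointerInfo.pointerFlags = flags
        c.pointerInfo.ptPixelLocation = wintypes.POINT(x, y)
        c.touchMask = TOUCH_MASK_CONTACTAREA
        c.rcContact = wintypes.RECT(x - 2, y - 2, x + 2, y + 2)  # small contact box
        return c

    def tap(x, y):
        """Emulate a finger pressing down and immediately lifting up at (x, y)."""
        down = _contact(x, y, POINTER_FLAG_DOWN | POINTER_FLAG_INRANGE | POINTER_FLAG_INCONTACT)
        up = _contact(x, y, POINTER_FLAG_UP)
        user32.InjectTouchInput(1, ctypes.byref(down))
        user32.InjectTouchInput(1, ctypes.byref(up))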

Darwin Torres’ Status Report for 2/12/22

This week, I conducted research on how to interface with Windows to register our screen taps as mouse clicks. I have begun exploring different Python libraries and have narrowed down our choices to the following:

  • pywin32
  • PyAutoGUI
  • mouse

As of right now, the main focus is to create demos to test each library and compare their performance, ease of use, and compatibility. Originally, I was going to create a simulation to test different configurations of our original analog design, but since we have updated our approach, this is no longer necessary, which allowed me to get ahead of schedule and move on to this step. This weekend, in addition to developing the demos for each library, I plan to create an interface that lets us switch between the libraries, allowing for easier integration with my team’s code as we test our screen control software and decide which library is best.
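As a first sketch of what that switchable interface could look like, here is one backend per candidate library; the call patterns follow each library’s documented API, but the wrapper itself (the tap name and backend keys) is only an illustration.

    import mouse               # https://pypi.org/project/mouse/
    import pyautogui
    import win32api, win32con  # pywin32

    def _click_pywin32(x, y):
        win32api.SetCursorPos((x, y))
        win32api.mouse_event(win32con.MOUSEEVENTF_LEFTDOWN, 0, 0, 0, 0)
        win32api.mouse_event(win32con.MOUSEEVENTF_LEFTUP, 0, 0, 0, 0)

    def _click_pyautogui(x, y):
        pyautogui.click(x, y)

    def _click_mouse(x, y):
        mouse.move(x, y)
        mouse.click()

    _BACKENDS = {"pywin32": _click_pywin32,
                 "pyautogui": _click_pyautogui,
                 "mouse": _click_mouse}

    def tap(x, y, backend="pywin32"):
        """Register a screen tap at (x, y) as a left click via the chosen library."""
        _BACKENDS[backend](x, y)

Keeping the rest of our code behind a single tap() call should make it easy to benchmark the three candidates against one another.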