This week I modified the Unity line rendering scripts to accept input from the hardware pen that my partners worked on, fed through Uduino. I changed the script so that touching the screen creates a new anchor point at that location in the world; any new drawings are placed relative to that anchor until you touch the screen at a different location. I have also started working on cloud anchoring, which would allow multiple devices to see the drawings created by one user/device.
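To give a sense of the touch-to-anchor flow, here is a minimal sketch assuming Unity's AR Foundation raycast and anchor managers; the `DrawingAnchorController` name and the exact structure are illustrative, not the actual project script:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

public class DrawingAnchorController : MonoBehaviour
{
    [SerializeField] private ARRaycastManager raycastManager;
    [SerializeField] private ARAnchorManager anchorManager;

    // The anchor that new line drawings are parented to.
    private ARAnchor currentAnchor;
    private static readonly List<ARRaycastHit> hits = new List<ARRaycastHit>();

    void Update()
    {
        if (Input.touchCount == 0) return;
        Touch touch = Input.GetTouch(0);
        if (touch.phase != TouchPhase.Began) return;

        // Raycast from the touch position against detected planes.
        if (raycastManager.Raycast(touch.position, hits, TrackableType.PlaneWithinPolygon))
        {
            // Replace the previous anchor so new strokes attach at the new location.
            if (currentAnchor != null) Destroy(currentAnchor.gameObject);
            currentAnchor = anchorManager.AddAnchor(hits[0].pose);
        }
    }

    // New line objects would query this to parent themselves to the active anchor.
    public Transform AnchorTransform =>
        currentAnchor != null ? currentAnchor.transform : null;
}
```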
On the testing side, using the application verifies that the line algorithm works properly: lines drawn with a finger are accurate and smooth, following exactly where you draw. Once the hardware pen is fully functional with the software, our testing will compare lines drawn by the hardware pen against lines drawn with a finger, making sure they are smooth rather than jagged. I'd say we are on schedule, as the main work of integrating the hardware and software is done and we are now working through the kinks of getting it all running smoothly.
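One possible way to make that pen-versus-finger comparison quantitative is a simple jaggedness score, such as the maximum turn angle between consecutive segments of a stroke. The helper below is a hypothetical sketch of that idea (the `StrokeMetrics` name and the metric itself are assumptions, not part of our current scripts); a smaller value means a smoother line:

```csharp
using UnityEngine;

public static class StrokeMetrics
{
    // Returns the largest bend (in degrees) between consecutive segments
    // of a stroke, e.g. points copied out of a LineRenderer.
    public static float MaxTurnAngle(Vector3[] points)
    {
        float maxAngle = 0f;
        for (int i = 2; i < points.Length; i++)
        {
            Vector3 prevSegment = points[i - 1] - points[i - 2];
            Vector3 nextSegment = points[i] - points[i - 1];
            maxAngle = Mathf.Max(maxAngle, Vector3.Angle(prevSegment, nextSegment));
        }
        return maxAngle;
    }
}
```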