Joseph’s Status Report for 4/29

This week I spent time trying out different methods of setting anchor points during the game. I compared two approaches: setting the anchor point once at the start, when the user first taps the screen, versus resetting the anchor point on every screen tap. From user testing, the latter seemed more user-friendly, as the extra flexibility in placing the object outweighed the occasional accidental screen tap. I also tested various parameters I coded into the line algorithm (number of corner and end-cap vertices in the line, distance between each point in the line, line width, etc.) to see which values produced the best-formed line. This testing gave me a good idea of the right range of values, but further testing will have to be done once the data sent over by the hardware is perfected.
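As a rough sketch of where these parameters live: Unity's LineRenderer exposes corner/end-cap vertex counts and width directly, while point spacing is enforced by our own sampling logic. The values and names below are illustrative placeholders, not the final tuned numbers:

```csharp
using UnityEngine;

// Sketch of the tunable line parameters mentioned above, mapped onto
// Unity's LineRenderer. Values here are placeholders for testing ranges.
public class LineSettings : MonoBehaviour
{
    [SerializeField] float minPointSpacing = 0.01f; // meters between sampled points
    [SerializeField] float lineWidth = 0.005f;

    LineRenderer line;

    void Awake()
    {
        line = GetComponent<LineRenderer>();
        line.numCornerVertices = 4; // rounds corners between segments
        line.numCapVertices = 4;    // rounds the two ends of the line
        line.widthMultiplier = lineWidth;
        line.useWorldSpace = true;
    }

    // Only append a point if it is far enough from the last one, so dense
    // or noisy input doesn't produce jagged, overlapping geometry.
    public void TryAddPoint(Vector3 worldPos)
    {
        int count = line.positionCount;
        if (count == 0 ||
            Vector3.Distance(line.GetPosition(count - 1), worldPos) >= minPointSpacing)
        {
            line.positionCount = count + 1;
            line.SetPosition(count, worldPos);
        }
    }
}
```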

Thanks to the revamp of the multiplayer aspect, the pure software side I’ve been working on is on schedule. As a whole, though, our project is a little behind, solely because some of the data sent over from the hardware is still slightly off. We are not changing our schedule; we are simply working to get the hardware data fully correct by demo day so that we have a functional game.

This week, most of our time will be spent on the final assignments: the final report, poster, and video. Beyond that, once my teammates are done stabilizing the hardware data sent over to the software, I will further fine-tune the individual parameters in the line algorithm renderer.

Joseph’s Status Report for 4/22

This week, in addition to working on the final presentation slides, I added the ability to clear all rendered lines from the screen in order to “start over,” as well as an on-screen debugging log for easier troubleshooting and game status tracking. I also finally got the cloud anchoring scripts working on a single device, in a separate test scene (not our actual game scene). What remains is to integrate this cloud anchoring into the previously written line algorithm scripts and the main game scene, and to use Unity’s networking package to make cloud anchoring functional across multiple devices. After that, the timing of the API calls would need fine-tuning.
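The “clear all lines” feature amounts to tracking each finished line’s GameObject and destroying them on demand. A minimal sketch (class and method names here are illustrative, not the exact ones in our scripts):

```csharp
using System.Collections.Generic;
using UnityEngine;

// Minimal sketch of the "clear all lines" behavior: every finished line's
// GameObject is registered in a list so a single call can destroy them all.
public class LineStore : MonoBehaviour
{
    readonly List<GameObject> lines = new List<GameObject>();

    public void Register(GameObject lineObject) => lines.Add(lineObject);

    // Wired to a UI button to "start over".
    public void ClearAll()
    {
        foreach (var go in lines)
            Destroy(go);
        lines.Clear();
    }
}
```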

Considering that we effectively have only one week of classes left before finals week, and are still having trouble fully integrating the hardware and software (improperly formed/calibrated data is being sent over from the hardware and rendered by the software), I have looked into backup options for viewing our game on multiple devices. Even if fully functional multi-device cloud anchoring could be coded in this last week, we would have to integrate that software with the hardware not only on a single device but across multiple separate devices, which would require even more code to separate game states and actions between users.

Since all of this is unfeasible in the remaining timeframe, I have instead gotten working a way of screen sharing our game, running on a host Android device, to another device/laptop connected to a big monitor. This was done with an application called TeamViewer, which allows one device to act as a host while other devices connect to it and view what it displays. All users would therefore still see what is being drawn in the real world and could guess in real time; the drawer would simply be whoever holds the host device and pen.

This change works toward putting our project as a whole back on track, allowing us to have a refined and functional project by the final demo. This week I plan to help the team refine the line rendering algorithm that takes in the hardware data, as well as thoroughly test and polish the game as a whole into an easy-to-use, smooth experience.

Joseph’s Status Report for 4/8

This week I modified the Unity line rendering scripts to accept input from the hardware pen, fed through Uduino, that my partners worked on. I changed the script so that touching the screen creates a new anchor point at that location in the world; any new drawings are placed relative to it until you touch the screen at a different location. I have also started working on cloud anchoring, which would allow multiple devices to see the drawings created by one user/device.
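The tap-to-place-anchor flow can be sketched with AR Foundation’s raycast against detected planes. Field and class names of our own are illustrative; the AR Foundation calls are the standard ones:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Sketch of the tap-to-anchor flow: on a new touch, raycast against
// detected planes and pin an anchor at the hit pose. New drawings are
// then parented to CurrentAnchor until the next tap replaces it.
public class TapAnchorPlacer : MonoBehaviour
{
    [SerializeField] ARRaycastManager raycastManager;
    static readonly List<ARRaycastHit> hits = new List<ARRaycastHit>();

    public Transform CurrentAnchor { get; private set; }

    void Update()
    {
        if (Input.touchCount == 0) return;
        Touch touch = Input.GetTouch(0);
        if (touch.phase != TouchPhase.Began) return;

        if (raycastManager.Raycast(touch.position, hits, TrackableType.PlaneWithinPolygon))
        {
            Pose pose = hits[0].pose;
            var go = new GameObject("DrawAnchor");
            go.transform.SetPositionAndRotation(pose.position, pose.rotation);
            go.AddComponent<ARAnchor>(); // pins this point in world space
            CurrentAnchor = go.transform;
        }
    }
}
```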

Testing-wise, through use of the application, one can verify that the line algorithm works properly: the lines are very accurate and smooth where you draw them with your finger. When the hardware pen is fully functional with the software, our testing will compare lines drawn by the hardware pen against lines drawn with the finger, making sure they are smooth rather than jagged. I’d say we are on schedule, as we have done the main work of integrating the hardware and software and are simply working through the kinks of getting it all running smoothly.

Joseph’s Status Report for 4/1

This week I was finally able to complete the line rendering algorithm. In other words, I have successfully ported to the phone a Unity app that draws a 3D line in space whenever you hold a finger down on the screen and move the phone around. Right now, the points in space between which the lines are drawn come from touch input on the screen. Eventually, we plan to replace this touch input with the data from the hardware pen. We hope to do this in the following week using Uduino, even if the data isn’t 100% accurate. Otherwise, we are on schedule, as integration of hardware and software was planned for a little later. While my teammates figure out Uduino, I plan to start integrating cloud anchoring into the software so the line drawings are stored in the cloud and can be viewed on multiple devices.
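The hold-and-move drawing loop can be sketched as follows: while a finger is down, sample a point a fixed distance in front of the AR camera each frame and append it to the current line. Field names and the exact sampling depth are illustrative assumptions, not our final values:

```csharp
using UnityEngine;

// Sketch of the touch-driven drawing loop: while a finger is held down,
// take a point at a fixed depth in front of the AR camera each frame and
// append it to the active LineRenderer; lifting the finger ends the line.
public class TouchDrawer : MonoBehaviour
{
    [SerializeField] Camera arCamera;        // the AR session camera
    [SerializeField] LineRenderer linePrefab;
    [SerializeField] float drawDepth = 0.3f; // meters in front of the camera
    [SerializeField] float minSpacing = 0.01f;

    LineRenderer current;

    void Update()
    {
        if (Input.touchCount == 0) { current = null; return; }

        // World-space point a fixed depth in front of the camera.
        Vector3 p = arCamera.transform.position
                  + arCamera.transform.forward * drawDepth;

        if (current == null)
        {
            current = Instantiate(linePrefab);
            current.positionCount = 0;
        }

        int n = current.positionCount;
        if (n == 0 || Vector3.Distance(current.GetPosition(n - 1), p) >= minSpacing)
        {
            current.positionCount = n + 1;
            current.SetPosition(n, p);
        }
    }
}
```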

Joseph’s Status Report for 3/25

I am still in the process of coding the line rendering algorithm, so that is what I’ve been working on for the week. Besides that, I helped Anthony set up the Android phones that we received and push the project build to a phone successfully. I also made a team Unity account so we can share paid Unity packages that are downloaded. I’ve tested that this works, having downloaded and imported the Uduino packages into the project and pushed them to GitHub.

This week was pretty busy for me due to all the preparation for Greek Sing happening today, so I didn’t manage to get as much done code-wise as I had hoped. This means I am behind schedule, but I am still hopeful about having a mostly functional algorithm by the interim deadline.

Joseph’s Status Report for 3/18

This week I started writing the code for the line calculation algorithm. The lines are not yet actually rendered and tested with dummy data, but the basic code structure and pipeline have been written out. In addition, I read and worked on the ethics assignment and readings we were assigned. Schedule-wise, I am still a bit behind, but once the line algorithm has been written and tested I should be back on schedule. I hope to finish the line calculation and rendering algorithm in the following week and test it for bugs.

Joseph’s Status Report for 3/04

This week I spent time solidifying exactly how the line calculation algorithm will be designed, code-wise. I also decided which Unity GameObjects would be best for rendering the lines, choosing the Line Renderer over the Trail Renderer. The Line Renderer works better with the positional data points we will obtain from the hardware, and it integrates better with the cloud anchor objects needed to make the drawings visible on multiple devices. The rest of the time was spent working on the Design Report, which took quite a bit of time.
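The key difference behind this choice: a Line Renderer accepts an explicit array of world-space points (the positional samples we will get from the pen), whereas a Trail Renderer only emits points behind a moving transform on its own schedule. A minimal sketch of feeding it sampled positions (class and method names are illustrative):

```csharp
using UnityEngine;

// Why the Line Renderer fits our pipeline: it takes an explicit array of
// world-space vertices, giving direct control over every point, unlike a
// Trail Renderer, which generates its own points behind a moving object.
public class LineFromSamples : MonoBehaviour
{
    public void Render(Vector3[] samples)
    {
        var line = GetComponent<LineRenderer>();
        line.positionCount = samples.Length;
        line.SetPositions(samples); // one call sets every vertex
    }
}
```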

Unfortunately, this week and over spring break I was hindered by a medical issue that cropped up Monday the 27th: extremely bad stomach pain that came on unpredictably and always lasted for hours, rendering me incapable of doing anything. I went to the ER on Saturday the 4th when it got worse instead of better; luckily they didn’t find anything obvious or life-threatening, though they also weren’t exactly sure what it was.

(medical note attached)

Luckily, after some follow-up visits, time, and medicine, I seem to finally be getting better as of today, so I’m hopeful I can get more done this following week, which would involve working toward a functional line rendering script with test inputs. I am somewhat behind schedule due to this issue, as I should have started actually coding the line algorithm by this point, but from my previous research into the best ways to implement it, I am confident I can code a functional line renderer within our set timeline.

Joseph’s Status Report for 2/25/2023

This week was mostly spent going over some issues brought up in our design presentation and working on our design report. I have experimented with many of the extra GameObjects and functionality provided by the ARFoundation and ARCore packages within Unity. I was delayed by a Unity bug I had never encountered before, in which Unity Hub failed to open any of the Unity projects downloaded on my laptop; this took a decent amount of time to figure out.

Our progress is mostly on schedule, maybe a little behind due to hardware parts taking time to arrive. In order to catch up, we will focus heavily on the hardware for the next week and hope to have it fleshed out by then.

Joseph’s Status Report for 2/18/2023

This week I spent a lot of time working out kinks in getting the AR application I built in Unity to port to an Android phone without problems. I fixed the ported app sometimes crashing for no apparent reason, as well as it sometimes showing a black screen instead of the real world.

The rest of the time this week was spent working on the design presentation slides and preparing to present them. This involved more in-depth research into the packages and systems we will be using, in order to fully flesh out our pipeline for the project. Our progress is therefore on schedule.

In the next week, I hope to build cloud anchoring functionality into the test AR app I have made so far, as that will be a core part of our project.

Joseph’s Status Report for 2/11/2023

This week I created and set up our Unity project and environment. I configured the project to build to Android and downloaded all the necessary Unity packages to get AR working. I built a small sample AR “game” to test all of this. I also set up our git repository along with Git LFS and some .gitignore edits to ensure all the Unity files can be stored remotely regardless of size. All the files were successfully added, committed, and pushed without error.
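The Git LFS setup comes down to a `.gitattributes` file routing Unity’s large binary asset types through LFS. An example of the kind of patterns involved (the exact list varies per project; these are common Unity asset extensions, not necessarily our final set):

```
# .gitattributes — example patterns sending large binary assets through Git LFS
*.psd filter=lfs diff=lfs merge=lfs -text
*.fbx filter=lfs diff=lfs merge=lfs -text
*.png filter=lfs diff=lfs merge=lfs -text
*.wav filter=lfs diff=lfs merge=lfs -text
```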

I will confirm that my other teammates can successfully clone and run the project. If necessary, I will also set up the project to work on iOS. Afterwards, I will join Anthony in doing more in-depth research into the packages we will use to implement our project.