Sophia’s Status Report 04/29

This week, I spent a lot of time trying to refine our pen data so the drawn lines would be smoother. Unfortunately, while we can derive some semblance of direction, we cannot seem to stop the z-coordinates from drifting (due to under- or over-compensation for the force of gravity). I tried different approaches to calculating position, such as averaging acceleration over a window rather than using an instantaneous reading, but none worked any better than the initial method of a Kalman filter with quaternion rotation and double integration. Because of this, I am leaning towards sticking with this method and tinkering with the calibration settings. Additionally, this week I tried to finalize our board by soldering our Arduino and components to a PCB that was given to us. Unfortunately, the one we received was a touch too small, so I will likely try to find a larger one or purchase a new one that will hopefully arrive soon.
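
For reference, the core of the quaternion-rotation and double-integration approach looks roughly like the sketch below. This is a minimal illustration of the idea, not our actual Kalman-filter code; the struct names, the `updatePosition` helper, and the hard-coded 9.81 gravity term are all simplified stand-ins.

```cpp
struct Vec3 { float x, y, z; };
struct Quat { float w, x, y, z; };  // unit quaternion from the IMU's orientation estimate

// Rotate a body-frame vector into the world frame: v' = v + 2w(q x v) + 2 q x (q x v).
Vec3 rotate(const Quat &q, const Vec3 &v) {
  Vec3 t = { 2.0f * (q.y * v.z - q.z * v.y),
             2.0f * (q.z * v.x - q.x * v.z),
             2.0f * (q.x * v.y - q.y * v.x) };
  return { v.x + q.w * t.x + (q.y * t.z - q.z * t.y),
           v.y + q.w * t.y + (q.z * t.x - q.x * t.z),
           v.z + q.w * t.z + (q.x * t.y - q.y * t.x) };
}

Vec3 vel = {0, 0, 0}, pos = {0, 0, 0};

// Called once per IMU sample: rotate acceleration into the world frame, remove gravity,
// then integrate twice. Any error in the gravity term accumulates as exactly the kind
// of z-axis drift described above.
void updatePosition(const Quat &q, const Vec3 &accelBody, float dt) {
  Vec3 a = rotate(q, accelBody);
  a.z -= 9.81f;                                                   // gravity compensation
  vel.x += a.x * dt;  vel.y += a.y * dt;  vel.z += a.z * dt;      // acceleration -> velocity
  pos.x += vel.x * dt; pos.y += vel.y * dt; pos.z += vel.z * dt;  // velocity -> position
}
```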

As of now, I am falling a little behind schedule, since ideally the hardware would be completely finished by this point. However, mostly all that remains is soldering everything together, which shouldn’t take too long.

Next week, I will spend some more time writing our final report, creating our poster, and making our demo video. There will also be some time spent soldering, but that should be a quick task.

Team Status Report 4/22

Currently, our project is at risk of not reaching MVP functionality due to some setbacks we have had with hardware integration and with implementing cloud anchoring and networking across multiple devices. These risks are being addressed as we try our best to meet our set deadlines. If needed, there are backup implementations that achieve a similar effect through other means, such as screen sharing or sending position data using the touch screen.

One of the major changes we made was the shift to using the Arduino system and Bluetooth instead of Wi-Fi, since the Arduino and the Bluetooth module we chose (HC-05) gave us much more stability. This also meant using different Bluetooth packages with Unity instead of the Uduino packages we were using previously. This change was necessary because the new setup is actually compatible with Android devices, which is the platform we are ultimately building our final product for.
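
On the firmware side, the HC-05 only needs a spare serial port on the Arduino. A minimal bridge sketch is shown below; the pin numbers and baud rate are placeholders for however the module actually ends up wired and configured, not a record of our final wiring.

```cpp
#include <SoftwareSerial.h>

// HC-05 TX -> pin 10, HC-05 RX -> pin 11 (through a voltage divider); pins are placeholders.
SoftwareSerial bt(10, 11);  // RX, TX

void setup() {
  Serial.begin(9600);  // USB serial for debugging on the laptop
  bt.begin(9600);      // HC-05 default baud rate in data mode
}

void loop() {
  // Forward bytes between the USB serial monitor and the paired phone.
  if (Serial.available()) bt.write(Serial.read());
  if (bt.available())     Serial.write(bt.read());
}
```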

The other major change we have made is replacing cloud anchoring across multiple devices with screen sharing across multiple devices. This is done by screen sharing from our host Android device to another device or laptop, which is then displayed on a big monitor over HDMI. We used an application called TeamViewer, which allows one device to act as a host and other devices to connect to it and view what it is displaying. Therefore, all users can still effectively see what is being drawn in the real world and guess in real time, and the drawer simply becomes whoever holds the host device and pen.

These design tradeoffs allow us to stay on schedule towards a mostly complete and functional project while not taking away too much from the game and user experience itself. Our remaining time before the final demo can therefore be better spent polishing what we have already implemented rather than attempting to add more features that may or may not work properly.

In general, this week we spent some more time working on integration and our final presentation slides. Everything is mostly integrated from end to end, but there are still some kinks that need to be worked out, which are addressed and explained in further detail in our individual status reports.

Sophia’s Status Report 4/22

This week I spent most of my time fine-tuning our position data, helping Anthony integrate our pen with our Unity app, and working on our final presentation slides.

After trying to compensate for the drift in our IMU as well as errors in rotation, I have decided that it is easier to use the built-in quaternions to get more accurate rotational data. Additionally, I have added a calibration phase before the pen device can actually be used, in order to help reduce error. This seems to have helped quite significantly, but there are still some drift issues that need to be addressed.
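
Conceptually, the calibration phase just holds the pen still at startup, averages a batch of acceleration samples, and treats that average as a bias to subtract from later readings. A rough sketch of that idea is below; `readLinearAccel` is a hypothetical stand-in for whatever IMU driver call the firmware actually uses, and the sample count and timing are illustrative.

```cpp
const int CAL_SAMPLES = 200;
float biasX = 0, biasY = 0, biasZ = 0;

// Stand-in for the real IMU read; in actual firmware this would come from the driver.
void readLinearAccel(float &ax, float &ay, float &az) { ax = ay = az = 0.0f; }

// Run once at startup while the pen is held still: average the readings to estimate
// a constant bias, which is then subtracted from every subsequent acceleration sample.
void calibrate() {
  for (int i = 0; i < CAL_SAMPLES; i++) {
    float ax, ay, az;
    readLinearAccel(ax, ay, az);
    biasX += ax; biasY += ay; biasZ += az;
    delay(10);  // ~100 Hz sampling
  }
  biasX /= CAL_SAMPLES; biasY /= CAL_SAMPLES; biasZ /= CAL_SAMPLES;
}
```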

This week I also helped Anthony work on sending our data over Bluetooth. As of now, our Android phones seem to be able to display our hardware data properly. There are still some issues with the transfer, since the data has to be sent as a string and then reparsed into float values before we can use it.
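
One simple way to handle that string round-trip is to send each sample as a single comma-separated line, which the app can split and convert back to floats. The sketch below is an illustration of the framing idea rather than our final message format; the `sendSample` helper and the field layout are assumptions.

```cpp
// Format one sample as a comma-separated line, e.g. "0.012,-0.034,0.001,1".
// The receiving side splits on ',' and parses each token back into a float.
void sendSample(Stream &out, float x, float y, float z, bool penDown) {
  out.print(x, 3); out.print(',');   // 3 decimal places per coordinate
  out.print(y, 3); out.print(',');
  out.print(z, 3); out.print(',');
  out.println(penDown ? 1 : 0);
}
```

In the firmware this would be called once per sample with the Bluetooth serial object as the output stream.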

I also helped work on our final presentation slides that will be presented in class next week.

As of now, I am technically on schedule because everything is integrated and has been fine-tuned a bit. However, our testing and validation results are not as fleshed out as we would have liked, so we need to spend some more time getting more concrete numbers and stats.

Next week will be spent making the pen look neater by soldering it to an Arduino shield, as well as writing our final report, preparing for the final demo, and making our demo video. Additionally, if there is still time, I would like to fine-tune our IMU position tracking more to see if we can make it a little less finicky.

Sophia’s Status Report 4/8

This week, I focused on hardware and software communication for our interim demo. We were able to get the hardware communicating with the software serially through a USB cable on desktop Unity with Uduino. We were additionally able to test and get Wi-Fi communication working as well. However, the moment we tried to port it to an Android phone, the app was unable to detect the hardware pen prototype. As such, we have begun discussing alternatives such as trying Bluetooth communication instead and writing our own network socket. We have also looked into other packages we could potentially use, but writing our own will likely be the best idea. Our schedule has been updated so we are currently on track. If we keep up the pace and dedicate some more time outside of class and potentially on the weekends, we should be able to reach MVP and have something cool to show for our final demo.

Next week, we will spend some more time trying to troubleshoot the Wi-Fi communication once it has been ported to an Android phone. If that doesn’t work out, I will assist Anthony in getting Bluetooth communication working well. Additionally, I will be working on the calibration of our pen to hopefully get better data points that can be sent to the Unity app as well. We have not done any specific testing yet, but as soon as integration is complete, we will start running the line tests described in our design review.

Sophia’s Status Report 4/1

The beginning of this week was spent trying to figure out how to determine distances from the raw IMU data. Instead of using the orientation vector, which was causing issues as discussed in my previous status report, I have decided to use the gyro parameter instead. I have found a couple of tutorials on how to incorporate the gyro in order to potentially get better rotational orientation. As of now, from the acceleration data, I “double integrate”: multiplying the acceleration vector by the elapsed time between samples gives velocity, and integrating that again gives positional (x, y, z) coordinates. I spent the rest of the week working with Anthony on our hardware and software communication with Uduino. I wrote a very simple test script that tries to send “Hello” to Unity, but there were a lot of struggles just getting Unity to work in the first place. The testing for this communication will need to be carried over into next week. Below is our code repo, which we will be updating with our scripts.

https://github.com/murrowow/18500-capstone-hardware
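
As a first pass at the gyro side, the orientation update is just another integration. The sketch below is a naive illustration that treats the body rates as Euler-angle rates (only a small-angle approximation) and will drift without a proper filter; the names and units (rad/s) are assumptions.

```cpp
float roll = 0.0f, pitch = 0.0f, yaw = 0.0f;  // radians

// gx, gy, gz are gyro rates in rad/s; dt is the time since the previous sample.
void updateOrientation(float gx, float gy, float gz, float dt) {
  roll  += gx * dt;
  pitch += gy * dt;
  yaw   += gz * dt;
}
```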

As of now, we are technically ahead of schedule since integration between hardware and software is scheduled for a little later. However, the hardware itself is still not able to provide very accurate distance coordinates. This updating and tweaking of the hardware is meant to occur this week and into the next.

Next week will be spent fleshing out Uduino for our hardware-software communication and figuring out the gyroscope in order to get more accurate positional coordinates.

Team’s Status Report 03/25

We were finally able to purchase the Uduino software package with help from ECE purchasing. We also finally were able to obtain our additional Android phones, one borrowed and one purchased, for the purposes of testing.

Sophia’s Status Report 03/25

This week was dedicated to our ethics discussions. It was a useful experience, as it definitely helped open my eyes to some ethical considerations we hadn’t thought of before. In terms of more technical work, I spent some time coding up our IMU script. I have settled on using the wrapper functions rather than tinkering with the raw data, since the raw data is formatted in a way that parsing it manually would introduce a bunch of unnecessary overhead and make it difficult to hit our proposed latency goals. I have been struggling a little bit to get the rotation vector working properly since all of the provided angles are positive, which makes it difficult to differentiate a tilt to the left from a tilt to the right. It’s unclear if this will cause future issues for us when it comes to calculating line position, so I want to debug this as thoroughly as possible before reaching that point. I have also finally been able to purchase Uduino with help from ECE purchasing. I have not yet tested how well Uduino works with our NodeMCU board.
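
For reference, the standard way to recover signed tilt angles from a quaternion is the atan2/asin conversion sketched below; atan2 and asin return negative values for tilts in the opposite direction, which is exactly what the always-positive wrapper angles hide. This is just a sketch of the textbook conversion, not something wired into the pen code yet.

```cpp
#include <math.h>

// Convert a unit quaternion (w, x, y, z) into signed roll/pitch/yaw in radians
// (aerospace Z-Y-X convention).
void quatToEuler(float w, float x, float y, float z,
                 float &roll, float &pitch, float &yaw) {
  roll  = atan2f(2.0f * (w * x + y * z), 1.0f - 2.0f * (x * x + y * y));
  pitch = asinf (2.0f * (w * y - z * x));
  yaw   = atan2f(2.0f * (w * z + x * y), 1.0f - 2.0f * (y * y + z * z));
}
```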

I am still slightly behind schedule because by now I should have the double integration script working. However, the next few weeks have been allotted for debugging and fine-tuning, which is technically what I am working on right now. I will be working on this a little bit over the weekend to hopefully get past this little bump and be right on schedule.

Next week will be spent fine tuning our IMU script and trying to test its integration with Uduino to send data over to our Android devices.

Sophia’s Status Report 03/18

This week, I spent some time working on the integration of the IMU data. It was a little more finicky than I thought, especially because it took a while to understand the sensor_event_type and its different attributes. I also experimented with not using the Arduino wrapper and directly pulling out the raw data. However, the raw data was extremely confusing to work with, so I have decided to use the wrapper functions. I have also worked on ordering the Uduino package. However, due to issues with purchasing software packages, the order has been delayed and escalated higher up into administration.
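
For context, the wrapper-style drivers fill a sensors_event_t struct per read, and which fields are populated depends on the event type requested. The sketch below assumes an Adafruit-style driver (the BNO055 library is used here purely as a stand-in, since the report doesn’t name the exact breakout), so the part number, address, and pins are illustrative.

```cpp
#include <Wire.h>
#include <Adafruit_Sensor.h>
#include <Adafruit_BNO055.h>

Adafruit_BNO055 bno = Adafruit_BNO055(55, 0x28);  // sensor id and I2C address (placeholders)

void setup() {
  Serial.begin(115200);
  if (!bno.begin()) {
    Serial.println("IMU not detected");
    while (true) delay(10);
  }
}

void loop() {
  sensors_event_t accel, gyro;
  // The same struct type is reused; the event type requested decides which fields get filled.
  bno.getEvent(&accel, Adafruit_BNO055::VECTOR_LINEARACCEL);  // fills accel.acceleration.x/y/z
  bno.getEvent(&gyro,  Adafruit_BNO055::VECTOR_GYROSCOPE);    // fills gyro.gyro.x/y/z
  Serial.print(accel.acceleration.x); Serial.print(',');
  Serial.println(gyro.gyro.z);
  delay(10);
}
```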

I was out of town traveling, so I did not finish everything I wanted to do before I left. Because of this, I am also a little bit behind schedule. This just means that I will need to do some more work over the weekend to compensate.

Next week I hope the double integration script will be finalized so I can test, debug, and fine-tune it then.

Team Status Report 03/04

This week the group mainly worked on finalizing the submission of our design report, which ended up taking a lot more time than originally expected. In addition to focusing on the design report, each group member was able to make some minimal progress on the part of the project they are in charge of.

So far, one of the main risks to our project is that the parts each group member is developing may fail to integrate once everyone completes their work. This should be mitigated by the constant communication we plan to maintain as a group while we develop our project.

A slight change was made to our design: we discovered that we can integrate our NodeMCU with Unity using a package called Uduino. This was originally considered, but it did not seem like it would suit our needs, so we had planned on developing a Python script to transfer data from our drawing device to the Unity app. Upon further research, and advice that running our own Python network socket would add too much overhead, we have decided that Uduino offers the functions we need. The only real cost is that the package is not free, and some time will be spent figuring out how to purchase and use it, but this should be minor.

Due to feedback, some of our timeline has been scaled back slightly, mainly for hardware, which was initially considered too ambitious. We have simply provided more time for each portion of our project and have left integration for a slightly later date than initially anticipated. We hope to thoroughly debug our individual components so we can reduce the number of issues that come up during integration.

Overall, the tools we plan to use in our design have not changed from prior iterations since we began planning, aside from the change mentioned above. In general, it has been determined that the team will be using:

  • Unity as the main software for developing the app
  • AR Foundation packages for the AR features we use
  • Uduino to send data from our drawing device to our application
  • An IMU to detect motion of our drawing device
  • A NodeMCU to track the data coming from the drawing device (when the IMU detects movement, when buttons are pushed, turning the LED on, etc.)
  • Android devices, which are what we will be building for

Sophia’s Status Report 03/04

This week was mainly spent working on the design report. It took a lot more time than we anticipated, which meant there was less progress than I hoped in terms of programming the hardware. Nonetheless, I spent some time programming the NodeMCU so it can interact as planned with the IMU, buttons, and LED. Preliminary tests were done with the IMU to make sure it was able to detect tilt and motion through changes in velocity and acceleration. Additionally, I programmed simple functionality so that the NodeMCU can detect when the button is pressed and turn the LED on/off.
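
The button/LED logic is along the lines of the sketch below; the pin numbers are placeholders for whatever the NodeMCU actually ends up wired to, not our final pinout.

```cpp
const int BUTTON_PIN = 4;  // e.g. D2 on a NodeMCU; actual wiring may differ
const int LED_PIN    = 5;  // e.g. D1

void setup() {
  pinMode(BUTTON_PIN, INPUT_PULLUP);  // button wired to ground, so a press reads LOW
  pinMode(LED_PIN, OUTPUT);
}

void loop() {
  bool pressed = (digitalRead(BUTTON_PIN) == LOW);
  digitalWrite(LED_PIN, pressed ? HIGH : LOW);  // LED on while the button is held
}
```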

Due to feedback from previous presentations that our schedule was too ambitious, I pushed back the hardware schedule a little bit. With this new adjusted schedule, I am on track.

The next steps are to start working on double integrating the IMU data to see how accurate the distance measurements are. I also need to purchase the Unity package Uduino so we can do some preliminary testing of our data transfer to help prevent future integration issues.