Joanne’s Status Report for 2/26/2022

This week I worked with my group on more of the design-related portions of our project. We finished up the design review slides and started thinking about the design review paper. We also got some sensors in this week and have been testing them out.

I continued to work on the Unity–web application portion of the project. Last week I worked solely in Unity to see if I could take in dummy data and make changes to the model (i.e., rotating and moving it). These changes reflect whatever gesture the user makes on the trackpad portion of our project. We decided that the flow of our project would be: the Jetson Nano sends data to our web application, and the web application then communicates via JavaScript with the Unity application embedded inside it.

I created the web application in Django, which will host the visualization portion of our project, and embedded last week's Unity application in it. I wanted to figure out how to send serialized data from the web application to Unity, so that once we can get sensor data to the web app, we can forward that information to Unity to reflect the changes. I researched how Unity communicates with a web application (specifically Django). So far, pressing a button that represents moving up, down, left, or right calls the corresponding Unity function I wrote to move the 3D model in that direction. Thus we are able to get basic serialized data from an HTML page to the Unity application.
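In outline, the bridge looks something like the sketch below, which uses the createUnityInstance loader that Unity's WebGL build template provides. The build file paths, the "Model" GameObject, and the Move* method names are placeholders for illustration, not necessarily what our actual build uses.

```javascript
// Minimal sketch of the HTML-to-Unity bridge. createUnityInstance comes
// from the loader script in Unity's WebGL build template.
let unityInstance = null;

createUnityInstance(document.querySelector("#unity-canvas"), {
  dataUrl: "Build/webgl.data", // placeholder build file paths
  frameworkUrl: "Build/webgl.framework.js",
  codeUrl: "Build/webgl.wasm",
}).then((instance) => {
  unityInstance = instance;
});

// Wired to the four on-page buttons, e.g. <button onclick="move('Left')">.
function move(direction) {
  if (unityInstance !== null) {
    // Invokes the C# method Move<direction>() on the "Model" GameObject.
    unityInstance.SendMessage("Model", "Move" + direction);
  }
}
```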

This week, I realized that the way I implemented rotation last week requires certain Unity functions that track mouse deltas for you. The problem is that those functions only take input when the user performs a mouse drag and compute the delta from that, whereas we need to produce rotations from data sets (the x, y coordinates of the sensors). I started looking into how to replicate the effect of those functions using just the x, y positions we get from dummy data (which will later be sensor x, y coordinates).
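One way to replicate the mouse-delta behavior, sketched below on the JavaScript side, is to remember the previous sample and subtract it from the current one before forwarding the result to Unity. This reuses the unityInstance handle from the earlier sketch, and "RotateBy" is a hypothetical handler name, not a Unity built-in.

```javascript
// Illustrative sketch: derive mouse-delta-style values from a stream of
// absolute (x, y) samples. prev is null until the first sample arrives.
let prev = null;

function onSample(x, y) {
  if (prev !== null) {
    const dx = x - prev.x; // horizontal movement since the last sample
    const dy = y - prev.y; // vertical movement since the last sample
    // SendMessage only accepts a single value, so the two deltas are
    // packed into one string for the hypothetical RotateBy(string) method.
    unityInstance.SendMessage("Model", "RotateBy", dx + "," + dy);
  }
  prev = { x: x, y: y };
}
```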

For the upcoming week, I am planning to write a script that will send dummy data over WiFi to the web application and apply translations to the 3D model. I am not 100% sure, but I am also thinking about using the MQTT protocol, much like Edward and Anushka did for sending data from the sensors to their visual application. I will also look into the rotation algorithm more this week.
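A dummy-data sender along those lines could be as small as the following Node.js sketch using the mqtt package; the broker URL, topic name, and update rate are all placeholder choices, not decisions we have made yet.

```javascript
// Sketch of a dummy-data publisher (npm install mqtt). Publishes a fake
// (x, y) coordinate pair every 100 ms to a placeholder topic.
const mqtt = require("mqtt");
const client = mqtt.connect("mqtt://test.mosquitto.org"); // public broker

client.on("connect", () => {
  setInterval(() => {
    const x = Math.floor(Math.random() * 200); // dummy x coordinate, mm
    const y = Math.floor(Math.random() * 200); // dummy y coordinate, mm
    client.publish("capstone/trackpad", JSON.stringify({ x, y }));
  }, 100);
});
```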


Team Status Report for 2.26.2022

We delivered our design review presentation on Wednesday. Overall, it went pretty well. Creating the presentation gave us a lot of insight into what the next few weeks are going to look like, and afterwards we discussed our schedule in more depth to cover ground in areas that we still need to work on.

The PCB is being manufactured and will be shipped next week, and we're currently testing individual sensors to see how they behave. Additionally, we decided on MQTT since it is the most commonly used IoT communication protocol, especially with the Particle Photon. We began testing the sensors and developed a visual interface using JavaScript and HTML to see how the sensors detect nearby objects. For the most part, we can tell which sensors are going off when a finger comes close, and we can almost guess which direction we are swiping. This is a good indication that our rotation gesture recognition algorithm may actually work.

We also started the web application portion of our project. We have figured out how to embed the Unity application in our Django web app and communicate data between the web application and Unity.

Together, we brainstormed what the web interface will look like and how it will be implemented. Joanne will primarily be working on getting a demo of that working soon. Edward will test the PCBs once they are shipped. Anushka will work on getting the Jetson to boot and run code. We seem to be on schedule this week.

An issue that arose is that the VL6180X sensor reads data in a 25-degree cone, and when a finger gets too far away from the sensors (in the 75-150mm range), all of the sensors will report that they see an object. We have no idea what this will look like exactly, so we created a visualization to see what happens when a finger gets too far away. Would the finger look like a line? Would it have a noticeable dip? We will need to explore this once the PCB is fully done and verified, which might take a while and put us behind schedule.

Edward’s Status Report for February 26

This week, I worked on firmware for our wearable. I wrote code for our microcontroller, a WiFi-enabled Particle Photon, to connect to a public MQTT broker and publish sensor data. I actually spent a lot of my time trying to get it to work on campus WiFi. At first, I was unable to get it to connect reliably to CMU-DEVICE, but, after futzing around with it for a couple hours, I tried out another board and it seemed to work fine.
The Photon publishes data as a packed byte array, formatted like so:
[<4-byte little-endian timestamp>, <sensor value 1>, <sensor value 2>, <sensor value 3>]
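On the receiving side, a payload in that layout could be decoded with a DataView, as in the sketch below; it assumes each sensor value occupies a single byte, which the format above doesn't actually pin down.

```javascript
// Sketch of decoding the packed payload in the browser. buffer is the raw
// MQTT message as an ArrayBuffer; the 1-byte sensor width is an assumption.
function decodePacket(buffer) {
  const view = new DataView(buffer);
  return {
    timestamp: view.getUint32(0, true), // 4-byte little-endian timestamp
    sensors: [
      view.getUint8(4), // sensor value 1
      view.getUint8(5), // sensor value 2
      view.getUint8(6), // sensor value 3
    ],
  };
}
```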

A hiccup I ran into is that our board cannot get a Unix timestamp with millisecond accuracy, so our measurements of data-collection latency will be incredibly off. I might need to figure out a way around this. However, this only matters for gathering metrics and is not essential to the actual functionality of the device, so I need to weigh the tradeoff between taking the time to fix the problem and just living with an inaccurate latency measurement.

I also worked with Anushka to get a visualization of the sensor data working so that we can understand what kind of data we are dealing with (see video here). We have a setup with three sensors attached to a Photon that publishes data to MQTT. The webpage gets the data from MQTT and displays it. I believe the visualization will help guide us in understanding what a “finger” looks like in terms of sensor data.

Additionally, I ordered the PCBs earlier this week and they should come next Tuesday (3/1). Next week, I will be working on testing and verifying that the PCBs work as intended. This will probably take up a large chunk of my time next week.

I am on track this week, but as soon as the PCBs come, I might fall behind, since they might not work… I want to get the PCBs verified before Spring Break starts.

Anushka’s Status Report for 2/26

This week was heavily design-focused. I was responsible for giving the Design Review presentation, so I spent most of last weekend researching and putting the presentation together. Doing this presentation was very helpful because it showed me which parts of the project we have planned out more thoroughly than others. For example, the “middleware” part of the project, i.e. the Jetson and the communication layer, is where we have the most open questions about the specs and how (or whether) it will fit into the final scope of our project. By contrast, we have the most information on the sensors, which we have already ordered. I now have a clearer understanding of the direction of our capstone and which areas I need to explore more.

Speaking of areas I need to learn more about, I continued tinkering with the Jetson this week. It is still behaving very spottily, and I suspect it has something to do with how I wrote the image to the SD card. As mentioned before, this is the weakest part of our project, so I need to spend more time on it next week. My goal before Spring Break is to get it set up, collect data from the sensors, and send information to the web application. It's a lot, but if I make this my sole focus, it will benefit the project as a whole after we come back.

We ordered individual sensors to test while we wait for the PCB to be manufactured. We began testing them and decided to create a visual interface for the data. My idea was that the yellow dots represent the sensors and the blue dot represents the object being detected.

I originally wanted to create this web application in Django, so I told Edward to code the MQTT communication in Python. However, since the CSS of the “objects” needs to change and would only update when the server refreshed, we decided to move the web application entirely to the client side. I asked Edward to port the code to JavaScript, and then we were able to get an almost instantaneous rendering of the sensor data. A demo is attached at the link below.

https://drive.google.com/file/d/1w0-5nBngHThfDe-A_Iem-NNt1k3NcuCe/view?usp=sharing
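As a rough sketch of that client-side flow (using the browser build of mqtt.js over WebSockets), the snippet below repositions a dot by editing its CSS directly; the broker URL, topic, element id, and JSON payload shape are all placeholder assumptions.

```javascript
// Sketch of the client-side rendering loop: subscribe to sensor data over
// WebSockets and move the "object" dot with no server refresh involved.
const client = mqtt.connect("wss://test.mosquitto.org:8081");
client.on("connect", () => client.subscribe("capstone/trackpad"));

client.on("message", (topic, payload) => {
  const { x, y } = JSON.parse(payload.toString()); // assumed JSON payload
  const dot = document.getElementById("object-dot"); // the blue object dot
  dot.style.left = x + "px";
  dot.style.top = y + "px";
});
```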

For the next week, I will focus on the Jetson and on finishing the design review paper. If I have time after those two, I will fine-tune our visualization so that it can be used as a dashboard to measure the latency and accuracy of the sensors. However, that may become a during- or after-Spring-Break task.

As mentioned before, the Jetson is the weakest part and the greatest risk. Although we have a contingency plan if the Jetson doesn’t work, I don’t see a reason why we have to remove it at the moment. I will reach out to my teammates and the professors on Monday so that we can get over this hump. We’re still pretty much on schedule, since we are getting a lot of work done before the actual sensors get shipped.

Team Status Report for 2.19.2022

This week, as a group, we primarily discussed future design goals. Some major topics we focused on were how we would write the gesture recognition algorithm, more precise testing measures, and a more in-depth flow chart of our device; further design details will be on our design presentation slides, which we worked on afterwards. We also got samples of the sensor that will be on our custom PCB, so we can start working on getting sensor data to the Jetson and then to our web application this week. We placed the order for the custom PCB that Edward designed this week as well.

Individually, we worked on three different areas of our project: the PCB that collects sensor data, the Jetson Nano, and the 3D modeling interface (the Unity web application).

Joanne worked on the Unity section. She scripted one of the three gestures the model should perform when a user inputs a target gesture: the rotation gesture (triggered by the user's swipe) now works, and the translations project well onto the hologram pyramid. She will be working on finishing up the other two gestures (zooming in and out) this week.

Edward worked on finalizing the PCB and testing the VL6180 distance sensors. He made a small testbench using three breakout boards we bought. The sensors seem fairly sensitive and a pretty good fit for our use case. They detect up to around 150-200mm, which fits on a human forearm. Honestly, with some clever math, we may be able to detect finger movements with just three sensors and might not need all 10. So if the PCB ends up not working out, it might be OK to just use the breakout boards? We will need to explore this further next week.
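Purely as an illustration of what that “clever math” might look like (none of this is decided), one simple scheme is an inverse-distance-weighted centroid over the sensors' known positions:

```javascript
// Illustrative only: estimate a finger's lateral position from three
// distance readings by weighting each sensor's known x position by how
// close the finger is to it (closer reading = larger weight).
const SENSOR_X = [0, 20, 40]; // assumed sensor spacing along the board, mm

function estimateFingerX(readings) {
  let weightSum = 0;
  let weighted = 0;
  for (let i = 0; i < readings.length; i++) {
    const w = 1 / Math.max(readings[i], 1); // inverse-distance weight
    weightSum += w;
    weighted += w * SENSOR_X[i];
  }
  return weighted / weightSum; // estimated finger x position, mm
}
```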

Anushka worked on setting up the Jetson Nano and looking into communication protocols to implement. This will be helpful later this week once we work on sensor communication with our test sensors. She will be using this information to guide the web application interface throughout this week. She will also be presenting the design review at the end of the week.

The schedule has moved around a lot since last week, and we’ve updated our current Gantt chart to reflect that, as seen below. The team will meet next week to discuss these updates.

Schedule Updates

We haven't made any major design changes since last week, but we did flesh out what we want the user experience to look like. We will discuss these details during the design review this week. The most significant risk is whether the board will work, which is why we ordered it now: we have some buffer time if we need to make changes to the PCB. We will test as soon as the PCB arrives, but we purchased single sensors in the meantime so that we can test the general behavior of the distance sensors.

Anushka’s Status Report for February 19

This week, my focus was on a myriad of tasks rather than a single dimension. This was prompted by our earlier foray into the hologram pyramids; even though we aren't scheduled to work on them until much later, I wanted to look into the web application interface and the Jetson communication.

After a quick look into the holograms last week and discussions with the professor and the TA, we are considering making the hologram pyramid bigger so that there is more surface area. My idea is to cut square pieces of acrylic, decrease the angles of the trapezoid, and test.

Looking into the Pepper's Ghost illusion, its original use was in magic shows. The original projections are similar to the one in the diagram shown below: the projection is horizontal, and only a single pane of glass is used. I may brainstorm ideas on how to execute this, since most people have access to projectors and could easily replicate it instead of needing a big screen and a pyramid.


Image from Wikipedia: https://en.wikipedia.org/wiki/Pepper%27s_ghost#/media/File:Peppers_ghost_low_angle.jpg

I also began looking into the Jetson. Setting it up is similar to setting up a Raspberry Pi, but I was having some trouble formatting the SD card. I also began researching different communication protocols, including MQTT. Once I set up the Jetson, I want to follow this tutorial to see if we can use it in our final project: https://www.youtube.com/watch?v=WPmzoYwXj00&ab_channel=AshutoshMohanty.

Image of Jetson

We have a design presentation due next week, and I volunteered to present this time. I want to make our presentation more pitch-focused, so I began storyboarding how I imagine the user will use the device. Some screenshots are below. For the presentation, I also worked on a few slides on the wristwatch CAD file and the gesture algorithm, which can be seen under the Design Review tab above.

Screenshots of the user experience

Since we are moving around different parts of the schedule, I don't believe I am behind. If anything, starting on these parts early means we are ahead, but I have to continue working on them to hold that position. Over the next week, I want to do more work on the Jetson, since it will be most relevant once the sensors are completed. Since we ordered some sensors to test with, we can start on Jetson communication at the end of next week, so I'll make that my deadline.

Joanne’s Status Report for 2.19.2022

This week I worked more on the Unity portion of our project. I am working on mimicking the translations that will be applied to the 3D model after a user produces a gesture (i.e., zooming in/out, swiping). I initially wrote a script to move the 3D object along its x, y, z coordinates in response to specific key presses. This was to ensure that the object responds to key presses and that the projected image also translates well. Next, I got the swiping motion, which translates to rotation of the object, to work: the script currently takes the x, y coordinates of the mouse cursor and rotates the 3D object accordingly. I am planning to work on the last gesture (zooming in/out) this week. I have also been discussing with Anushka (who is working on the Jetson Nano) how the data should be serialized (i.e., what kind of data should be sent, and in what format) when sent from the Jetson to the web application.

I also worked on some portions of the design slides and discussed our future design plans with the group.

Edward’s Status Report for February 19

This week, I finalized the PCBs (they should arrive late next week or early the week after). It turns out I had been generating gerber files that did not include the actual drill holes… I also replaced parts that were not available at the board house with two through-holes each, so we can hand-solder the missing parts. We may need to buy a 0.1uF capacitor and solder it on, which shouldn't be too bad. I'm a bit scared that the support-components PCB will not work on the first try, but we can always iterate.


Sensor array PCB rev. 001


“Support” PCB rev. 001

I also bought some VL6180X sensor breakout boards from Amazon; these are the boards I based my PCB design on. I played around with the sensors in Arduino with a Particle Photon as a preliminary test. (I couldn't get the IDE to recognize my ESP32; the Photon has WiFi but not BLE, which is probably OK.) The sensors seem to be fairly sensitive and have good range. Each sensor has maybe a ~1-2mm margin of error relative to the other sensors, which isn't too bad; I tested this by placing a piece of paper in front of three sensors hooked up to my Photon and recording the readings. I implemented sensor reading in a round-robin fashion because of IR interference: each sensor records readings within a 25° (+/- 5°) cone (according to the datasheet), so other sensors can interfere while one sensor is reading. I also added a small averaging filter to the readings.


Sensor testbed
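The round-robin plus averaging idea looks roughly like the sketch below. It is written in JavaScript for consistency with the rest of these notes; the actual firmware is Arduino C++ on the Photon, and readSensor() is a stand-in for the real VL6180X driver call.

```javascript
// Illustrative sketch: read one sensor at a time so its IR pulses don't
// interfere with the others, and smooth each sensor's readings with a
// small moving-average window.
const WINDOW = 4; // samples per sensor in the moving average (assumed size)
const history = [[], [], []]; // recent readings for each of the 3 sensors
let current = 0; // index of the sensor whose turn it is

function readRoundRobin() {
  const value = readSensor(current); // stand-in for the real driver call
  const h = history[current];
  h.push(value);
  if (h.length > WINDOW) h.shift(); // keep the window a fixed size
  const avg = h.reduce((a, b) => a + b, 0) / h.length;
  current = (current + 1) % history.length; // pass the turn along
  return avg;
}
```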

I am on schedule for this week and will work on communication stuff next week with the rest of the team.

Team Status Report for 2.12.2022

Welcome to our project website! We’re so excited to share everything we are doing.

This week we focused on initial designs of the different components: the PCB, Unity, and the hologram pyramid. We designed the PCB and are ready to order it, but we also placed an order for two sensor breakout boards so that we can start testing our measurements as soon as possible, since that order will most likely arrive before our boards do. The Unity environment was intuitive to work with, and we quickly set up the environment we needed for the hologram, creating the layout necessary to project our 3D model onto the hologram pyramid. However, our biggest challenge this week was the result from the hologram pyramid.

After tinkering with the hologram, we are concerned about the scope of this component. We aren't satisfied with the projection of the 3D image, and we are considering either improving it or replacing it with another medium altogether. A demo video of the hologram pyramid is linked here: https://drive.google.com/file/d/1ip7AXfh7wN9jhmTKIRnzeADvgepjKqEY/view?usp=sharing. We will decide what to do after our testing this week. Since we are working on this component earlier than planned, we have enough time to make modifications, and it does not pose a risk to our project at the moment. This may change our requirements, but we cannot make that decision at this time.

Since we were prototyping with multiple pyramids, we had to incur the cost of the plastic. It isn't a huge cost, since our overall projected budget is low, but we do not want to spend too much just cutting new pyramids after every modification. We will try to do more research ahead of time on which dimensions work best before cutting more pyramids.


Next week, we want to accomplish several things, and below is our modified schedule. We decided that we're going to work on parts in parallel to prevent scrambling at the end.

  1. We’re going to be working on improving the hologram and seeing if we can get the image to be continuous across an axis instead of disjoint.
  2. We’re going to be placing the PCB order as soon as we get it approved.
  3. We’ll continue tinkering with Unity and exploring how external inputs can be applied to objects.
  4. We are planning to try out a different size design for the hologram pyramid to see if it improves the quality of 3D model viewing.

Modified Gantt Chart with this week’s tasks and progress

Joanne’s Status Report for 2.12.2022

This past week I worked on the proposal slides with my team and presented them during the proposal review. We got feedback from our classmates and TAs; one piece of feedback was to think about how we were going to implement the hologram visual portion of our project.

Thus, for the latter part of the week, I worked in Unity to create the layout for the hologram. In order to project a hologram onto our plexiglass pyramid, we need to place four perspectives of a 3D model around the base of the pyramid. My main responsibility was to use Unity to create the four perspective views of a sample 3D model and then export it to the web using WebGL. I placed a photo of the running Unity scene below; the white lines are drawn in by me to illustrate how the pyramid would be placed with respect to this view.

I exported the Unity scene to a web browser so we could place the pyramid on top of my iPad (which now displays the Unity scene with the four views of the model). Edward and I helped Anushka laser-cut the plexiglass pyramid she designed. An image of the basic hologram setup we completed is on our team status report page for this week.