Rebecca’s Status Report for March 29, 2025

Report

I soldered pins into the I2C GPIOs on the Rasppi boards to make accessing them simpler. With a steadier metal connection I was able to test the Python version of the I2C library and got it working as well, which makes wrapping the core code on each of the boards in the necessary communication harness much simpler, since it’s all in the same language (and I also have an example of it working event-based, instead of just polling on every loop, so I don’t have to fight through figuring out event-based code in C++). I measured the current draw of each of the Rasppis running their core code, so I know how heavy a battery I need to purchase, and it actually turned out to be a little less than I expected. 1200mAh should do it. I’ve put in an order for a 3.7V LiPo of that size (I think? This week has been a little hazy in general. If not, I’ll do it tomorrow; either way it should be ordered early next week), and I have a 1200mAh LiPo on hand from a personal project that I can work with and wire things to on a temporary basis before it arrives.
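For reference, here is a minimal sketch of the event-based shape described above, using pigpio’s BSC peripheral to act as an I2C slave in Python (the slave address and the byte handling are placeholders rather than our actual harness, and the pigpiod daemon has to be running):

    # Minimal sketch: event-driven I2C slave on a Raspberry Pi via pigpio's BSC
    # peripheral. The 0x13 address and the print-based handling are placeholders.
    import time
    import pigpio

    SLAVE_ADDR = 0x13  # placeholder 7-bit slave address

    pi = pigpio.pi()  # requires the pigpiod daemon to be running

    def on_bsc_event(event_id, tick):
        # Fires when the master writes to us; drain the received bytes.
        status, count, data = pi.bsc_i2c(SLAVE_ADDR)
        if count:
            print(f"received {count} byte(s): {list(data)}")

    cb = pi.event_callback(pigpio.EVENT_BSC, on_bsc_event)
    pi.bsc_i2c(SLAVE_ADDR)  # enable the BSC peripheral as an I2C slave

    try:
        time.sleep(3600)  # idle; all the work happens in the callback
    finally:
        cb.cancel()
        pi.bsc_i2c(0)  # disable the BSC peripheral
        pi.stop()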

Also, the display arrived! As I understand it, it arrived on time (March 22), but I didn’t actually get the ticket from receiving until much later in the week, since the original one went to the wrong email (one attached to my name but on the wrong domain, which I thought had been closed. Oops). But I have it now. It’s here! I feel so much less terrified of this thing now that it’s here! I still need to get my hands on a reflective surface (I’ll probably just order little mirror tiles and cut them to size, or a reflective tape that I can stick to the same sort of acrylic that I’m going to cut the lenses out of. Gonna see what’s cheaper/faster/etc. on Amazon).

I modified the draft of the CAD so I’ll be able to mount the Rasppis and the camera to it for the interim demo. I ran out of time to do anything else; the rest of the changes are more complicated, and the display came too late in the week for me to fight with it.

Progress Schedule

Things are getting done. I don’t know. It’s Carnival week and I am coming apart at the seams. I will reevaluate after next Sunday.

Next Week’s Deliverables

Interim demo. That’s it. I plan on printing the headset and attaching the boards I have tomorrow, and then I’ll have the wire lengths to connect the I2C pins. It’s gonna get done.

 

Charvi’s Status Report for 3/29/25

This week, our team got some feedback about a lack of complexity, specifically on my end, as the webapp on its own wasn’t complex enough. As a result, we reshuffled some responsibilities and assigned new ones. I finished what I planned to do; here is a summary:

I completely developed the pygame display which will be shown on our glasses, including all the instructions (go forward, go back, show ingredients, elaborate steps). This information will be displayed to the user at all times as a HUD, so I took extra care to provide text wrapping, intuitive controls (e.g. locking progression on steps while ingredients are up to reduce input confusion, and making sure “back” returns to the beginning of the previous step and not just the previous wrapped text), and easy-to-edit variables for the line count and textbox sizes in case things change when the program is rendered on the actual display. I also added a progress bar at the bottom of the glasses to show how many steps have been completed. I used arbitrary keyboard inputs as placeholders, which Diya then attached to the gesture recognition signals. The display output is now fully hooked up to the gesture recognition.
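For concreteness, here is a minimal sketch of that kind of greedy word wrap plus a bottom progress bar in pygame; the window size, font, colors, and line/box limits are placeholder values rather than the ones in the actual display code:

    # Sketch of HUD-style text wrapping and a step progress bar in pygame.
    # All sizes, colors, and limits here are placeholders.
    import pygame

    MAX_LINES = 4      # easy-to-edit line limit
    BOX_WIDTH = 380    # placeholder textbox width in pixels

    def wrap_text(text, font, max_width):
        """Greedy word wrap: add words to a line until it would overflow."""
        lines, current = [], ""
        for word in text.split():
            candidate = (current + " " + word).strip()
            if font.size(candidate)[0] <= max_width:
                current = candidate
            else:
                lines.append(current)
                current = word
        if current:
            lines.append(current)
        return lines[:MAX_LINES]

    def draw_step(screen, font, step_text, step_index, total_steps):
        screen.fill((0, 0, 0))
        for i, line in enumerate(wrap_text(step_text, font, BOX_WIDTH)):
            screen.blit(font.render(line, True, (255, 255, 255)),
                        (10, 10 + i * font.get_linesize()))
        # progress bar along the bottom of the display
        frac = step_index / max(total_steps, 1)
        pygame.draw.rect(screen, (80, 80, 80), (10, 220, BOX_WIDTH, 8))
        pygame.draw.rect(screen, (0, 200, 0), (10, 220, int(BOX_WIDTH * frac), 8))
        pygame.display.flip()

    if __name__ == "__main__":
        pygame.init()
        screen = pygame.display.set_mode((400, 240))
        font = pygame.font.SysFont(None, 24)
        draw_step(screen, font, "Dice two onions into quarter-inch cubes.", 2, 7)
        pygame.time.wait(2000)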

Tomorrow, I need to add the functionality of recipe experience levels (exp gained by finishing recipes) and display them on the profiles on the webapp. Diya is currently experiencing some git issues which have temporarily slowed down progress, as the changes on her branch are not merging cleanly with the others (something to do with virtual environment issues), but we are resolving that, and then I will implement this functionality.

We also discussed what we can implement after the demo, including what exactly we want to do with the networking feature.

I had the idea that, to integrate the glasses and the webapp networking features more, we should have a feature that allows the user to pick one other person to cook with before starting the recipe; once it starts, the user should be able to see that person’s progress on the recipe live on the glasses. This will require some lower-level work with WiFi connections / networking, and will also require some work on the Arduino and I2C, so I’m excited to work on this after the interim demo.

EDIT: Upon discussion today (Sunday, March 30th), we have decided that an analytics feature may be more useful and better focused on the user requirements and use case. More on this in the next report.

In addition, if the display hardware is able to handle the current display program, Diya and I thought it would also be cool to show the recipe selection on the glasses. I will be in charge of this if it happens; it would be another task for after the interim demo.

Team Status Report for March 22, 2025

Project Risks and Mitigation Strategies

Everything, as it sort of always does, takes longer than you think it will. With the interim demo and Carnival both fast approaching, the amount of time we can possibly sink into this project is becoming very limited, and we’re beginning to seriously consider which parts of it we need to hit as soon as possible, and which can be put off until after the demo or potentially (unfortunately) pitched, without damaging the meat of the project.

Changes to System Design

We have not yet committed to it, but we’ve begun seriously discussing (as we recognized some time ago might be necessary) dropping Bluetooth communication in favor of WiFi. We were worried about how power-hungry WiFi is, but since the headset needs to be in contact with the web app at only a few particular points, we may be able to mitigate this. WiFi is simpler for the hardware, as it’s already set up on the Rasppis and reduces the number of different things we need to figure out, and for the software, since we know our server host cooperates better with WiFi than with Bluetooth. This may become a system design change soon.

Schedule Progress

Independent schedule progress is addressed in our individual reports. While the order of some tasks has shuffled, our work is roughly on track.

Web scraping works now, and Diya and Charvi will work together to integrate this into the webapp in the coming week.

Rebecca’s Status Report for March 22, 2025

Report

I’ve got HDMI output working from the Rasppi without a camera. As per the usual, everything that could go wrong did go wrong, and I spent an unfair amount of time troubleshooting. The display was meant to arrive today (March 22), so in theory, if it did, I’ll get word about it from the receiving office on Monday. I’ve got access to a device that takes AV input, so if the display isn’t here by then I’ll put in an order for an AV cable, cut an end off it, and solder the free wires directly to the Rasppi’s test pads. Then, when I need to hook it up to the final output, I can just cut the cable again to get the necessary length and bare the other end of the wires. I might end up with a little more insulation than I was expecting, but really, I can’t imagine it’ll be anything more than marginal.

I’ve been working today (and will return to it after I finish this writeup) on getting the Rasppis to talk to each other over I2C. In theory it’s possible, but since their default settings are strongly weighted toward being I2C masters, getting one to act as a slave is proving inconvenient (as per, again, the usual), though every document and forum post I’ve found more recent than 2013 holds that the hardware is capable of it and the software exists to make it happen. Worst case, I resort to using the GPIOs as plain GPIOs and just manually run a barebones protocol for the communication, which I think should be fine, considering we are not running more than, like, a single byte a handful of times a second across the line.

Edit, two hours later: it works!!

Currently the slave device is running C code that consists of an infinite loop constantly monitoring for messages. I’d like to swap this out for Python (for language consistency) that does event monitoring, to reduce the loaded power consumption. The wires between my two boards are NOT soldered in right now, which feels… suboptimal, but hey, whatever works. Or works sometimes, I guess.

Progress Schedule

I’ll do my best to get them talking to each other tonight; if I can’t, the display arriving becomes my real hard deadline. They are talking.

I also really actually need to order the power supply this week. It is still very much on my radar.

Next Week’s Deliverables

If I can catch just a teeny tiny bit of luck, at least one of my displays will have actually arrived this weekend and I can pry it apart next week. Then the power supply will be the only thing left to order, and I can put all of the pieces together, even if they’re only powered by my laptop.

Charvi’s Status Report for 3/22/25

This week, I worked on the webapp further.

I have finished functionality for the recipe page, including adding reviews and displaying information about the recipe like ingredients and steps.

I have also finished functionality for the recipe selection page including filtering with tags.

It does not look great, but the functionality works as it should and is pretty complete.

I have so far been testing by manually inputting recipes in a form, which includes inputting the name, steps, ingredients, and a tag. This should be pretty easy to migrate to real recipes: I have set up the models so that reading the JSON file Diya has scraped from the web into them should propagate through the whole existing webapp, though I am sure there will have to be some adjustments.
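As a rough sketch of what that migration could look like (assuming a Django-style ORM; the app name, model fields, and JSON keys below are placeholders that will depend on the scraped file’s actual format):

    # Sketch of loading scraped recipes from a JSON file into the webapp's models.
    # "recipes.models.Recipe" and the field/key names are hypothetical placeholders.
    import json

    from recipes.models import Recipe  # hypothetical app/model names

    def load_recipes(path="recipes.json"):
        with open(path) as f:
            scraped = json.load(f)
        for entry in scraped:
            # Create the recipe if it's new, or update it if it already exists.
            Recipe.objects.update_or_create(
                name=entry["name"],
                defaults={
                    "ingredients": "\n".join(entry.get("ingredients", [])),
                    "steps": "\n".join(entry.get("steps", [])),
                    "tag": entry.get("tag", ""),
                },
            )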

I have accomplished the goals I set for myself last week, so I would say I am on track.

The next step is spending some time finishing up the recipe running page and adding a score bar (for the player level) on the profile page that updates with completed recipes; this should be quick. I aim to finish this page tomorrow, so I can be on track to complete the remaining integration tasks. Then, Diya and I will work on integrating the actual web-scraped recipe database into the webapp, which will probably take some time, as there will be a few details that have to be adjusted here and there. Once that is done, I can remove all the placeholder data. Lastly, the entire team needs to figure out how to connect this application to the glasses. This includes getting text to display on the glasses, which the three of us will figure out. Once that is done, the webapp portion should be good to go for the interim demo. If there is extra time, Diya and/or I can work on making the webapp look good (though I will probably spend some time rearranging things to make the website more legible anyway).

 

 

Team Status Report for March 15, 2025

Project Risks and Mitigation Strategies

Since all three of us are very involved in Carnival (booth and buggy), we have decided to set a deadline of Friday, March 28th for completing the work that is necessary for the interim demo. This way, we will be able to ensure that our capstone project is as complete and ready to go as possible before week-of-Carnival responsibilities descend upon us all, as both booth and buggy become extraordinarily time-consuming that week in particular. Worst case, if we are a little behind, we can use the weekend before Carnival to finish up any last-minute work.

Changes to System Design

One change we decided to make after talking to Gloria was to add a feature for the user on the glasses: an additional gesture that prompts the HUD to display a definition / elaboration of a step which may have field-specific terminology or other words with which beginners may not be familiar. For instance, if the user asks for elaboration on the step “dice two onions,” the display would return new text explaining that dicing means cutting the onion into uniform quarter-inch cubes, and possibly a technique recommendation. This should not be too hard to implement on either the backend or the display, and we felt it would be a good way to further utilize the gesture language and better serve our users’ requirements.

Schedule Progress

Independent schedule progress is addressed in our individual reports. While the order of some tasks has shuffled, our work is roughly on track.

Rebecca’s Status Report for March 15, 2025

Report

Changing the WiFi on a Raspberry Pi without entirely rewriting the OS (using the imager) turns out to be a relatively straightforward task, assuming you have current access to the OS. Changing the WiFi on a Raspberry Pi without entirely rewriting the OS while you don’t, i.e., when you’re on the opposite side of the state from the network it’s set up for, is virtually impossible. It didn’t use to be, though: on previous versions of Raspberry Pi OS, pre-Bookworm, it was just a matter of creating a specific file in the boot directory of the SD card and putting the network information there in a specific format. And since so many people in the Rasppi community simply do not like to call out by name the version of the OS they’re working with, it took a frankly unreasonable amount of time to figure out that that method had been deprecated on the version I’m using, and that’s why it wasn’t working. (In fairness, I suppose, the new version of the OS is very new, only a few months old at time of writing, so the vast majority of the discussion out there predates it. Unfortunately, the new version of the OS is very new, so the vast majority of the discussion out there predates it!)

CMU-DEVICE requires registering the device’s hardware (MAC) address, which is easily identifiable with arp -a in my laptop’s terminal, given that the laptop and the Rasppi are on the same network, I know the Rasppi’s IP address, and the two have recently been in contact. What I ended up doing was flashing my second SD card with the WiFi information for my cellphone’s hotspot, connecting both it and my laptop to that hotspot, using an IP scanner to identify the Rasppi’s IP address, pinging it, and then calling arp to get the MAC address. Success. My device is registered with the WiFi! Now, how do I get the Rasppi onto the WiFi? It’s no longer something stored on the hardware; I need to modify the SD card with all of my work on it from last week without destroying it.

There’s no good way, it turns out. I ended up changing the login information of my phone’s hotspot to spoof my home network so the Rasppi would connect to it, then SSHing in over that network to use raspi-config to update the information. It felt very silly, but it worked, so sure! Alright! In retrospect, if I had started by spoofing the old network I could have skipped using the other SD card entirely, so if I have to change the information again going forward, that’s the way I’ll do it.

My week has been… nothing short of insane, on account of one specific project I have in another class that ate me alive, so I haven’t gotten a chance to sit down in front of a monitor that takes HDMI or to wire up the Rasppis so they can talk to each other. I’ve done a good bit of research and am pretty sure I know how to make the I2C, HDMI, and AV out work, so, RF project willing, I’ll be sitting down early this upcoming week to get at least temporary wires running between the Rasppis. I’ll probably have to solder them in, since the boards don’t have pins, but I’m going to try to do the lightest-weight job I can, since I’ll have to take it out and redo it eventually. I also realized that I need to get my hands on another USB Micro cable, since I only have the one but will have to test-power both boards at once pretty soon. Gonna ask around this weekend to see if anyone I know has one lying around that I can borrow, then just order one on Amazon early next week if not.

Progress Schedule

Unfortunately, the radio-frequency project that came out of left field (I knew it was coming; I didn’t expect it to be nearly as insanely difficult as it was) has put me on the back foot with regard to literally everything else. I might have to abbreviate some of the HDMI work I was planning on doing, since the displays are due to be delivered pretty soon. Gonna be playing catch-up this week.

Next Week’s Deliverables

I need to get the boards talking to each other, which may be early this week or late, depending on whether or not I can get my hands on another USB Micro cable quickly. I also want to get HDMI out working, since that was supposed to happen this week and ended up falling by the wayside.

Diya’s Status Report for 3/15/25

This week, I worked on our ethics assignment, completing the necessary tasks and addressing ethical considerations related to our project.

I also spent considerable time researching and learning how to handle specific tasks such as creating .task files for the Raspberry Pi and implementing web scraping techniques. After discussions with Rebecca, we realized integrating gesture recognition onto the Raspberry Pi is more challenging than initially anticipated, mainly due to compatibility issues with .py files. I have begun developing a .task file to resolve this and plan to test it with Rebecca next week.
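For reference, here is a rough sketch of how a gesture-recognizer .task bundle can be loaded and run with the MediaPipe Tasks Python API, assuming that is the route we end up using on the Pi (the model path, test image, and gesture names are placeholders):

    # Sketch of running a gesture recognizer from a .task bundle with MediaPipe Tasks.
    # The model path and test image are placeholders; real input would come from
    # camera frames on the Pi.
    import mediapipe as mp
    from mediapipe.tasks import python as mp_tasks
    from mediapipe.tasks.python import vision

    options = vision.GestureRecognizerOptions(
        base_options=mp_tasks.BaseOptions(model_asset_path="gesture_recognizer.task"),
        num_hands=1,
    )
    recognizer = vision.GestureRecognizer.create_from_options(options)

    image = mp.Image.create_from_file("test_frame.jpg")  # placeholder frame
    result = recognizer.recognize(image)

    if result.gestures:
        top = result.gestures[0][0]  # best gesture for the first detected hand
        print(f"{top.category_name}: {top.score:.2f}")
    else:
        print("no hand / gesture detected")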

Additionally, I’ve been exploring web scraping to automate the recipe database, avoiding the manual entry of 100 recipes. I’m currently writing a script for this task and plan to test it this weekend.
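As a sketch of the general shape such a script might take (the URL and the CSS selectors are placeholders; each recipe site will need its own selectors, or a parser for its structured/JSON-LD data):

    # Sketch of a recipe scraper that writes out a JSON file for the webapp.
    # The example URL and the CSS selectors are placeholders, not a real site's.
    import json
    import requests
    from bs4 import BeautifulSoup

    def scrape_recipe(url):
        page = requests.get(url, timeout=10)
        page.raise_for_status()
        soup = BeautifulSoup(page.text, "html.parser")
        return {
            "name": soup.select_one("h1").get_text(strip=True),
            "ingredients": [li.get_text(strip=True) for li in soup.select(".ingredient")],
            "steps": [li.get_text(strip=True) for li in soup.select(".instruction")],
        }

    if __name__ == "__main__":
        urls = ["https://example.com/recipe/1"]  # placeholder list of recipe pages
        with open("recipes.json", "w") as f:
            json.dump([scrape_recipe(u) for u in urls], f, indent=2)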

Looking ahead, my primary focus for next week will involve testing these implementations. Given the complexity of the integration, I want to ensure that I have enough time for the integration phase to address any blockers that I might run into.

Charvi’s Status Report for 3/15/25

This week, I worked on the ethics assignment and further worked on the webapp.

I have the login and registration completed from before, and this week I also worked a lot on the backend and planned out the overall website / backend structure.

I have been focusing on the recipe selection page primarily, as well as adding features to the profile page to reflect the changes we made last week.

In addition, after a conversation with staff, we decided to add a feature in which users can request further elaboration on a cooking term in a step that will show up on the display. I’ve accounted for this feature in the webapp structure.

I did not have much time to work on capstone this week, as I landed from my flight Monday night and had a large assignment due for another class on Thursday, so I am behind and was not able to complete the functionality of the entire website, which was the goal I assigned myself last week. However, I do think this was an unrealistic goal, and I think I have more time than I initially thought to work on my end of the project, as integration will be happening later than initially planned. In addition, I will have a lot more time to work on capstone this coming week.

This week, I absolutely must complete at least the recipe selection and recipe running pages, with steps showing, in order to be on track. I am confident that I can get this done, since I have the backend set up and I have more time this week. Once that is done, I will continue working on functionality for the rest of the webapp.

Diya’s Status Report for 3/8/25

Last week, I focused heavily on the design report, contributing significantly to refining the software details and web application requirements. I worked on structuring and clarifying key aspects of our system to ensure that our implementation aligns with our project goals. A major portion of my work involved ironing out details related to gesture recognition, particularly ensuring it aligns with our defined gesture language. This included adjusting parameters and troubleshooting inconsistencies to improve accuracy. I have attached a photo of an example of the gesture recognition for the defined gesture language in the design report.

In the upcoming week, my main focus will be on improving the accuracy of gesture recognition. This will involve fine-tuning detection thresholds, reducing latency, and optimizing the system for different environmental conditions to ensure robustness. I will also continue working on refining the design report if needed and contribute to the integration of the gesture system into the broader application.