Team Status Report for 3/29

The biggest risk currently is that the main camera module we purchased appears to be broken, which means we need to get a replacement working as soon as possible. Our backup camera is detected, but it behaves very strangely, taking extremely blurry photos of a single solid color, and the images are completely unusable. We are currently planning to use a simple USB webcam for the demo while we get a working replacement.

We have made no changes to our design and nothing major to our schedule, since we expect that getting the camera working in our code, once we have a working one, will be relatively quick. The main schedule change was working around the faulty cameras, but once that problem is addressed, we should be fine.

Outside of the camera issues, the rest of the pipeline is operational and tested, so it is ready for the demo and MVP.

Richard’s Status Report for 3/29

This week, I worked with Eric on debugging the camera and verifying the functionality of everything else. We tried many troubleshooting methods on both our Camera Module 3 and the backup Arducam camera. However, no matter what we did, we could not get the Camera Module 3 to be detected and take pictures. The backup camera does detect, but it only takes extremely blurry pictures in strange colors, such as blue and pink, despite facing a white wall. Ultimately, we decided to move forward with a USB webcam Tzen-Chuen already had for the demo. Outside of the camera, I tested most of the remaining components to confirm they ran smoothly. The whole pipeline, from image capture (simulated with a downloaded image file on the Raspberry Pi) to verification as a match, was tested and ran within our timing requirements. In addition, I fixed some bugs in our website, such as images not displaying properly in fullscreen. The link is the same as the previous week.

Outside of the camera issues, my progress has been on schedule. Since integrating the camera into our pipeline should not be much of an issue once we have a working one, we don’t expect a major delay. By next week, I hope to add a working camera to our pipeline and then work on the SageMaker implementation with Eric.

Eric’s Status Report for 3/29/25

This week, I worked on adding per-user permissions to the front end, but there are still bugs with the login settings that need to be resolved. This is a lower priority for the demo next week, though. I also worked with Richard to continue debugging the Camera Module 3, and we concluded that the Raspberry Pi is not detecting the camera due to a hardware issue with the module itself. We decided to go with a simple USB webcam for the demo. Additionally, Richard and I tested sending matches from the RPi to the Supabase database and confirmed that matches are successfully added to the tables. Progress is a bit behind schedule because of the broken camera: debugging the Camera Module 3 took much more time than expected since it wasn’t a software issue. Next week, I plan to fix the login bugs, get replacement cameras, and continue expanding the Supabase integration with SageMaker, which isn’t part of the MVP.

Tzen-Chuen’s Status Report for 3/29

With the demo on Monday, we have all systems working except the camera, which is alarming to say the least. The camera we originally intended to use, the Camera Module 3, was nearly fully programmed, with pictures being taken and everything, before it stopped being detected. Our backup camera does detect, but it is a much more advanced camera and needs a lot of configuration.

So right now we are working on our third alternative: a USB webcam acting as a substitute, which should need far less configuration than the Arducam, our first backup. While things got hairy, we are on track to have a successful demo.

Team Status Report for 3/22

A significant risk that could jeopardize the success of the project right now is the camera and the Pi. We were able to take images, testing baseline functionality, but when we moved our setup, the camera stopped being detected. Though this can probably be resolved, the impact on our timeline will still be felt. We need to diagnose soon whether the camera, the Pi, or the cable is at fault, and order replacements from inventory or the web.

A minor change was made to the system design: the HTTP POST Supabase edge function was replaced with database event triggers. This change was made because the Raspberry Pi already inserts data directly into the possible_matches table, so the edge-function approach would have added unnecessary complexity.

Right now, our locally run code is finalized aside from the camera problems and can be found here. We also have a working prototype of the website for law enforcement to view matches, which can also be found here.

Richard’s Status Report for 3/22

This week I focused on finalizing the locally run code. I added queue functionality: if a match cannot be sent to the database for whatever reason, such as a loss of internet connection, the data is saved locally and retried later. I also wrote a main function that takes a picture and sends matches every 40 seconds, checks for database updates every 200 seconds, and retries failed database entries every 200 seconds. Due to the problems we have had initializing the camera, that is the only part of the locally run code not working right now. The code can be found here. In addition to the locally run code, I made a prototype of the front-end website using Lovable for law enforcement to look for matches. They can see the currently active Amber Alerts, possible matches sent to the database, and verified matches, where the larger models on the cloud confirm the result of the possible matches. It also has useful filtering options, such as filtering by confidence level or license plate number. The website can be found here.
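The queue-and-retry and loop-timing behavior described above can be sketched roughly as follows. This is an illustration, not our actual code: the names `send_match`, `retry_failed`, and the queue file path are invented, and `upload` stands in for the real Supabase insert call so the retry logic can be exercised without a network connection.

```python
import json
import time
from pathlib import Path

QUEUE_FILE = Path("failed_matches.json")  # illustrative queue location

def load_queue():
    """Load any matches that previously failed to upload."""
    if QUEUE_FILE.exists():
        return json.loads(QUEUE_FILE.read_text())
    return []

def save_queue(queue):
    QUEUE_FILE.write_text(json.dumps(queue))

def send_match(match, upload):
    """Try to upload a match; on any failure, queue it locally for retry."""
    try:
        upload(match)
        return True
    except Exception:
        queue = load_queue()
        queue.append(match)
        save_queue(queue)
        return False

def retry_failed(upload):
    """Retry every queued match, keeping only those that fail again."""
    still_failing = []
    for match in load_queue():
        try:
            upload(match)
        except Exception:
            still_failing.append(match)
    save_queue(still_failing)

def main_loop(iterations, capture_and_send, sync_db, retry, period=40):
    """Capture and send every `period` seconds; sync the database and
    retry failed uploads every fifth iteration (200 s for a 40 s period)."""
    for i in range(iterations):
        capture_and_send()
        if i % 5 == 0:
            sync_db()
            retry()
        time.sleep(period)
```

Making `period` a parameter lets the loop be tested without real 40-second waits.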

My progress is on schedule. By next week, I hope to work with Eric to implement the verification process that promotes possible matches into verified matches, and to have an MVP.

Tzen-Chuen’s Status Report for 3/22

Ideally I would have been finishing the integration of the UI, the Raspberry Pi, and the physical components, but I’ve been stuck on getting the main camera program working. The issue is that the camera appears to be plugged in but is not detected by the computer. This is strange, as it was working just fine before and we were able to take pictures with it.

Aside from the camera, much of this week’s work went to the ethics assignment, which was a really unexpected time sink. Currently I need to solve the camera connection issue as soon as possible and get back to the other parts of the project where I’m needed, like the UI.

Progress is currently behind schedule because of the camera hiccups and the ethics work, but with a bit of extra elbow grease next week we should be back on track. (Hopefully.)

Eric’s Status Report for 3/22/25

This week, I intended to continue testing the end-to-end data upload flow from the Raspberry Pi to Supabase. However, testing was temporarily blocked by some dependency issues on the RPi, which prevented full integration with the latest version of the upload scripts. In the meantime, I focused on other tasks to keep making progress. I refactored the system architecture to use Supabase database event triggers instead of HTTP POST-based edge functions; since the RPi inserts data directly into the possible_matches table, event triggers are the better fit. I also worked out more details with Richard of how the parts will communicate, to minimize the issues we’ll face during integration, and added functionality on the database side to send active alerts to the RPi.
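To make the control-flow change concrete, here is a toy, in-memory stand-in for the trigger idea. Real Supabase event triggers are defined in SQL on the Postgres side; the `Table` class, the `verify` callback, and the 0.9 confidence threshold below are all invented for illustration only.

```python
class Table:
    """Minimal in-memory table that fires callbacks after each insert,
    mimicking a database event trigger."""
    def __init__(self, name):
        self.name = name
        self.rows = []
        self._triggers = []

    def on_insert(self, callback):
        self._triggers.append(callback)

    def insert(self, row):
        self.rows.append(row)
        for trigger in self._triggers:
            trigger(row)

possible_matches = Table("possible_matches")
verified_matches = Table("verified_matches")

def verify(row):
    # Stand-in for the cloud-side model check; the threshold is made up.
    if row.get("confidence", 0) >= 0.9:
        verified_matches.insert(dict(row))

# The RPi just inserts a row; no HTTP POST to an edge function is needed,
# because the "trigger" runs automatically on insert.
possible_matches.on_insert(verify)
```

The point of the refactor is visible here: the producer (the RPi) only does an insert, and the verification step is attached to the table rather than to a separate HTTP endpoint.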

Although testing on the RPi is slightly behind schedule due to the dependency issues, we plan to resolve that as soon as possible and begin end-to-end testing. Next week, I aim to begin implementing the SageMaker API with the Supabase code and start full testing.

Team Status Report for 3/15

Based on the feedback we received on our design report, our current risks are that we still need to work out the cloud implementation in more detail and produce something we can integrate with the edge compute part, and that we need to test the camera as soon as possible to confirm it meets our needs. To that end, we have gotten a barebones and minimal but still usable cloud implementation for our case, and Tzen-Chuen will be hooking up the camera in the next couple of days.

Currently, there are no changes to the existing design. We forecast changes next week, however, as the major components should be integrated and we will begin testing our MVP, likely learning what could be improved. The current schedule is MVP testing next week, with work on new or replacement components and the requisite changes the week after that.

Right now, we have the locally run code mostly finished, and it can be found here. We have also made large strides on the cloud side of things, with the databases set up.

Richard’s Status Report for 3/15

This week I focused on the integration between the Raspberry Pi and the cloud. I worked on the database update code: a Python function retrieves the latest version of the Amber Alert database from Supabase and updates the locally saved database with it. If the Raspberry Pi does not have an internet connection, it simply continues using the locally saved version. I also wrote the code that checks for matches in the database: a loop compares the OCR'd text against the database, and if a match is found, the original image is uploaded to a Supabase bucket and an entry is made in a “possible matches” table for further verification by the cloud models. We will later integrate Eric’s edge function to move more processing to the cloud. The updated code can be found here.
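A minimal sketch of the update-with-fallback and match-check logic described above. The cache path, the alert schema, and the function names are assumptions, and `fetch_latest` stands in for the actual Supabase query, so the offline fallback can be shown without a network connection.

```python
import json
from pathlib import Path

LOCAL_DB = Path("amber_alerts.json")  # assumed local cache path

def update_local_db(fetch_latest):
    """Refresh the local alert database from the cloud. On any failure
    (e.g., no internet connection), keep using the saved local copy."""
    try:
        alerts = fetch_latest()
        LOCAL_DB.write_text(json.dumps(alerts))
    except Exception:
        pass  # offline: fall back to whatever is already cached
    if LOCAL_DB.exists():
        return json.loads(LOCAL_DB.read_text())
    return []

def check_match(ocr_text, alerts):
    """Return the matching alert entry if the OCR'd plate is in the
    database, else None. (In the real code, a match also triggers the
    image upload and the possible_matches insert.)"""
    for alert in alerts:
        if alert["plate"] == ocr_text:
            return alert
    return None
```

Because `update_local_db` always returns whatever is in the cache after the fetch attempt, the Pi keeps matching against the last known alert list whenever the connection drops.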

My progress is on schedule. By next week, I hope to finalize the locally run code and work on the edge function that runs when a match is uploaded, to corroborate it with the larger models.