Steven Zeng’s Status Report for 03/02/24 and 03/09/24

I wrote Section A of the team report; my response is below.

With consideration of global factors, NutrientMatch can be used by just about anyone across the globe, whether their wellness goal is to consume more calories or to better track the types of food they eat daily. Among younger adults, fitness is a growing trend with the emergence of fitness technology and the expansion of gyms, so more of them are aware of their nutrient intake. Our product is heavily targeted toward this group since they tend to live with roommates rather than with family members, and when groceries are purchased separately it is easy for foods to get mixed up. NutrientMatch tracks each user's food inventory as soon as items are logged to avoid this confusion. Family members may also want to use the product to track their own intake. The product is also accessible to users who are less tech savvy, because the usage process is simple: after each food item is scanned or classified and weighed, the data is automatically forwarded to the cloud database, which performs the backend calculations displayed on the webpage for users to see. Hence, while NutrientMatch is not a necessity in people's lives, the growing trend toward physical wellness makes it a desirable product overall.

Regarding my progress, I did most of my work before spring break and used the designated slack time to relax and clear my head. I focused most of my effort on the design report, including an analysis of design trade-offs for the ML algorithms. Recently, I tested soft-margin SVMs to distinguish between fruits and canned foods, computing the optimal parameters and hyperparameters with 5-fold cross-validation because our data set is limited. The classifier is currently correct only about 74 percent of the time, but I expect that number to rise as I tune the algorithm to address only the three fruits we support (banana, orange, and apple) plus canned foods.
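For concreteness, here is a minimal sketch of that tuning loop in scikit-learn. The `load_features()` helper, the feature dimensions, and the parameter grid are all hypothetical placeholders, not our final feature pipeline; the real accuracy (~74%) comes from our own data, not this stub.

```python
# Minimal sketch of soft-margin SVM tuning with 5-fold cross-validation.
# load_features() is a hypothetical stand-in for our real feature extraction.
import numpy as np
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def load_features():
    """Placeholder: return (X, y) with one row of features per food image
    and labels 0 = fruit, 1 = canned food."""
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 64))
    y = rng.integers(0, 2, size=200)
    return X, y

X, y = load_features()

# Soft-margin SVM: C controls the trade-off between a wide margin
# and tolerating misclassified training points.
pipeline = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
param_grid = {
    "svc__C": [0.1, 1, 10, 100],           # soft-margin penalty
    "svc__gamma": ["scale", 0.01, 0.001],  # RBF kernel width
}

# 5-fold CV, chosen because the data set is small.
search = GridSearchCV(pipeline, param_grid, cv=5, scoring="accuracy")
search.fit(X, y)
print(f"best params: {search.best_params_}")
print(f"cv accuracy: {search.best_score_:.2%}")
```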

Furthermore, I did more research and experiments on ChatGPT-4 and its API for reading labels. The results were quite promising, so our focus will be on ensuring the image is clear enough for ChatGPT to process. Likewise, I am experimenting with database storage and narrowing down exactly what our group wants to store in the database logs. The current plan is to take a picture of the item and pass it to the classification algorithm. If it is a fruit, our product will directly store the fruit's name from the GoogLeNet classifier, along with caloric information calculated from the weight and online nutrition data. If it is a canned food, our algorithm will read the front label, store the name in the database, and attach an image of the back with the nutrition label. This reduces the latency of storing an object and offloads the extra computation to database retrieval, when the nutrition image will be processed. The alternative is to do all the computation while the object is being scanned, which gives better memory utilization but worse latency, and latency is the bigger issue for user experience. We plan to test both approaches and measure the differences in latency and memory utilization.
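To make the deferred-computation plan concrete, here is a rough sketch of the scan-time path. Everything in it is an assumption: the helper names, the approximate per-gram calorie table, and the plain dict standing in for our database are placeholders, not settled code.

```python
# Sketch of the "store now, process later" logging path described above.
# All helpers and values here are hypothetical placeholders.

FRUITS = {"banana", "orange", "apple"}
# Rough kcal-per-gram figures (approximate USDA values), for illustration.
CAL_PER_GRAM = {"banana": 0.89, "orange": 0.47, "apple": 0.52}

def classify_image(image: bytes) -> str:
    """Placeholder for the GoogLeNet/SVM classifier."""
    return "banana"

def read_front_label(image: bytes) -> str:
    """Placeholder for the GPT-4 call that reads the front label."""
    return "tomato soup"

def log_item(front_image: bytes, back_image: bytes,
             weight_g: float, db: dict) -> None:
    label = classify_image(front_image)
    if label in FRUITS:
        # Fruits: calories are computed immediately from weight and
        # per-gram data, so the database row is complete at scan time.
        db[label] = {"weight_g": weight_g,
                     "calories": weight_g * CAL_PER_GRAM[label]}
    else:
        # Canned food: store the name plus the raw nutrition-label image;
        # the panel is parsed later, at retrieval time, keeping scan
        # latency low at the cost of extra storage.
        db[read_front_label(front_image)] = {"weight_g": weight_g,
                                             "nutrition_image": back_image}

db: dict = {}
log_item(b"front-photo", b"back-photo", weight_g=120.0, db=db)
print(db)
```

Timing `log_item` against a variant that parses the nutrition panel inline would give us the latency-versus-memory comparison we plan to run.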

Lastly, I read up on the work Surya has done with the Arduino as well as cloud deployment. The ML portion depends heavily on retrieving information from the Arduino (weight and image data) and on an Amazon EC2 server efficiently running all the computations that populate the database. This involved reading a lot of documentation online to prepare for future stages. Much of the detail came from https://aws.amazon.com/blogs/machine-learning/category/compute/amazon-ec2/, which I highly recommend; it has fascinating case studies and helped me understand the space better.

Overall, I had a relatively productive two weeks and got a good break to clear my head.
