Jana’s Status Report for 04/26/2025

This week I got the mister working. I also finalized the collection of image and sensor data for the ML plant health detection model. In total we now have 746 data points, collected from 14 plants across 4 species. Based on testing results I made some adjustments to the ML model, mainly fine-tuning the image and sensor models on our dataset before training the late fusion classifier. With these adjustments, I was able to achieve the following results:

True Positive Rate (TPR): 0.9091

True Negative Rate (TNR): 0.9496

False Positive Rate (FPR): 0.0504

False Negative Rate (FNR): 0.0909

Although the results don’t quite meet the requirements I initially set in the use case and design requirements (FPR <5%), I believe the model performs reasonably well given our dataset and time constraints, so I will be adjusting the design requirements to FPR and FNR <10%.
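For reference, the four rates above come straight from the binary confusion matrix. A minimal pure-Python sketch (the labels below are made up for illustration, not our actual test set):

```python
def classification_rates(y_true, y_pred, positive=1):
    """Compute TPR, TNR, FPR, FNR for a binary classifier.

    Here the positive class is 'unhealthy' (label 1)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p == positive)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p != positive)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t != positive and p != positive)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t != positive and p == positive)
    return {
        "TPR": tp / (tp + fn),  # sensitivity / recall
        "TNR": tn / (tn + fp),  # specificity
        "FPR": fp / (fp + tn),  # = 1 - TNR
        "FNR": fn / (fn + tp),  # = 1 - TPR
    }

# Toy example: 1 = unhealthy, 0 = healthy
rates = classification_rates([1, 1, 0, 0, 1, 0], [1, 0, 0, 0, 1, 1])
```

Note that FPR and FNR are just the complements of TNR and TPR, which is why the four numbers above pair up.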

Aside from that, Zara and I began testing the overall system on 3 African Violet plants. We placed the plants in the greenhouse, set it to automatic mode, and have been monitoring them daily to ensure that the system works as required (lights turn on according to schedule, heaters/misters/watering work according to PID control, etc.).
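The PID loop mentioned above can be sketched roughly as follows; the gains, the toy first-order "greenhouse" model, and the update rate are illustrative, not our tuned values:

```python
class PID:
    """Minimal PID controller with output clamping and integral anti-windup.

    Assumes ki > 0 (the anti-windup clamp divides by ki)."""
    def __init__(self, kp, ki, kd, out_min=0.0, out_max=1.0):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.out_min, self.out_max = out_min, out_max
        self.integral = 0.0
        self.prev_error = None

    def update(self, setpoint, measured, dt):
        error = setpoint - measured
        self.integral += error * dt
        # Basic anti-windup: keep the integral contribution within output bounds
        self.integral = max(self.out_min / self.ki,
                            min(self.out_max / self.ki, self.integral))
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        out = self.kp * error + self.ki * self.integral + self.kd * derivative
        return max(self.out_min, min(self.out_max, out))

# Toy plant: heater power warms the air, heat leaks back to ambient
pid = PID(kp=0.5, ki=0.02, kd=0.05)
temp, ambient, setpoint, dt = 20.0, 20.0, 25.0, 1.0
for _ in range(300):
    power = pid.update(setpoint, temp, dt)              # 0..1 duty cycle
    temp += (2.0 * power - 0.1 * (temp - ambient)) * dt  # simple heat balance
```

The integral term is what lets the controller hold the setpoint despite constant heat loss; the clamp keeps it from winding up while the heater is saturated.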

I am on schedule, as I have now completed all my tasks. I am now focusing on getting the final assignments such as the poster, video, and report done.

Next Week’s Deliverables:

  • Complete poster
  • Complete video
  • Complete demo
  • Complete report

Team Status Report for 04/26/2025

This week, we finished implementing the mister actuator, the heater actuator, and the LED matrix display. We also tested the late fusion machine learning model for plant health classification, fixed several WebApp bugs, and added new functionality to improve usability. Integration between the WebApp, the actuators, and the control system is progressing smoothly, and the automatic control logic has been successfully tested under manual and automatic modes.

Progress:

  • Mister actuator implemented and functional
  • Heater actuator implemented and functional
  • LED matrix displays actuator statuses and plant health classification
  • Late fusion ML model tested successfully
  • WebApp bugs fixed and new features added
  • Sensors and actuators integrated with WebApp
  • Automatic and manual control tested
  • System operating in current greenhouse setup

Significant Risks: The most significant risk at this point is transporting the greenhouse from our current room in HH to the demo room in UC. There is a chance that something could get displaced or broken during the move, and it would be difficult to quickly fix major issues because we only have about 30 minutes to set up. To mitigate this, we plan to arrive early on the day of the demo to carefully move the greenhouse, perform a full functionality test, and ensure everything is working before entering the demo area. We will bring backup materials, tools, and spare sensors just in case.

Design Changes: No major changes were made to the overall system design, requirements, or block diagram this week. The system has now fully met the original specifications, and all actuators and sensors are operational. Any small adjustments were purely for bug fixes and user interface improvements on the WebApp. These had minimal cost impacts and were necessary for improving the reliability and user experience of the system.

Next Steps:

  • Finish testing stabilization times of the greenhouse system
  • Fix a few minor WebApp bugs
  • Finalize WebApp user interface
  • Complete the final poster, demonstration video, and written report
  • Prepare and practice for the final demo presentation

Testing:

The following are the tests we carried out over the system:

Unit Tests:

  • Sensor Accuracy tests (Zara)
    • We compared the temperature and humidity sensors against calibrated measuring equipment and found that the sensor readings were within 1.51% of the reference values
    • For the other sensors we could not afford calibrated reference equipment within our testing budget, so we instead tested relative changes in the collected data in response to actuator-driven changes in the system. We found that the light and soil sensors responded appropriately to changes in the environment (e.g. each step in light intensity consistently produced a change of at least 200 lux; soil moisture responded differently for each plant, but consistently decreased when a plant was not watered for days and increased after regular watering)
  • Sensor Latency (Zara) 
    • We sent data from our hardware to webapp and compared timestamps to find the latency
    • We found that sensors connected directly to the raspberry pi had a 0.5s latency from sensor to webapp
    • Sensors connected to the arduino had a 2.8s latency as information travelled from sensor to arduino to raspberry pi to the webapp
  • Actuator Accuracy (Zara)
    • We compared set values to actual values reached after stabilization
      • We found that for light intensity, increasing heat, and increasing humidity we reach the target values 100% of the time. However, if a plant required a temperature cooler than room temperature, we could not reach the target, as the greenhouse has no cooling system. Since we do not have time to implement one, we instead rely on the evaporative cooling provided by the mister.
  • Actuator Stabilization time (Zara)
    • We plan to measure the time between when a target value is set and when the system stabilizes at it
    • This is still in testing as we are collecting data to measure this time
  • Webapp User Experience (Yuna)
    • We plan to have users rate 1-5 on ease of use, design, functionality, performance
    • This is yet to be tested
  • Live Streaming Latency (Jana):
    • We tested the live streaming latency by comparing the timestamp of when a frame was captured by the RPi camera to the timestamp of when it was displayed on the WebApp. We tested this for about 1670 frames, and found that the average latency is 1.95 seconds, which is below our requirement of latency < 2 seconds.
  • Plant Identification API (Jana)
    • We tested the Plant Identification API on our dataset of 746 plant images (applying random transformations such as rotation, noise, blur, etc.), and it correctly identified each of the 4 species every single time (a TPR/recall of 100%).
  • ML Plant Health Model (Jana)

I tested the ML Plant Health model in 4 stages:

Stage 1: Image model (online data)

I tested the 3 models (ResNet18, ResNet50, and MobileNetV2) on 2 different online datasets (PlantDoc, and an open source houseplant dataset on roboflow).

I got the following results:

Houseplant dataset:

ResNet18: 

  • True Positive Rate (TPR): 0.8302
  • True Negative Rate (TNR): 0.9491
  • False Positive Rate (FPR): 0.0509
  • False Negative Rate (FNR): 0.1698

ResNet50:

  • True Positive Rate (TPR): 0.8302
  • True Negative Rate (TNR): 0.8889
  • False Positive Rate (FPR): 0.1111
  • False Negative Rate (FNR): 0.1698

MobileNetV2:

  • True Positive Rate (TPR): 0.8774
  • True Negative Rate (TNR): 0.9537
  • False Positive Rate (FPR): 0.0463
  • False Negative Rate (FNR): 0.1226

PlantDoc dataset:

ResNet18: 

  • True Positive Rate (TPR): 0.9653
  • True Negative Rate (TNR): 0.9176
  • False Positive Rate (FPR): 0.0824
  • False Negative Rate (FNR): 0.0347

ResNet50:

  • True Positive Rate (TPR): 0.9422
  • True Negative Rate (TNR): 0.8824
  • False Positive Rate (FPR): 0.1176
  • False Negative Rate (FNR): 0.0578

MobileNetV2:

  • True Positive Rate (TPR): 0.9538
  • True Negative Rate (TNR): 0.8941
  • False Positive Rate (FPR): 0.1059
  • False Negative Rate (FNR): 0.0462

Based on these results, I chose the ResNet18 model trained with the PlantDoc dataset.
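One simple way to back that choice numerically is balanced accuracy, (TPR + TNR)/2, computed from the tables above. The report doesn't commit to a single selection metric, so this is just one reasonable tie-breaker, but it does pick the same winner:

```python
# (TPR, TNR) pairs copied from the tables above
results = {
    ("ResNet18", "Houseplant"):    (0.8302, 0.9491),
    ("ResNet50", "Houseplant"):    (0.8302, 0.8889),
    ("MobileNetV2", "Houseplant"): (0.8774, 0.9537),
    ("ResNet18", "PlantDoc"):      (0.9653, 0.9176),
    ("ResNet50", "PlantDoc"):      (0.9422, 0.8824),
    ("MobileNetV2", "PlantDoc"):   (0.9538, 0.8941),
}

# Balanced accuracy weights the healthy and unhealthy classes equally
balanced = {k: (tpr + tnr) / 2 for k, (tpr, tnr) in results.items()}
best = max(balanced, key=balanced.get)
```

ResNet18 on PlantDoc comes out on top at roughly 0.941, consistent with the choice above.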

Stage 2: Sensor Model (online data)

I trained a simple MLP classifier on online plant sensor data, and achieved the following results:

  • True Positive Rate (TPR): 0.9778
  • True Negative Rate (TNR): 0.9333
  • False Positive Rate (FPR): 0.0667
  • False Negative Rate (FNR): 0.0222
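A sensor MLP of this kind can be sketched as below. The feature means, spreads, and network size here are made up for illustration (the real training data and architecture are not reproduced), but the shape of the pipeline — scale the sensor features, fit a small MLP — is the same:

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for sensor features (temperature, humidity,
# soil moisture, light); values are illustrative only.
rng = np.random.default_rng(0)
healthy = rng.normal([24, 55, 40, 800], [2, 5, 5, 100], size=(200, 4))
unhealthy = rng.normal([30, 35, 15, 300], [2, 5, 5, 100], size=(200, 4))
X = np.vstack([healthy, unhealthy])
y = np.array([0] * 200 + [1] * 200)  # 1 = unhealthy

scaler = StandardScaler()
clf = MLPClassifier(hidden_layer_sizes=(16, 8), max_iter=500, random_state=0)
clf.fit(scaler.fit_transform(X), y)
acc = clf.score(scaler.transform(X), y)
```

Standardizing first matters here because the raw sensor ranges differ by orders of magnitude (lux vs. °C).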

Stage 3: Late Fusion Model (our collected dataset)

Our dataset consists of 746 images and sensor data points of 14 different plants across 4 plant species. I fine-tuned the image and sensor models with our dataset, and trained a classifier using the concatenated sensor and image features from the pretrained image and sensor models. This achieved the following results:

  • True Positive Rate (TPR): 0.9091
  • True Negative Rate (TNR): 0.9496
  • False Positive Rate (FPR): 0.0504
  • False Negative Rate (FNR): 0.0909
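The late fusion step described above — concatenate the image and sensor feature vectors per sample, then train one classifier on the joint representation — can be sketched as follows. The feature dimensions, the random stand-in features, and the choice of logistic regression as the fusion classifier are illustrative assumptions, not our exact setup:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Illustrative dimensions: penultimate-layer features from the image model
# and hidden features from the sensor MLP (real dims depend on the networks)
IMG_DIM, SENSOR_DIM, N = 512, 16, 300

rng = np.random.default_rng(1)
y = rng.integers(0, 2, size=N)  # 1 = unhealthy
img_feats = rng.normal(size=(N, IMG_DIM)) + y[:, None] * 0.3
sensor_feats = rng.normal(size=(N, SENSOR_DIM)) + y[:, None] * 0.5

# Late fusion: one feature vector per sample from both modalities
fused = np.concatenate([img_feats, sensor_feats], axis=1)
fusion_clf = LogisticRegression(max_iter=1000).fit(fused, y)
```

Because fusion happens at the feature level rather than the decision level, the classifier can learn cross-modal interactions (e.g. wilted-looking leaves plus low soil moisture) that averaging two separate predictions would miss.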

Stage 4: Multi-Class Classification (online data):

We initially planned on developing a model that can predict specific plant health labels such as ‘overwatered’ or ‘underwatered’. Despite the limitations in available datasets, I decided to train a multi-class model using one of the more suitable datasets (houseplant dataset on roboflow) to see if further classification into more specific categories could be possible. The results show significant variation in performance across different categories, which is why I decided against using it in our project:

                    TPR       FPR       FNR       TNR
healthy             0.986111  0.207547  0.013889  0.792453
bacterial spot      0.888889  0.000000  0.111111  1.000000
dehydration         0.687500  0.003268  0.312500  0.996732
mineral deficiency  0.666667  0.003165  0.333333  0.996835
sunburn             0.680000  0.010101  0.320000  0.989899
late blight         0.571429  0.000000  0.428571  1.000000
leaf curl           1.000000  0.000000  0.000000  1.000000
overwatering        0.571429  0.003175  0.428571  0.996825
rust                0.562500  0.009804  0.437500  0.990196
powdery mildew      1.000000  0.003175  0.000000  0.996825

So overall, our ML plant health classification model has a FPR of 5.04% and a FNR of 9.09%. We had initially stated that we wanted a FPR of less than 10% (which was achieved) and a FNR of less than 5% (which was not achieved). Given that these were the best results we achieved, I will update the design requirements to reflect that: our new targets are a FPR <10% and a FNR <10%.

Overall Tests:

  • For overall systems testing, we manually performed end-to-end testing of the following systems:
    • Plant Identification through webapp (Jana and Yuna)
      • Fully testing the process of inserting a plant and having it display on the webapp for the user if they do not know the plant 
      • Initial tests found this working successfully with the API. However, delays in the MQTT protocol sometimes caused the API call to time out; to fix this we increased the call’s timeout and sped up MQTT delivery by not overloading the RPi’s memory and by handling MQTT calls on separate threads
    • Plant Health Classification through webapp (Jana and Yuna)
      • Fully testing the user being able to get a recent health status of the plant through the webapp
      • A successful classification is visible on the webapp of the plant’s health through a capture of the plant’s image and classification within ~5s
    • Manual actuator control through webapp (Yuna, Jana, and Zara)
      • Testing turning on and off all the actuators through the webapp
      • All work with a latency of <0.5s
    • Automatic actuator control through webapp (Yuna and Zara)
      • Testing switching to automatic control and verifying that the system adjusts the actuators accordingly
      • Works successfully, and the user is able to change the automatic schedule
    • Sensor data collecting through webapp (Yuna and Zara)
      • Test to see whether sensor data successfully appears on the webapp for the past 24 hours
      • Successfully viewed on the webapp with graphs displaying all data
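The latency figures above (sensor-to-webapp, actuator commands, livestream frames) were all obtained the same way: difference the send/capture timestamps against the receive/display timestamps, then average. A minimal sketch of that computation (the timestamps below are made up):

```python
def latency_stats(pairs):
    """pairs: list of (sent_ts, received_ts) in seconds, e.g. from time.time()
    on the sender and receiver (assumes their clocks are synchronized)."""
    latencies = [rx - tx for tx, rx in pairs]
    return {
        "avg": sum(latencies) / len(latencies),
        "max": max(latencies),
    }

# Toy example: three messages with 0.4-0.6 s of transit time
stats = latency_stats([(100.0, 100.4), (101.0, 101.6), (102.0, 102.5)])
```

The clock-synchronization assumption is easy to satisfy here because both timestamps can be taken on the Raspberry Pi side of the link.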

Zara’s Status Report for 04/26/2025

This week, I completed several important components of the project. I finished integrating the heater into the system, ensuring that it is now stabilized and works securely with both manual and automatic control modes. Additionally, I completed building and programming the LED matrix, which displays the current status of all actuators in the greenhouse, as well as the health status of the plant based on machine learning classification (either “Healthy” or “Unhealthy”). I also finished writing the automatic control code for the greenhouse, allowing the system to adjust conditions based on ideal environmental parameters. After writing the control code, I tested the sensors (temperature, humidity, soil moisture, and light) and some actuators (heater, fan, humidifier, and lights) to verify that the automatic system responds correctly to real-time data.

Overall, I am on schedule with the project timeline. There are no major delays, and the heater, LED matrix, and control code were the last major technical tasks I needed to complete individually.

For the final week, I will focus on completing stabilization tests to record the time it takes for the system to reach ideal conditions automatically. I will also work with my team to finish the final poster, demonstration video, and project report. By the final deadline, the deliverables I expect to complete are the poster, demo video, report, and the stabilization time data for the system.
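The stabilization-time measurement can be reduced to a small function over the logged sensor trace: find the first time the reading enters a tolerance band around the setpoint and stays there for some hold window. The tolerance and hold values below are illustrative, not our acceptance criteria:

```python
def stabilization_time(samples, setpoint, tol=0.5, hold=60.0):
    """Return the first timestamp after which readings stay within `tol`
    of `setpoint` for at least `hold` seconds; None if never stabilized.

    samples: list of (timestamp_seconds, reading), sorted by time."""
    start = None
    for t, value in samples:
        if abs(value - setpoint) <= tol:
            if start is None:
                start = t            # entered the tolerance band
            if t - start >= hold:
                return start         # held long enough: stabilized
        else:
            start = None             # left the band; reset the clock
    return None

# Toy trace: temperature ramps up and reaches 25 degC at t=120, then holds
trace = [(t, 20 + min(t, 120) / 24.0) for t in range(0, 400, 10)]
t_stable = stabilization_time(trace, setpoint=25.0, tol=0.5, hold=60.0)
```

Resetting `start` whenever the reading leaves the band is what prevents a brief overshoot from being counted as stabilization.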

Yuna’s Status Report for 04/26/2025

This week I found many unexpected bugs while manually testing the web app. I spent most of my time fixing them and adding small details to the web app, prioritizing the core functionalities. I also added features that will be useful for demo purposes.

  1. MQTT speed issue: Plant detection and MQTT message delivery suddenly became very slow. It turned out that the Raspberry Pi’s storage was full, which slowed everything down. I spent a lot of time figuring out what the problem was.
  2. Sensor data stored for each plant: Sensor data was previously stored only under the first plant; I adjusted it so that users can select the “current plant in the greenhouse” and save sensor data specific to that plant.
  3. Delete plant: For easier demo, I added a button to delete a plant.
  4. Easier autoschedule: In the manual auto-schedule page, I now pre-fill the existing auto-schedule so that users can change just the condition they want, instead of re-entering all of them.
  5. Security: When setting up the auto-schedule, I set lower and upper bounds so that users can’t abuse our system to kill plants.
  6. Small details: I improved UIs of the web app by adding last detected time for plant health detection, putting “camera is turned off” when camera was off instead of displaying a stationary livestreaming image, and changing confusing wordings.
  7. Manual testing of web app: I manually tested the web app to confirm that all the core functionalities work correctly.
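The MQTT slowdown in item 1 was addressed partly by moving MQTT work off the main loop. With paho-mqtt, `client.loop_start()` already runs the network loop in a background thread; the same idea for our own publishes can be sketched with a worker thread draining a queue. The `publish` stub below is a stand-in for the real MQTT client call:

```python
import queue
import threading

outbox = queue.Queue()
published = []

def publish(topic, payload):
    # Stand-in for the real client.publish(topic, payload) call
    published.append((topic, payload))

def mqtt_worker():
    """Drain queued messages so slow network I/O never blocks the sensor loop."""
    while True:
        item = outbox.get()
        if item is None:      # sentinel: shut down cleanly
            break
        publish(*item)
        outbox.task_done()

worker = threading.Thread(target=mqtt_worker, daemon=True)
worker.start()

# The sensor loop just enqueues readings and moves on
for reading in ["23.9", "24.1", "24.0"]:
    outbox.put(("greenhouse/temperature", reading))

outbox.put(None)
worker.join()
```

A single worker also preserves message order, which matters when the web app graphs readings by arrival.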

I am slightly behind the schedule since I wasn’t able to conduct the user experience survey, but I’ll finalize the web app by Sunday and get survey data by Monday to show our results on the poster.

Next Week’s Deliverables:

  • Finalize web app by Sunday
  • User experience survey by Monday
  • Test different scenarios, make sure everything works before the demo

Zara’s Status Report for 04/19/2025

This week I worked with Jana on setting up the blackout film for the windows, and we bought more plants for the greenhouse. Personally, I finished setting up the heater and watering system for the plants. As the plants don’t seem to be absorbing enough water at the moment, I am experimenting with different ways of raising the water source, as well as pre-watering the plants to make sure the watering ropes work as intended. I have also tested sensor accuracy, started testing actuator accuracy (in particular, the heater response), and measured the sensor-to-web latency. As a fun side project, I am also adding an LED matrix, driven by an Arduino, to the front of the system to show the status of the actuators in the greenhouse, and have mostly completed the code for it.

Progress is now on schedule, and by the final week, I just need to secure the heater, test the stabilization times for all the actuators in the greenhouse, and finish all the testing. Throughout the project, as I did not have much previous experience in embedded systems, I learnt a lot implementing the different hardware in the system. I learnt mostly about the different communication protocols I had to implement, including MQTT and serial communication (between the RPi and Arduino, as well as over RS485). For implementing the hardware, I found myself mostly reading documentation, following articles about similar projects, and watching YouTube tutorials for the simpler parts.
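The RPi-Arduino serial link mentioned above boils down to reading newline-terminated strings and parsing them into sensor values. A minimal sketch; the "T:…,H:…" key-value line format is hypothetical, not our actual wire format:

```python
def parse_sensor_line(line):
    """Parse a line like b"T:23.5,H:60.2,S:41.0\\n" into a dict of floats.

    With pyserial this would be fed from serial.Serial(port, baud).readline();
    the key/value format here is illustrative, not our exact protocol."""
    fields = {}
    for part in line.decode("ascii").strip().split(","):
        key, _, value = part.partition(":")
        fields[key] = float(value)
    return fields

# Example raw line as it would arrive from the Arduino
reading = parse_sensor_line(b"T:23.5,H:60.2,S:41.0\n")
```

Keeping the parser separate from the serial I/O makes it easy to unit-test without hardware attached.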

Jana’s Status Report for 04/19/2025

This week I focused on getting the misters working. I encountered an issue where other actuators interfere with the mister activation due to their reliance on rising edge triggers. As a result, one or two misters may be triggered unintentionally when other components are toggled. I’ve been investigating this by rewriting portions of the GPIO handling code and running repeated tests under different actuator activation scenarios. Zara and I also bought more plants for testing and data collection, and we covered the greenhouse in privacy/blackout film to improve environmental control and reduce lighting variability for the vision-based systems.

Another major focus was testing. I implemented the late fusion ML model and began training and testing on the data I have collected so far (data collection is still ongoing); currently the model achieves an FPR of 4.90% (below our required 10%) and a FNR of 7.27% (above our required 5%), which is promising. I have also tested the plant identification API on the image data I collected in our greenhouse, and it correctly identified the species of the four types of plants we are testing every time. Finally, I tested the live streaming latency and found that it was 1.95 seconds (below our requirement of 2 seconds).

I am running slightly behind schedule, as I had hoped to complete data collection by now; I need 1-2 more days of data. To mitigate this, I have updated the schedule so that we use the slack time previously allocated for next week, during which I will test the whole system with plants inside for up to a week.

Next Week’s Deliverables:

  • Complete data collection (should be done by Monday morning)
  • Once data collection is done, finalize ML testing results
  • Debug mister
  • Test entire system on plants for 1 week

Prior to this project, I had no experience working with microcontrollers or sensors, so I had to learn how to use components like the Raspberry Pi, relays, the Pi camera, sensors, and other hardware. I relied heavily on official documentation, video tutorials, and online project walkthroughs, which were especially helpful for tasks like setting up the relays and configuring the actuators. On the machine learning side, I had also never worked with multimodal data before, so I needed to learn how to integrate image and sensor data into a single model. I learned primarily through online articles, tutorials, and by reviewing research papers discussing similar architectures. These resources were very helpful in deepening my understanding and guiding implementation.

Team Status Report for 04/19/2025

All of our sensors and actuators have been integrated with the web app, and we are done implementing the basic requirements of the project. The web app now has all the core functionalities – automatic/manual control of plant conditions, plant species and health detection, sensor data storage, and live streaming. We tested some of the subsystems – live streaming latency, the late fusion network, the plant identification API, and the level of water absorbed per day.

Progress:

  • Sensor data collection
  • Put blackout film on windows
  • Bought more plants
  • Set up mister, heater, watering system
  • Implement late fusion network for plant health ML model
  • Test live streaming latency, late fusion network, plant identification API, level of water absorbed per day
  • All the sensors/actuators fully integrated with web app
  • Automatic and manual scheduling functionalities implemented
  • Enable water/nutrients dispensed based on the number of plants in the greenhouse

Next Steps:

With the final presentation and final demo left, we need to finish implementing and testing everything we planned, and refine the web app and the system for the final demo.

  • Test PID controls
  • Test overall system performance (leave the greenhouse barely touched/modified for several days for testing)
  • Ensure durability (waterproofing/heating/misting)
  • Improve web app UI
  • Conduct web app user experience survey

Yuna’s Status Report for 04/19/2025

This week I mostly spent time on fixing bugs, adding small details, and improving UI. I also worked on final presentation slides and spent time preparing for the presentation.

  1. Manual health check button: Users can now manually request health check.
  2. More actuators/sensors integrated with web app
  3. Starting/stopping the actuators: I helped Zara with the code for starting/stopping the actuators based on the auto-schedule.
  4. Options of Automatic<->Manual Scheduling: In the automatic state, the manual actuator controls and auto-schedule updates are disabled.
  5. # of plants info stored: Users can now input/change the number of plants in their greenhouse. The amounts of water/nutrients dispensed will be based on the # of plants.
  6. White light control in Live Streaming Page: Users can control white light in the live streaming page to better see the plant.
  7. Detected Result Page: I added a page that tells the users the plant species detected by the API.
  8. Keeping track of Actuators&Camera Status: The pages that have switches for camera and actuators now display the current on/off status.
  9. Fixed previously undetected bugs

I am slightly behind the schedule because I only manually tested the web app and haven’t written the test code yet. I will make sure to finish writing test code by early next week.

Next Week’s Deliverables:

  • Focus on small details/UI for the last time: I will improve small details by early next week to leave time for writing the test code.
  • Test Code: I’ll mock different scenarios by writing test cases and verify the web app works as expected.
  • User Experience Survey: As planned, I will ask 10 different users to rate their experience in using our web app.

As I’ve designed, implemented, and debugged my part of the project, I had to self-learn a number of new concepts and technologies. Although I had some experience with React, I still wasn’t an expert in it, and I had no prior experience with the MQTT protocol, the Raspberry Pi, websockets, deploying a web app on the RPi, or converting HTTP to HTTPS. I made good use of online resources to learn about all of these. For most of them, I read the official or unofficial documentation and looked at the example code there. I also referred to other people’s code on GitHub and forum posts on various websites, because many official documents were not detailed enough for me to fully understand the applications. Whenever I hit unfamiliar bugs, I used sites like StackOverflow and the Raspberry Pi Forums to see if others had run into the same problem. The forums also helped during the design process – they gave me ideas about which tech stacks might be useful for our project.

Zara’s Status Report for 04/12/2025

This week I first worked with Jana on making the greenhouse water-resistant: we sprayed water-resistant sealant over it and put the components back in. We have tidied up the system and secured the lights and the water pump. I also got the 7-in-1 soil sensor fully working and incorporated it into the main function. There were a few issues getting it to work at the same time as the temperature sensor, but I managed to resolve them by using different packages in the code. All sensor data is now sent to the webapp, and the sensors are running constantly to collect data. For the RPi camera, I laser-cut a mount so it can be attached to the greenhouse, with the camera angle adjustable depending on the plant inserted.

My progress is mostly on track; I will need to resolve the final issues with the heater this week. I have also received the final water pump for the nutrients, so I will set it up in the coming week. In terms of actuator control code, I will need to record the water flow rate so I can set up automated water and nutrient pumping.

Yuna’s Status Report for 04/12/2025

Progress I made this week:

  1. Plant Species Detection: When the user tries to add a new plant and they don’t know what the species is, the web app now detects the plant species using ML.
  2. Manual Auto-scheduling: I added a manual auto-scheduling page for users to manually control the plant care conditions if their plant is not in the webscraped database. The user can now set their own auto-schedule.
  3. Chrome Notifications: I implemented a notification system using the Chrome Notifications API to notify users whenever the plant conditions are unhealthy or the sensor data goes beyond the ideal threshold. (The original plan of using the Twilio API for notifications was changed to Chrome notifications due to cost.)
  4. Camera On/Off: The camera can be now turned on and off using a switch on the web app, allowing users to control security.
  5. Deployment on RPi: The web app has been deployed to the RPi. It was initially served over HTTP, but I realized the Chrome Notifications API requires HTTPS. The website can now be accessed at an HTTPS URL.

I am currently a little behind schedule because some of the features in the web app were not fully implemented and verified, but I’ll make sure to finish everything by early next week to leave time for testing.

Next week’s deliverables:

  • Auto-scheduling Feature: fully implement the auto-scheduling feature and verify that it works. There is code for changing the conditions according to the schedule, but I haven’t tested it yet.
  • More Sensors/Actuators Integration: Our team has some sensors and actuators that haven’t been fully integrated to the system yet, so I’ll work on integrating them with web app.
  • Focus on details: fix small details in the web app – for example, currently the switches for turning on/off the actuators do not know the current status of the actuators. I will make sure the web app gets notified of the current on/off status of the actuators from RPi.
  • Tests: write tests for the web app code. Test if the system works.