Sarah’s Status Report for 3/27

This week, the team and I continued to work on our individual parts. I was able to complete the live streaming script for 24/7 monitoring of the greenhouse, and I am hoping to link it to our website once the RPi is sent to Kanon and Hiroko. Instead of running HSV color detection and edge detection on online images of pea shoots, I ran them on images captured through my RPi of some succulents I have at home, so that the CV is applied under realistic greenhouse/outdoor lighting. Currently, I am working on getting the CV to distinguish leaves and flowers, and I've completed my pixel-per-metric calculation by comparing the bounding box that outlines the stem and top of the plant in the image to the real size of the succulent I was testing.

I also figured out a way to work with the RPi without a monitor or keyboard by using a VNC viewer, which connects to my RPi over WiFi and displays the RPi OS right on my computer; this lets me test my CV analysis on real-time video from the RPi instead of on still RPi images. I also planted some of the pea shoots so that they can sprout by next week and I can properly test my growth stage categorization algorithm on pea shoots rather than the succulents.
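To make the pixel-per-metric step concrete, here is a minimal sketch of the idea, assuming a rough green HSV range, a placeholder image path, and a placeholder real-world width (the actual values are tuned by hand for my succulent and lighting):

```python
import cv2
import numpy as np

# Placeholder assumptions: image path, HSV range, and the measured real width (cm).
IMAGE_PATH = "succulent.jpg"
HSV_LOWER = np.array([30, 40, 40])    # rough lower bound for plant green
HSV_UPPER = np.array([90, 255, 255])  # rough upper bound for plant green
KNOWN_WIDTH_CM = 8.0                  # hand-measured width of the test succulent

image = cv2.imread(IMAGE_PATH)
hsv = cv2.cvtColor(image, cv2.COLOR_BGR2HSV)

# Mask out everything that is not plant-colored, then keep the largest contour.
mask = cv2.inRange(hsv, HSV_LOWER, HSV_UPPER)
contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
plant = max(contours, key=cv2.contourArea)

# Bounding box around the stem and top of the plant.
x, y, w, h = cv2.boundingRect(plant)

# Pixels-per-metric: bounding-box pixels per real centimeter of plant width.
pixels_per_cm = w / KNOWN_WIDTH_CM
height_cm = h / pixels_per_cm
print(f"Estimated plant height: {height_cm:.1f} cm")
```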

I am slightly behind on my tasks, as I wanted to test the night vision and get the flower and leaf recognition implemented, but I did complete the growth stage pixel-per-metric algorithm and refined my HSV color detection and edge detection. My RPi stopped working at the beginning of the week, so I had to borrow my friend's RPi until my new one arrived.

Next week, I hope to test my growth stage classifier on the sprouting pea shoots. I would also like to finish my implementation of leaf and flower recognition, so that I have the minimum needed to test defect detection, which will apply another layer of HSV color detection to the parts of the leaf and flower. I will also make sure that the 24/7 monitoring works at night.


Team Status Report for 3/27

While most of our work is still individual, starting last week we began combining our components little by little.

Last week, Hiroko and Kanon worked together to connect the ESP32 to our AWS components. We are now able to send data from the ESP32 through AWS Lambda functions, store the data in AWS DynamoDB, fetch the data and display it on the website, and send data back to the ESP32 using a curl command via our backend.

Hiroko also worked on setting up the hardware devices by soldering them. She got a relay-controlled outlet box to improve both our design aesthetic and user safety. She is now working on connecting the sensors properly and calibrating their values.

Sarah had a difficult time because her Raspberry Pi broke last week. However, by borrowing her friend's RPi, she was able to continue working. She can now show the 24/7 live stream directly on the website frontend. She also tested some OpenCV functions on real plants and completed the growth stage classifier.

Because there were some unexpected problems, such as the RPi breaking, we are glad that we built a week of slack into our schedule. There is no change to our schedule, and we will still be working individually next week to refine each of our components.

Kanon’s Status Report for 3/27

Last week, I worked with Hiroko and mainly focused on connecting the ESP32 to AWS. I set up AWS IoT, DynamoDB, and Lambda so that I can receive data from the ESP32 via Lambda functions and store it in DynamoDB. I was also able to send data from my terminal to the ESP32 by using an API endpoint and curl. As a result, the website can now show the most recent status of the greenhouse by fetching data from DynamoDB.
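As a rough illustration of the ESP32-to-DynamoDB path, here is a minimal sketch of what a Lambda handler along these lines could look like; the table name GreenhouseData, the attribute names, and the payload shape are placeholders rather than our exact schema:

```python
import json
import time
import boto3

# Placeholder table name; the real DynamoDB table and schema may differ.
TABLE = boto3.resource("dynamodb").Table("GreenhouseData")

def lambda_handler(event, context):
    # Assumed payload shape from the ESP32, e.g.
    # {"device_id": "esp32-1", "temperature": 23.5, "soil_moisture": 410, "light": 812}
    reading = event if isinstance(event, dict) else json.loads(event)

    item = {
        "device_id": reading["device_id"],
        "timestamp": int(time.time()),
        # Store numeric readings as strings to avoid DynamoDB's float restriction.
        "temperature": str(reading.get("temperature")),
        "soil_moisture": str(reading.get("soil_moisture")),
        "light": str(reading.get("light")),
    }
    TABLE.put_item(Item=item)
    return {"statusCode": 200, "body": json.dumps({"stored_at": item["timestamp"]})}
```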

This week, I focused on improving the website UI a bit and also wrote a function so that the curl request can be sent from a Python script or the website backend instead of directly from the terminal.
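A minimal sketch of that kind of helper, assuming a simple requests POST to our API Gateway endpoint (the URL and JSON fields here are placeholders):

```python
import requests

# Placeholder endpoint; the real API Gateway URL lives in our backend config.
API_ENDPOINT = "https://example.execute-api.us-east-1.amazonaws.com/prod/control"

def send_to_esp32(device_id: str, command: dict) -> int:
    """Replaces the manual curl call: POST a control message destined for the ESP32."""
    payload = {"device_id": device_id, **command}
    response = requests.post(API_ENDPOINT, json=payload, timeout=5)
    response.raise_for_status()
    return response.status_code

# Example usage from the website backend:
# send_to_esp32("esp32-1", {"light": "on", "pump_seconds": 10})
```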

So far, I think I'm on schedule. I still need to figure out how to send data back to the ESP32 efficiently, but that can be done as we calibrate our sensors. Next week, I will work on deploying the website on AWS EC2.

Hiroko’s Status Report for 3/27

Last week, I was able to go to the ECE lab to solder the light intensity sensor and connect it to the ESP32 board. Now I have all of the sensors connected to the ESP32 and can obtain all of the sensor readings. I also worked with Kanon to set up the connection between AWS IoT and the ESP32, and was able to successfully send data to and receive data from AWS. Also, I spent a substantial amount of time last week finishing up the design review report with my teammates.

This week, all of the components for the relay-controlled outlet box arrived, so I soldered and assembled all of the parts. I also wrote Arduino code to control the relays based on messages received from AWS, which Kanon and I tested by using the website backend to remotely control the relays.

Overall, I think I am on schedule. Next week, I plan on calibrating and testing the individual components of the greenhouse and planting the pea shoot seeds so that they are ready by the interim demo. I want to measure the flow rate of the water pump, light intensity of the LED plant lights, sensitivity/accuracy of the soil moisture sensor, etc., so that Kanon can use the information to create an algorithm to determine when to turn the system on/off.

Team Status Report for 3/13

This week, we started to work on writing the design review report, and continued implementing our subsystems. We split up sections of the report for each of us to write a draft for, and plan on meeting on Monday to edit a draft of our report.

Hiroko started to physically put the greenhouse together since she received all of her orders, and was able to upload code onto the ESP32 board to read data from the sensors.
Kanon worked on practicing for the design review presentation and setting up DynamoDB.
Sarah connected the IR-Cut camera to the Raspberry Pi, created a webcam connection to the laptop with the MicroSD card, and researched tutorials on how to perform CV analysis on live streaming images.

After receiving feedback on our design review presentation, we've realized that we need to figure out how to make sure that water won't get into the electrical parts of our greenhouse. We might need to design a PCB and/or an enclosure for the electronics, which might alter our current design.

There is no update to our schedule, but given that we were all busy and spent a lot of time writing the design review report, we were not able to make as much progress on our individual components as we would have liked. Next week, we plan on finishing up the design review report and making further progress on our individual subsystems.

Hiroko’s Status Report for 3/13

This week I received all of the parts that we ordered and was able to start building the physical greenhouse. I put together the greenhouse shelf and connected the temperature sensor and soil moisture sensor to the ESP32 board. I also installed the Arduino IDE and the necessary sensor libraries on my laptop and successfully uploaded some code to the ESP32 board to read the sensor values. I haven't touched the light intensity sensor yet since it needs to be soldered.

Also, we spent some time this week starting to write the design review report. I wrote the abstract, introduction, and design requirements, and am currently in the process of writing the system specifications for the hardware subsystem.

Overall, I think I am on schedule, but would’ve liked to have the soldering/relay wiring done this week. I’m hoping that I can get a time slot early next week in the labs to solder/wire the relays. Once I get all of the sensors connected, I will try getting the sensor data sent to AWS over wifi. I will also be finishing up the design review report next week with my teammates.

Sarah’s Status Report for 3/13

For this week, the team prepared for the design review presentation and worked on our individual components.

With the equipment all here, I was able to connect the IR-Cut camera to the Raspberry Pi and create a webcam connection to the laptop with the MicroSD card. I am currently using a random plant in my house to find the right HSV values to extract and the best lighting for CV analysis. I have also been following tutorials on how to do CV analysis on live-streamed images.
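A minimal sketch of the kind of live-stream loop those tutorials describe, assuming the IR-Cut camera shows up as a standard video device through OpenCV (the device index is a placeholder):

```python
import cv2

# Assumes the IR-Cut camera is exposed as video device 0 on the RPi.
cap = cv2.VideoCapture(0)

while True:
    ok, frame = cap.read()
    if not ok:
        break

    # The same HSV/edge analysis used on still images would run here, per frame.
    cv2.imshow("greenhouse feed", frame)

    # Press 'q' to stop the stream.
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```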

I spent time on writing the project management and summary components of our design review documentation, and will be working on the design trade studies and systems review portion of the document.

I am a bit behind on my CV analysis implementation, as I wanted to have some sort of analysis working through the RPi camera, but the RPi connection took a bit longer than I expected and I had midterms this week. As for the group as a whole, we are on time with our deliverables and will be meeting on Monday to finalize the design document.

By next week, I hope to have the growth stage classifier and half of the disease detection implemented, using the extra time I have this upcoming week from having no homework in other classes. I will also start growing one of the pea shoots, since I need some testing material for the growth stages.

Kanon’s Status Report for 3/13

This week, I worked on practicing for the design review presentation, setting up DynamoDB, and writing a draft of the Design Review Report.

For the design review presentation, there was a lot to cover within each slide, so I had a hard time coming up with a script that fits within 12 minutes. In fact, I feel like I spoke a bit too fast during the presentation. However, looking back at the peer reviews, most of our classmates were happy with what we covered, so that was a relief.

Setting up DynamoDB was a more complicated task than I expected and took more time. There were multiple ways to set it up, and the official AWS documentation was sometimes unhelpful, so I overspent some of my time searching for other documentation. I was eventually able to set up my database locally and get sample code running. However, there is also a way of working with the NoSQL database locally through AWS tooling, which I think is an easier way to visualize our database, so I might go with that route next week.
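For reference, a minimal sketch of how the local setup can be exercised from boto3, assuming DynamoDB Local is running on its default port and using a placeholder table and key schema:

```python
import boto3

# Assumes DynamoDB Local is already running, e.g.:
#   java -jar DynamoDBLocal.jar -sharedDb
dynamodb = boto3.resource(
    "dynamodb",
    endpoint_url="http://localhost:8000",
    region_name="us-east-1",
    aws_access_key_id="local",       # dummy credentials are fine for DynamoDB Local
    aws_secret_access_key="local",
)

# Placeholder table and key schema for greenhouse sensor readings.
table = dynamodb.create_table(
    TableName="GreenhouseData",
    KeySchema=[
        {"AttributeName": "device_id", "KeyType": "HASH"},
        {"AttributeName": "timestamp", "KeyType": "RANGE"},
    ],
    AttributeDefinitions=[
        {"AttributeName": "device_id", "AttributeType": "S"},
        {"AttributeName": "timestamp", "AttributeType": "N"},
    ],
    ProvisionedThroughput={"ReadCapacityUnits": 5, "WriteCapacityUnits": 5},
)
table.wait_until_exists()

table.put_item(Item={"device_id": "esp32-1", "timestamp": 1616800000, "temperature": "23"})
print(table.get_item(Key={"device_id": "esp32-1", "timestamp": 1616800000})["Item"])
```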

I also spent some time on the draft of the Design Review Report by writing the Architecture and/or Principle of Operation section.

I think I am still on track with the schedule, but I also realized that the setup for the AWS components may take a while. Next week, I will continue working on linking DynamoDB to our web application. I will also draft the Design Trade Studies for the web application component so that we can discuss it together on Monday.

Team Status Report for 3/6

This week, we all worked on our individual design block diagrams for the design review presentation, looked into more risk factors, and began implementing our components. We also ordered all the materials we need and have received most of them. We made a bill of materials and are working on the design documentation report as well.

Hiroko looked into the sensors she will be working with, ordered all the materials she needed, and picked up a few from Home Depot. She then made a visual design of how the greenhouse will look and created the hardware block diagram, which specifies all the relays and feedback loops in the system. She also signed up for access to TechSpark in order to solder the sensors to the ESP32, and hopes to receive some help from the TAs to solder them properly.

Kanon created the website login and registration pages as well as the main page, which has a toggle to change the temperature, a switch for turning the light on/off, a soil moisture modifier, and a section where the live stream video will go later on. She also found Twilio, an API that helps send notifications and alerts to users, which should help our project significantly, and updated her block diagram to include the Twilio API.

Sarah researched the OpenCV modules more and laid out the algorithms and components she will need to properly implement a CV application for plants. Specifically, she figured out HSV color detection and edge detection for images. She also received her materials and is setting up the hardware to do proper CV analysis.
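Regarding the Twilio notifications mentioned above, a rough sketch of what sending an alert could look like; the credentials and phone numbers are placeholders, and the actual alert logic is still to be designed:

```python
from twilio.rest import Client

# Placeholder credentials and phone numbers; real values would come from our config.
ACCOUNT_SID = "ACxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"
AUTH_TOKEN = "your_auth_token"
TWILIO_NUMBER = "+15550000000"
USER_NUMBER = "+15551111111"

def send_alert(text: str) -> None:
    """Send an SMS alert to the greenhouse owner, e.g. when soil moisture is low."""
    client = Client(ACCOUNT_SID, AUTH_TOKEN)
    client.messages.create(body=text, from_=TWILIO_NUMBER, to=USER_NUMBER)

# Example: send_alert("Greenhouse alert: soil moisture below threshold.")
```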

One risk we are looking into is receiving the wrong data from the database; to mitigate this, we are thinking about either re-reading the data before outputting the value to the website or notifying the website if drastic changes take place. We are also hoping that the night vision in the IR-Cut cameras will work properly, but in the case that it does not, we will need to reconfigure the LED lights to stay at a certain brightness for night vision to work. After testing the OpenCV module, we found that it is very important for the subject we are analyzing to contrast with the background and the unnecessary components of an image, so we are looking into adding a monochrome backdrop in the greenhouse that provides the best contrast. For any unexpected hardware issues we may come across, we have decided that our system should be able to detect them and notify the user.

We updated our schedule a bit to figure out when to start planting the pea shoots and which tests we will be conducting on which days. We’ve decided that we would plant the pea shoots a week before testing, as it takes around 7 days for pea shoots to sprout and grow. Below are also some images of visible progress.

 

Website progress:

Sarah’s Status Report for 3/6

This week, my team members and I researched and designed the components of the project individually assigned to us. In particular, I went through several tutorials and found built-in functions in the OpenCV module that will help differentiate the plant from the rest of the video so that CV analysis can be done. Specifically, I looked into the pixel-per-metric technique to figure out the real-life size of objects through the camera, and into HSV color detection and edge detection to detect withering, disease, bending, and flowers/fruits. Using a pea shoot image I found online, I found hue, saturation, and value ranges that separate the bundle of pea shoots from the rest of the objects in the captured image, such as the soil, the background, and the planting tray. Afterwards, I applied some edge detection to get a clear outline of the shapes of the stems and leaves. My teammates and I also ordered our materials on Monday, and I have received all the hardware I need, such as the RPi, RPi power adapter, RF transmitters and receivers, and planting material.
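A minimal sketch of that HSV-then-edges pipeline, with placeholder HSV bounds and file names (the actual values were tuned by hand for the online pea shoot image):

```python
import cv2
import numpy as np

# Placeholder values, tuned by hand for each image and lighting setup.
IMAGE_PATH = "pea_shoots.jpg"
HSV_LOWER = np.array([35, 50, 50])
HSV_UPPER = np.array([85, 255, 255])

image = cv2.imread(IMAGE_PATH)
hsv = cv2.cvtColor(image, cv2.COLOR_BGR2HSV)

# Keep only pixels inside the plant's hue/saturation/value range, separating
# the shoots from the soil, planting tray, and background.
mask = cv2.inRange(hsv, HSV_LOWER, HSV_UPPER)
plant_only = cv2.bitwise_and(image, image, mask=mask)

# Edge detection on the masked image to outline the stems and leaves.
gray = cv2.cvtColor(plant_only, cv2.COLOR_BGR2GRAY)
edges = cv2.Canny(gray, 50, 150)

cv2.imwrite("pea_shoots_mask.png", mask)
cv2.imwrite("pea_shoots_edges.png", edges)
```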

My progress is on schedule, as my team just needs to make some final adjustments to the design review presentation after receiving feedback, and we have all started on the implementation of our individual tasks. I learned a lot more about OpenCV this week and have a much clearer idea of how I can use the built-in functions for image and video analysis of plants.

By next week, I hope to connect my camera to my RPi and work with the RPi camera for my CV application. I would like to figure out the HSV parameters from the images that the camera provides, and I will most likely use some plants I have at home to figure out the basics of HSV color and edge detection specific to the camera and lighting at my place. I would also like to have the pixel-per-metric technique and the flower-distinguishing algorithm done by the end of next week.