Kanon’s Status Report for 5/8

This week, we all met in person and focused on combining our individual parts into the final product. I also implemented a feature on the website that lets users set their lighting schedule (pic: https://www.dropbox.com/s/vtao0fk6bdl1m7b/Screen%20Shot%202021-05-08%20at%201.06.46%20AM.png?dl=0).
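
Under the hood, the schedule check is basically a time-window comparison. Here is a rough sketch of the idea (the function and argument names are just illustrative, not our exact code):

from datetime import datetime, time
from typing import Optional

def light_should_be_on(on_time: time, off_time: time, now: Optional[datetime] = None) -> bool:
    # Treat the schedule as [on_time, off_time); handle windows that cross midnight.
    current = (now or datetime.now()).time()
    if on_time <= off_time:                          # e.g. 08:00 -> 20:00
        return on_time <= current < off_time
    return current >= on_time or current < off_time  # e.g. 20:00 -> 06:00 (overnight)

# e.g. light_should_be_on(time(8, 0), time(20, 0)) is True at 13:00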

Hiroko and I ran another round of testing to make sure that users can control all parameters, including heating, soil moisture, and the lighting schedule. I also worked with Sarah to integrate her live stream into the website.

Because our final demo is coming up next week, we also created our final poster. We had a bit of a hard time fitting all the information onto one page because we had a lot of material we wanted to cover. We also started making our video and are hoping to finish it by Sunday.

Next week, I will work on the final report and summarize how much of our AWS credits I have used.

Kanon’s Status Report for 5/1

This week, I worked on testing the metrics between the hardware and the web app/database, and also created a script that periodically monitors the state of the greenhouse. I also worked on preparing the final presentation slides.

Hiroko and I worked on the testing together. In our initial presentation, we were aiming for a time lag of under 1 hour to send data from the web app to the ESP32. It turned out to take only around 2 seconds, which was a pleasant surprise. The transmission between the ESP32 and DynamoDB takes either around 1 second or around 15 seconds: roughly 15 seconds when we are on different WiFi networks and about 1 second when we are on the same network. However, because that difference is so large, Hiroko and I are planning to test this again this weekend.

My new script should fetch the greenhouse readings every 15 minutes. Once the user's preferred targets are set, it should detect the change and send the appropriate signal to the ESP32.
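
Roughly, the loop looks like the following sketch (the table names, API endpoint, and field names are placeholders rather than our actual values, and it assumes the readings are already calibrated percentages):

import time

import boto3
import requests

API_ENDPOINT = "https://example.execute-api.us-east-1.amazonaws.com/send"  # placeholder
dynamodb = boto3.resource("dynamodb")
readings = dynamodb.Table("greenhouse_data")   # assumed table names
settings = dynamodb.Table("user_settings")

def latest(table):
    # Fine for a small table; a real query would use the partition/sort key instead of a scan.
    items = table.scan()["Items"]
    return max(items, key=lambda item: item["timestamp"]) if items else None

while True:
    current, target = latest(readings), latest(settings)
    if current and target:
        signals = {
            "heater": int(current["temperature"] < target["target_temperature"]),
            "pump": int(current["soil_moisture"] < target["target_soil_moisture"]),
        }
        requests.post(API_ENDPOINT, json=signals, timeout=10)
    time.sleep(15 * 60)  # check again in 15 minutes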

So far, I am on schedule. Next week, I will meet with Sarah in person and try to integrate her live video into the website. If I have time, I am also aiming to improve the web UI, because it looks a bit plain right now.

Team Status Report for 4/24

Last week and this week, we prepared for the interim demo and the ethics assignment, and also worked on our individual components. Hiroko had some trouble this week because the junction box for assembling the hardware did not arrive on time. This delays Hiroko and Kanon's testing process, which they will start working on immediately this week. Kanon refined the website UI based on the surveys she conducted and improved the backend algorithm so that the website and hardware can communicate more smoothly. Sarah worked on refining her image processing algorithm for disease recognition. She also integrated the Twilio API to notify the user about plant growth/changes.

As a team, we seem to be slightly behind schedule because of the delayed arrival of the junction box, the testing process, and some issues with image processing. We will try to catch up next week by focusing on testing. Also, because everyone on the team will be in Pittsburgh, integrating the entire system should go more smoothly.

Kanon’s Status Report for 4/24

This week, I focused on improving the web UI and backend algorithm based on the survey I conducted two weeks ago. Our website's initial design always showed the current greenhouse readings (e.g., temperature) under the "User Settings" box. This was very confusing because even after the user saved new settings, the box still showed the current greenhouse readings. Professor Gary and some of our survey responses also pointed out this flaw, so I created a separate data model to save user settings, and users can now see their most recent settings under User Settings. Users should also be able to turn the light on and off through the website, but Hiroko and I have not had time to verify this yet. There was also a minor bug related to changing the temperature unit, which I fixed.
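
For context, here is a minimal sketch of the kind of settings model I mean (the field names are assumptions, and it could just as well live in another DynamoDB table); the page then reads the latest row per user instead of echoing the live greenhouse readings:

from django.conf import settings
from django.db import models

class UserSettings(models.Model):
    user = models.ForeignKey(settings.AUTH_USER_MODEL, on_delete=models.CASCADE)
    target_temperature = models.FloatField()      # in the unit chosen by the user
    target_soil_moisture = models.FloatField()    # percent
    light_on = models.BooleanField(default=False)
    updated_at = models.DateTimeField(auto_now=True)

    class Meta:
        get_latest_by = "updated_at"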

Below is what our current website looks like.

Next week, once Hiroko is done setting up her hardware in the junction box, we will meet up to test the hardware-software communication once again. Sarah is also coming to set up the CV equipment at Hiroko's place next week, so we will figure out how to show the video on our frontend. So far, I am on schedule, but assembling everything does seem a bit difficult, and I am also getting my second Pfizer shot (which can have rough side effects), so I will try to get a head start as soon as possible.

Kanon’s Status Report for 4/10

This week, I first worked on deploying our website onto EC2, since I was supposed to do this last week. I had a bit of trouble because I had to set up the boto3 library and the AWS CLI on the EC2 instance. I kept getting a boto3 error that prevented me from setting up my database model, but after working on it for a while, I was able to get them configured. I still need to set up Apache and check that DynamoDB works correctly.
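
For reference, the sanity check I run on the instance is something like this (the table name and region are placeholders); it only works once the AWS CLI credentials have been configured with aws configure:

import boto3

# boto3 picks up the credentials/region written by `aws configure`
dynamodb = boto3.resource("dynamodb", region_name="us-east-1")  # assumed region
table = dynamodb.Table("greenhouse_data")                       # assumed table name
print(table.table_status)  # prints "ACTIVE" when the connection and permissions work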

I also conducted a survey with 5 people. The website was still buggy, and 2 people had problems logging in with Google OAuth. I might remove this feature depending on whether I can find the specific OAuth bug. Most of them liked the simple design of the website and found it very easy to navigate. However, they also pointed out some possible improvements, such as letting the user enter a value directly instead of only using the slider.

I think I am still a bit behind schedule. Even though I have experience deploying a website on EC2, the deployment process is more complicated than I expected because I am using the AWS CLI and DynamoDB for the database.

Next week, I will continue working on deployment by setting up Apache and checking that DynamoDB works correctly. I will also set up a simple notification script using the Twilio API.

Kanon’s Status Report for 4/3

This week, I continued working on sending data from the website to the ESP32, which I have been doing since last week, and also worked with Hiroko to calibrate the sensor values.

Our website can now send data to the ESP32 whenever the user changes parameters (e.g., temperature, soil moisture) on the user page. The algorithm is pretty simple: if the user sets the temperature or soil moisture target higher than the current value from the greenhouse, the website backend sends a binary signal to the ESP32, and the ESP32 turns on the corresponding component, such as the heater or water pump. The LED light can be turned on and off in a similar manner.
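
As a rough sketch of that logic (the key names here are placeholders, not our exact schema):

def build_signals(current: dict, target: dict) -> dict:
    # 1 means "turn the component on", 0 means "turn it off"
    return {
        "heater": int(target["temperature"] > current["temperature"]),
        "pump": int(target["soil_moisture"] > current["soil_moisture"]),
        "light": int(target.get("light_on", False)),
    }

# e.g. build_signals({"temperature": 68, "soil_moisture": 40},
#                    {"temperature": 75, "soil_moisture": 35, "light_on": True})
# -> {"heater": 1, "pump": 0, "light": 1}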

The following is the data that we receive from the ESP32 and store in AWS DynamoDB. As you can see, the soil_1 and soil_2 moisture sensors report raw values around 2000 to 3000. Hiroko and I had to calibrate these values so that the user can see them as a percentage.
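
The calibration is basically a linear mapping from the raw reading to a percentage, clamped to 0-100. The dry/wet endpoints below are placeholders that we replace with values measured during calibration:

RAW_DRY = 3200   # sensor reading in completely dry soil (placeholder)
RAW_WET = 1300   # sensor reading in saturated soil (placeholder)

def soil_raw_to_percent(raw: int) -> float:
    # Linearly map a raw soil-moisture reading to 0-100%, clamped to that range.
    percent = (RAW_DRY - raw) / (RAW_DRY - RAW_WET) * 100
    return max(0.0, min(100.0, percent))

# e.g. soil_raw_to_percent(3000) -> ~10.5%, soil_raw_to_percent(2000) -> ~63.2%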

This week, my actual schedule was to deploy the website to EC2. However, we had to work on calibration and experiment with how we send data to the ESP32, which is easier to do while working locally. Therefore, I decided to deploy the website next week, so I am now a week behind schedule.

Next week, since the initial schedule was dedicated to UI testing, which is not a lot of work, I am aiming to deploy the website and conduct UI testing at the same time.

Team Status Report for 3/27

While most of our work was still individual, starting last week we began combining our components little by little.

Last week, Hiroko and Kanon worked together to connect the ESP32 to the AWS components. We are now able to send data from the ESP32 through AWS Lambda functions, store the data in AWS DynamoDB, fetch the data and display it on the website, and send data back to the ESP32 from our backend using a curl command.

Hiroko also worked on setting up the hardware devices by soldering them. She also got a relay-controlled outlet box to improve both our design aesthetics and user safety. She is now working on connecting the sensors properly and calibrating their values.

Sarah had a difficult time because her Raspberry Pi broke last week. However, by borrowing her friend's RPi, she was able to continue working. She can now show the 24/7 live stream directly on the website frontend. She also tested some OpenCV functions on real plants and completed the growth stage classifier.

Because there were some unexpected problems, such as the RPi breaking, we are glad that we built a week of slack into our schedule. There will be no change to our schedule, and we will still be working individually next week to refine each of our components.

Kanon’s Status Report for 3/27

Last week, I worked with Hiroko and mainly focused on connecting the ESP32 to AWS. I set up AWS IoT, DynamoDB, and Lambda so that I can receive data from the ESP32 via Lambda functions and store it in DynamoDB. I was also able to send data from my terminal to the ESP32 by using an API endpoint and curl. As a result, the website can now show the most recent status of the greenhouse by fetching data from DynamoDB.
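
For reference, the ingest path is roughly the following sketch (the table and field names are placeholders): an AWS IoT rule invokes the Lambda with the ESP32's JSON payload, and the handler writes one timestamped item into DynamoDB.

import time
from decimal import Decimal

import boto3

table = boto3.resource("dynamodb").Table("greenhouse_data")  # assumed table name

def lambda_handler(event, context):
    item = {
        "device_id": event.get("device_id", "esp32_1"),
        "timestamp": int(time.time()),
        # DynamoDB expects Decimal rather than float for numeric attributes
        "temperature": Decimal(str(event["temperature"])),
        "soil_1": Decimal(str(event["soil_1"])),
        "soil_2": Decimal(str(event["soil_2"])),
    }
    table.put_item(Item=item)
    return {"statusCode": 200}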

This week, I focused on improving the website UI a bit and also wrote a function so that the curl command can be sent from a Python script or the website backend, instead of running the command directly from the terminal.
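
The function is essentially the requests equivalent of the curl command (the endpoint URL and payload shape below are placeholders for our API Gateway route):

import requests

API_ENDPOINT = "https://example.execute-api.us-east-1.amazonaws.com/send"  # placeholder

def send_to_esp32(payload: dict) -> bool:
    # Equivalent to something like: curl -X POST <api-endpoint> -d '{"heater": 1}'
    resp = requests.post(API_ENDPOINT, json=payload, timeout=10)
    return resp.ok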

So far, I think I’m on schedule. I still need to figure out how to send back data efficiently to ESP32 but this can be done as we calibrate our sensors. Next week, I will be working on deploying the website by AWS EC2.

Kanon’s Status Report for 3/13

This week, I worked on practicing for the design review presentation, setting up DynamoDB, and writing a draft of the Design Review Report.

For the design review presentation, there was a lot to cover within each slide, so I had a hard time coming up with a script that fits within 12 minutes. In fact, I feel like I spoke a bit too fast during the presentation. However, looking back at the peer reviews, most of our classmates were happy with what we covered, so that was a relief.

Setting up DynamoDB was more complicated and took more time than I expected. There are multiple ways to set it up, and the official AWS documentation was sometimes really unhelpful, so I spent extra time searching for other documentation. I was eventually able to set up my database locally and get sample code running. However, there is also a way of using NoSQL locally with AWS, which I think is an easier way to visualize our database, so I might go that route next week.
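
For my own notes, the local setup boils down to something like this sketch (the table and key names are assumptions), pointing boto3 at DynamoDB Local's default port:

import boto3

dynamodb = boto3.resource(
    "dynamodb",
    endpoint_url="http://localhost:8000",  # DynamoDB Local's default port
    region_name="us-east-1",
    aws_access_key_id="dummy",             # local mode accepts dummy credentials
    aws_secret_access_key="dummy",
)

table = dynamodb.create_table(
    TableName="greenhouse_data",
    KeySchema=[{"AttributeName": "device_id", "KeyType": "HASH"},
               {"AttributeName": "timestamp", "KeyType": "RANGE"}],
    AttributeDefinitions=[{"AttributeName": "device_id", "AttributeType": "S"},
                          {"AttributeName": "timestamp", "AttributeType": "N"}],
    ProvisionedThroughput={"ReadCapacityUnits": 5, "WriteCapacityUnits": 5},
)
table.wait_until_exists()
table.put_item(Item={"device_id": "esp32_1", "timestamp": 0, "temperature": 21})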

I also spent some time drafting the Design Review Report by writing the Architecture and/or Principle of Operation section.

I think I am still on track with the schedule, but I also realized that setting up the AWS components may take a while. Next week, I will continue working on linking DynamoDB to our web application. I will also draft the Design Trade Studies section for the web application component so that we can discuss it together on Monday.

Kanon’s Status Report for 3/6

This week, I continued working on web application development. I implemented our own login/registration functionality using Django authentication. I had some trouble because Django's default authentication only accepts a username + password pair, not an email + password pair. Because users log in via Google OAuth2 using their email address, I might adjust the authentication settings so that both login flows use the same parameters.
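
One option I am considering is a custom authentication backend that looks the user up by email; this is just a sketch, not necessarily what we will ship:

from django.contrib.auth import get_user_model
from django.contrib.auth.backends import ModelBackend

class EmailBackend(ModelBackend):
    def authenticate(self, request, username=None, password=None, **kwargs):
        User = get_user_model()
        try:
            user = User.objects.get(email=username)  # the login form passes the email here
        except User.DoesNotExist:
            return None
        if user.check_password(password) and self.user_can_authenticate(user):
            return user
        return None

# settings.py would then list it:
# AUTHENTICATION_BACKENDS = ["myapp.auth_backends.EmailBackend",
#                            "django.contrib.auth.backends.ModelBackend"]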

I also did some more research on AWS credits. It seems like EC2 costs the most, so once I deploy, I will need to remind myself to turn off the EC2 instance whenever I'm not testing or using the website. Moreover, for notifications/alerts, it turns out there is a service called Twilio that provides an SMS API, so I'm going to use this API to send out notifications.
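
The notification script should end up looking roughly like this (the credentials, phone numbers, and message text are placeholders):

from twilio.rest import Client

client = Client("ACCOUNT_SID", "AUTH_TOKEN")  # values from the Twilio console

def notify_user(body: str, to_number: str) -> str:
    message = client.messages.create(
        body=body,
        from_="+15550000000",  # Twilio phone number (placeholder)
        to=to_number,
    )
    return message.sid

# e.g. notify_user("Soil moisture dropped below your target.", "+15551234567")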

Lastly, like the other team members, I worked on the presentation by coming up with a more detailed block diagram and solutions for the web application component. I will also be the one presenting next week, so I have been practicing the presentation too.

Next week, I will start setting up DynamoDB and try to link the dataset to the Django backend so that whenever a user changes a parameter/value on the website, that data will be sent to and stored in the database.
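
As a rough sketch of what that link might look like (the table and field names are assumptions, not our final schema), the Django view handling the settings form could write straight to DynamoDB with boto3:

import time
from decimal import Decimal

import boto3
from django.shortcuts import redirect

table = boto3.resource("dynamodb").Table("user_settings")  # assumed table name

def save_settings(request):
    # Store the user's new targets as one timestamped item
    table.put_item(Item={
        "user_id": str(request.user.id),
        "updated_at": int(time.time()),
        "target_temperature": Decimal(request.POST["temperature"]),
        "target_soil_moisture": Decimal(request.POST["soil_moisture"]),
    })
    return redirect("dashboard")  # placeholder URL name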