Malavika’s Status Report 12/4

This week was final presentations, and I presented for our group on Monday. I spent most of the week working on the final presentation, putting together slides and meeting with my teammates over Thanksgiving break to outline and fill them in. Sunday mostly entailed testing the product in the lab and getting videos for our presentation (as shown below). I got a video of two work stations turning on when controlled from the web server, as well as the work zones changing color on the web server's GUI. I also got a video of the lights on the GUI changing colors when the lights turned on due to sensed movement rather than the web server sending a signal. We also performed testing to illustrate how our product has met our requirements in terms of latency, accuracy, and distance.

Wednesday entailed watching more final presentations and filling out the peer review Google Forms. This week we plan on going into the lab on Wednesday to make our final video, do more testing, and have a full-scale MVP to demo as the final product. The web server is fully functional at this point and supports all the features previously mentioned, such as authentication, controlling the individual lights and sensors, setting the weight of each sensor from user input, displaying the current weights, and a GUI that shows the user exactly which zones are on.
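As a rough illustration of that control path (not our actual implementation), the web server mainly publishes short command strings over MQTT and updates its zone display from status messages sent back by the Pi. The pir/data/web topic comes from the 11/13 reports below; the broker address and the Pi-to-web topic here are assumptions:

```python
# Sketch only: publish STATION commands and track zone state for the GUI.
import paho.mqtt.client as mqtt

zone_state = {1: "off", 2: "off", 3: "off", 4: "off"}  # drives the GUI colors

def turn_on_station(client, station):
    client.publish("pir/data/web", f"STATION:ON:{station}")

def on_message(client, userdata, msg):
    # e.g. "STATION:ON:1" sent by the Pi when a zone's state changes
    parts = msg.payload.decode().split(":")
    if parts[0] == "STATION":
        zone_state[int(parts[2])] = parts[1].lower()

client = mqtt.Client()
client.on_message = on_message
client.connect("localhost")      # assumed broker address
client.subscribe("pi/status")    # hypothetical Pi-to-web topic
client.loop_start()
turn_on_station(client, 1)       # example: force work zone 1 on
```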

Ryan Status Report 12/4

This week I worked on our final presentation and sent out our survey to my peers to begin collecting data on user experience preferences and privacy concerns.

I also worked with the group on more testing and adjusting of our weights. As more survey responses come in, we will use them to guide our final weights. Our goal for this week is to do more testing, finish up our in-person demo, create our final video, and write the final report.

At this point in the project, I am proud of what we were able to accomplish this semester and we are in a good position to finish things up before the final demo.

Diva-Oriane Marty Status Report 12/4

The first goal of this past week was getting ready for the final presentation. I primarily worked on creating diagrams for our updated algorithm and hardware setup.

These diagrams will also go into our final report.

We finished up an initial round of testing and calibration last week, but we want to continue doing so in the coming week. We finalized the privacy concerns survey and sent it out this week. I also looked over the comments made on the design report and what we need to improve on for the final report.

Next week the primary goals are to finish up calibration and testing and to finish writing the final report. We also hope to receive sufficient responses from our survey.

Ryan Status Report 11/20

This week I worked on testing, refactoring our code to allow for faster testing and verification, and drafting a user survey to guide some of our design decisions. In addition, I soldered and created another “base station.”

(User Survey: https://docs.google.com/forms/d/1YM4OFE08eWwsJJnDHzXr_HDfEiTtsAuOjxWQ_ADB7e0/edit)

The testing I completed involved our first requirement: turning on the lights within 2 seconds of a person entering a section. This verifies that the end-to-end latency from signal detection, through computation, to signaling the lights is fast enough for our user. To capture times, I recorded myself on video and timed from first movement to the lights turning on for five trials. I obtained the following results: the average time is 1.028 s, and each trial is below 2 s.

Trial  Time
1      1.2 s
2      0.81 s
3      1.11 s
4      0.97 s
5      1.05 s

To assist with further testing, I refactored our code and began developing a script that adjusts our weights based on sensor values captured and written to a test file along with the desired light behavior. The script uses a gradient-descent-like algorithm to adjust the weights and validate defined test cases: the weights are modified slightly until (hopefully) the desired light behavior is produced for all test cases.
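A minimal sketch of that idea, assuming a simple comma-separated test file and a weighted-sum decision rule; the file name, format, threshold, and scoring rule are all assumptions rather than the actual script:

```python
# Sketch of a perturbation-based (gradient-descent-like) weight search.
import random

def load_cases(path):
    """Each line: pir_count,mic_count,expected_on (1 or 0) -- assumed format."""
    cases = []
    with open(path) as f:
        for line in f:
            if not line.strip():
                continue
            pir, mic, expected = line.strip().split(",")
            cases.append((int(pir), int(mic), int(expected)))
    return cases

def predict(pir, mic, w_pir, w_mic, threshold=10.0):
    return int(pir * w_pir + mic * w_mic >= threshold)

def accuracy(cases, w_pir, w_mic):
    return sum(predict(p, m, w_pir, w_mic) == exp for p, m, exp in cases) / len(cases)

def tune(cases, w_pir=1.0, w_mic=1.0, step=0.1, iters=500):
    best = accuracy(cases, w_pir, w_mic)
    for _ in range(iters):
        if best == 1.0:
            break
        # Perturb one weight at a time; keep the change if accuracy does not drop.
        dw_pir, dw_mic = random.choice([(step, 0), (-step, 0), (0, step), (0, -step)])
        cand = accuracy(cases, w_pir + dw_pir, w_mic + dw_mic)
        if cand >= best:
            w_pir, w_mic, best = w_pir + dw_pir, w_mic + dw_mic, cand
    return w_pir, w_mic, best

if __name__ == "__main__":
    cases = load_cases("test_values.txt")  # hypothetical file name
    print(tune(cases))
```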

This coming week I will continue testing, send out the survey once it is finalized to collect responses, and create another base station.

Malavika Status Report 11/20

This week was mostly spent integrating all parts of our system together and performing tests. The functionality of the website is complete, including bidirectional communication, which allows the user to turn one or more of the work zones on, turn off all lights, enable a security protection mode that turns off the microphones and motion sensors, and control the weighting of the sensors manually.

I added a range input element in the form of a slider, which allows users to control the percentage split between the PIR and microphone weights on their end and sends that information to the Raspberry Pi. Diva and I worked together to design the best way for changed weights to be applied in the system. We decided that there will be two base weights (one for the PIR and one for the microphone), which serve as the values by which the counter for each work station increments when a sensor reports a positive reading. Multiplying these base weights by the percentages the user sends to the Raspberry Pi when moving the slider changes the increment values; in this way, a PIR detection can be "worth more" and push the counter toward its threshold by a larger amount than a microphone detection, should the user specify the percentages that way.
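A minimal sketch of that weighting scheme, assuming illustrative base weights and a single on-threshold per station (the numbers and function names are placeholders, not our actual values):

```python
# Placeholder base weights and threshold for illustration only.
BASE_PIR_WEIGHT = 2.0
BASE_MIC_WEIGHT = 1.0
ON_THRESHOLD = 10.0

def increments(pir_percent):
    """pir_percent is the slider value (0-100): the share of weight given to PIRs."""
    mic_percent = 100 - pir_percent
    return (BASE_PIR_WEIGHT * pir_percent / 100.0,
            BASE_MIC_WEIGHT * mic_percent / 100.0)

def update_counter(counter, pir_detected, mic_detected, pir_percent):
    pir_inc, mic_inc = increments(pir_percent)
    if pir_detected:
        counter += pir_inc
    if mic_detected:
        counter += mic_inc
    return counter, counter >= ON_THRESHOLD  # second value: should the zone turn on?
```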

We also performed preliminary testing with three base stations in the 18-500 lab. I worked on some of the circuitry for detaching and attaching the LED lights to the breadboards so they can be set up as stations in the four work zones we will have (we currently have three). I also ordered another set of PIR sensors, as we were missing a fourth one for our final station.

The rest of the week primarily entails testing all of our edge and use cases and working on the final presentation, which I will be presenting. Diva and I spoke with Professor Yu on Wednesday about our testing plans, and he mentioned it would be useful to think about how to best present our testing data. We need to demonstrate the physical range of our sensors, so we plan on having plots with the distance from the center of the PIR in feet on the x-axis and the sensor value (logical 0 or 1) on the y-axis. We will also perform a similar series of tests for the microphones.
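As a sketch of the plot we have in mind (assuming matplotlib; the distances and readings below are placeholders to be replaced with measured values):

```python
# Placeholder range plot: logical sensor output vs. distance from the PIR.
import matplotlib.pyplot as plt

distances_ft = [0, 1, 2, 3, 4, 5, 6, 7, 8]   # distance from center of PIR (ft)
pir_readings = [1, 1, 1, 1, 1, 0, 0, 0, 0]   # placeholder logical outputs

plt.step(distances_ft, pir_readings, where="post")
plt.xlabel("Distance from center of PIR (ft)")
plt.ylabel("Sensor output (logical 0/1)")
plt.yticks([0, 1])
plt.title("PIR detection vs. distance (placeholder data)")
plt.savefig("pir_range.png")
```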

We also want to make a user survey to conduct behavioral testing of individuals' preferences and comfort level with data-gathering sensors such as microphones, and we plan on doing this in the upcoming week.

Diva-Oriane Status Report 11/20

I worked on getting two-way communication between the ESPs on the work stations and the Raspberry Pi. The lights are now turned on by a message sent to them over the Mosquitto (MQTT) protocol. I integrated the weight messages to and from the web app into the Raspberry Pi's base station code. When the weight is changed on the web app's slider, the Raspberry Pi changes the weights it uses for the PIR and mic data. The Raspberry Pi keeps track of its default values in case the web app switches back to auto mode.

WEIGHT:AUTO/Manual:PIR weight:Mic weight

The Raspberry Pi sends its current weights back to the web app when the web app switches back to auto mode. When it is in manual mode, the web app already knows the weight values.
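A hedged sketch of how that handler could look on the Pi, assuming paho-mqtt, illustrative default weights, and a hypothetical topic for messages back to the web app (none of this is our exact code):

```python
import paho.mqtt.client as mqtt

DEFAULT_PIR_WEIGHT = 2.0   # placeholder defaults, not our real values
DEFAULT_MIC_WEIGHT = 1.0
state = {"pir_w": DEFAULT_PIR_WEIGHT, "mic_w": DEFAULT_MIC_WEIGHT}

def handle_weight(client, payload):
    # Format: WEIGHT:AUTO/MANUAL:PIR weight:Mic weight
    parts = payload.split(":")
    if parts[1].upper() == "MANUAL":
        state["pir_w"], state["mic_w"] = float(parts[2]), float(parts[3])
    else:  # back to AUTO: restore defaults and report them to the web app
        state["pir_w"], state["mic_w"] = DEFAULT_PIR_WEIGHT, DEFAULT_MIC_WEIGHT
        client.publish("pi/status",  # hypothetical Pi-to-web topic
                       f"WEIGHT:AUTO:{state['pir_w']}:{state['mic_w']}")

def on_message(client, userdata, msg):
    payload = msg.payload.decode()
    if payload.startswith("WEIGHT:"):
        handle_weight(client, payload)

client = mqtt.Client()
client.on_message = on_message
client.connect("localhost")        # assumed broker on the Pi
client.subscribe("pir/data/web")   # topic the web app publishes to (11/13 report)
client.loop_forever()
```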

We were able to scale up to 3 base stations this week, but we are missing a PIR for the 4th base station.

We started some testing but still have more to get done before the final presentation after Thanksgiving. We did tests that allowed us to tune the thresholds for the microphones and the PIRs so that they pick up values at a reasonable level of sensitivity.

The tests that still need to be conducted:

-All combinations of people at the 4 base stations, i.e. people at 1, people at 1 & 2, people at 1 & 2 & 3, people at 1 & 2 & 3 & 4, people at 2 & 3, … (a sketch for enumerating these combinations follows this list)

-No movement tests

-No noise tests
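For reference, a small sketch that enumerates the occupancy combinations referred to above, assuming the stations are simply numbered 1 through 4; it is useful for generating a test checklist:

```python
# Enumerate every non-empty combination of occupied stations.
from itertools import combinations

stations = [1, 2, 3, 4]
for r in range(1, len(stations) + 1):
    for occupied in combinations(stations, r):
        print("People at stations:", occupied)
# 15 non-empty combinations in total; the empty case corresponds to the
# no-movement / no-noise tests.
```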

Ryan Status Report 11/13

This week I soldered and set up another base station with a PIR sensor and microphone, worked with Diva and Malavika to get messages sent from our web app to the Raspberry Pi base station, and did some initial testing with two base stations.

I tested by doing some work and moving around 0.5 m from one base station and also 1 m from that base station. This allows us to observe the sensor values when a person is inside our defined work station and when the person is outside of it. I wrote the values to text files to allow for backtesting, so we can tweak our weights with immediate feedback on accuracy. For the most part, our data looks promising: we are getting a lot more feedback from the station I am closest to. However, there is some noise from both sensors (false positives), so we'll have to decide how we'd like to handle this in order to maintain our turn-on latency.
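A minimal sketch of the kind of logging that enables this backtesting, assuming paho-mqtt and hypothetical topic and file names:

```python
# Record every sensor message with a timestamp so a run can be replayed
# later against different weights.
import time
import paho.mqtt.client as mqtt

LOG_PATH = "sensor_log.txt"  # hypothetical file name

def on_message(client, userdata, msg):
    # One line per message: timestamp, topic (which station/sensor), raw value.
    with open(LOG_PATH, "a") as f:
        f.write(f"{time.time():.3f},{msg.topic},{msg.payload.decode()}\n")

client = mqtt.Client()
client.on_message = on_message
client.connect("localhost")       # assumed broker on the Raspberry Pi
client.subscribe("sensors/#")     # hypothetical sensor topic hierarchy
client.loop_forever()
```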

This week my focus will be to add one more workstation and do some more testing so that I can work with Diva to optimize our weighting.

Malavika Status Report 11/13/21

This week we had the interim demos and showed the professors and TAs what we have so far of our MVP. On the web server side, I was able to illustrate the connection the web server makes to the Raspberry Pi and the messages it sends to it. For the later demo, I got bidirectional communication working, so the web server can also read messages sent from the Pi by subscribing to a separate topic dedicated to incoming communications. The web app publishes messages to the pir/data/web topic. The interaction can be seen in the images below.

The STATION:ON:1 command was received from the Raspberry Pi; currently, the web app simply appends the message to the div by adding the string as an HTML element. Ultimately, these messages will be parsed and this page will change the work zone user graphics accordingly.

Diva and I went into the lab yesterday to finalize our commands between the Pi and the server. We also performed testing for the microphones but could not find an appropriate sensitivity. We accounted for all edge cases, such as making sure the PIR was on while the mic was off when moving without making sound (and vice versa).

Diva-Oriane Marty Report 11/13

I was able to get the sensor data sent directly from two microcontrollers to the Raspberry Pi by the demo on Monday. We were also able to scale up to two base stations. Malavika and I worked on sending and receiving multiple kinds of messages between the web app and the Raspberry Pi base station. My role was to integrate the messages into the logic of the Raspberry Pi base station.

The messages we are able to send from the web app to the Raspberry Pi base station are the following (a parsing sketch follows the two lists below):

STATION:ON/OFF:Station Number

- Station Number: 1, 2, 3, 4, or 5 (all stations)
- ON – force the station to remain on regardless of the sensor data
- OFF – go back to just working with the sensor data

MICS:ON/OFF

- ON – use mic data
- OFF – ignore mic data

PIRS:ON/OFF

- ON – use PIR data
- OFF – ignore PIR data

WEIGHT:AUTO/MANUAL:PIR weight:Mic weight

- This is the only one not yet implemented.
- It allows the user to manually set the weights of the PIR and mic data if they wish.

The messages we are able to send from the Raspberry Pi base station to the web app are the following:

STATION:ON/OFF:Station Number

- Station Number: 1, 2, 3, or 4
- Sent from the Raspberry Pi base station to the web app when a station's on/off state changes.

WEIGHT:AUTO/MANUAL:PIR weight:Mic weight

- This is the only one not yet implemented.
- Sends the current weights of the PIR and mic data, either when prompted or when they change.
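As promised above, a hedged sketch of how the base station might dispatch the web app commands; the state variables and the "station 5 means all stations" convention follow the lists above, while the names and structure are assumptions:

```python
# Sketch of a command dispatcher for the web-app messages listed above.
forced_on = set()   # stations forced on by STATION:ON
use_mics = True
use_pirs = True

def handle_command(payload):
    global use_mics, use_pirs
    parts = payload.split(":")
    if parts[0] == "STATION":
        stations = [1, 2, 3, 4] if parts[2] == "5" else [int(parts[2])]
        for s in stations:
            # ON adds to the forced set; OFF returns the station to sensor control.
            (forced_on.add if parts[1] == "ON" else forced_on.discard)(s)
    elif parts[0] == "MICS":
        use_mics = parts[1] == "ON"
    elif parts[0] == "PIRS":
        use_pirs = parts[1] == "ON"
    elif parts[0] == "WEIGHT":
        pass  # weight handling is sketched in the 11/20 report above
```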

I was also able to get the lights working by adding a transistor.

Next week the main goals are to scale up to 4 base stations and to figure out the weights and placement of the sensors. We also need to send out the user privacy concern survey.

Ryan Status Report 11/06

This week I worked on integrating the communication between our sensors and the Raspberry Pi with the logic Diva is working on, and on assisting Malavika with tying in the web app as well. The integration with Diva is mostly complete, while the integration with Malavika will require some more work. We ran into some bugs, but we were mainly focused on a proof of concept and on verifying that we can in fact send messages from the web app to the Pi, which we have verified.

I soldered and set up two stations for our interim demo. Each station consists of an ESP8266 (WiFi) microcontroller, a microphone, and a PIR sensor. I programmed the ESP8266 to send its sensor data only when it changes. The idea behind this is to improve latency: the number of messages sent from all the microcontrollers is reduced, and the logic is distributed to the microcontrollers, which prevents constant polling in our threads. In addition, no information is lost, because we can assume a value stays the same until a new one arrives.
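The firmware itself runs on the ESP8266; purely as an illustration of the send-on-change idea, here is a Python sketch with placeholder read functions, topic names, and broker address (all assumptions, not the actual firmware):

```python
import time
import paho.mqtt.client as mqtt

def read_pir():
    # Placeholder: on the real station this is a digital read on the ESP8266.
    return 0

def read_mic():
    # Placeholder: on the real station this is the microphone's digital output.
    return 0

client = mqtt.Client()
client.connect("raspberrypi.local")   # assumed broker address

last = {"pir": None, "mic": None}
while True:
    for name, read in (("pir", read_pir), ("mic", read_mic)):
        value = read()
        if value != last[name]:       # publish only when the reading changes
            client.publish(f"station1/{name}", str(value))  # hypothetical topic
            last[name] = value
    time.sleep(0.05)
```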

Here is one of the work stations with our ESP8266 connected to the PIR and microphone sensors.

This console output shows the result of some initial testing with full integration over 90 seconds. In the first output, we were closer to workstation0, and our algorithm computed a score of 133 for it, whereas workstation1 had a score of 87. Next we tested the reverse and got scores of 79 and 140. We still need to test and adjust thresholds; however, it is very promising to see higher values corresponding to the workstation we were closer to.
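A rough sketch of the scoring idea behind those numbers: each workstation accumulates a weighted count of its positive sensor readings over the window, and the totals are compared against a threshold. The weights, threshold, and event format below are assumptions, and the example events are made-up placeholders:

```python
from collections import defaultdict

PIR_WEIGHT, MIC_WEIGHT, THRESHOLD = 2, 1, 100   # placeholder values

def score_window(events):
    """events: (station, sensor, value) tuples collected over the ~90 s window."""
    scores = defaultdict(int)
    for station, sensor, value in events:
        if value:  # only positive detections contribute to the score
            scores[station] += PIR_WEIGHT if sensor == "pir" else MIC_WEIGHT
    return {s: (total, total >= THRESHOLD) for s, total in scores.items()}

# Made-up events: the nearer station should accumulate the larger score.
events = ([("workstation0", "pir", 1)] * 40 + [("workstation0", "mic", 1)] * 25
          + [("workstation1", "pir", 1)] * 12 + [("workstation1", "mic", 1)] * 10)
print(score_window(events))
# -> workstation0 scores 105 (on), workstation1 scores 34 (off) with these placeholders
```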

This was a productive week and we seem to be close to our original schedule. This upcoming week, I plan to set up more work stations (increasing from 2 to 4). In addition, I will work to integrate more functionality from the web app that Malavika is working on. This will involve including more logic for the messages that will be sent from the web app in addition to the logic for sending messages from the Pi to the ESP8266 microcontrollers.