Weekly Progress Log

03.02 - 03.08

Team

Following the design presentation, we created our design proposal document, which outlined key details of our project.

We ordered our parts to begin the first tasks of our project schedule. The parts included the sensors we are testing (outlined in our design document) and the LoRaBug with the CC2650 LaunchPad.

Finally, we created our project's website to contain our weekly progress logs.

Raghav

  • Looked into LoRaBug documentation specifics to familiarize myself with the board
  • Working on acquiring a LoRaBug

Zeleena

  • Received the sensors; chose to use an Arduino for initial sensor readings
  • Ran a small Arduino program to read from the SparkFun Human Presence Sensor (AK9753) and observed output values in two settings: a human very close by, and a human on the other side of the room
        • Briefly analyzing the output, the temperature reading reflects a human being nearby, while larger-magnitude negative values on the directional channels indicate that a human is farther away
Output from Human Presence Sensor with Human Nearby


Output from Human Presence Sensor with Human Across the Room
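The kind of thresholding implied by the observations above can be sketched in a few lines. This is an illustrative Python sketch, not our test code: it assumes the AK9753's four directional IR channels have already been read into a list, and the threshold value is a placeholder that would need calibration.

```python
def classify_presence(ir_channels, threshold=500):
    """Rough presence check from the AK9753's four directional IR channels.

    ir_channels: list of four signed readings (ir1..ir4).
    A large-magnitude reading on any channel suggests a warm body in that
    channel's field of view; small magnitudes suggest an empty field.
    The threshold here is illustrative, not calibrated.
    """
    strongest = max(ir_channels, key=abs)
    if abs(strongest) < threshold:
        return "no human detected"
    direction = ir_channels.index(strongest) + 1  # channel number 1-4
    return f"human near channel {direction}"
```

Our brief analysis suggested the sign and magnitude of the channel values also carry distance information; this sketch only uses magnitude.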


Jacob

  • Created a device on OpenChirp and wrote a script to poll its transducer data.
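A polling script along these lines can be structured so the HTTP fetch is injectable, which makes the parsing testable without network access. Note the endpoint path and base URL below are assumptions about the OpenChirp REST API for illustration, not copied from the actual script:

```python
import json

API_BASE = "https://api.openchirp.io/apiv1"  # assumed base URL

def transducer_url(device_id):
    # Assumed endpoint layout; the real path may differ.
    return f"{API_BASE}/device/{device_id}/transducer"

def poll_transducers(device_id, fetch):
    """Poll a device's transducers.

    `fetch` is any callable that takes a URL and returns a JSON string;
    in production it would wrap an authenticated HTTP GET (e.g. via the
    `requests` library with OpenChirp credentials).
    """
    raw = fetch(transducer_url(device_id))
    return {t["name"]: t["value"] for t in json.loads(raw)}
```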

03.09 - 03.22

Spring Break: 03.09 - 03.18

Raghav


Zeleena

  • Ran a small Arduino program to read from the Ultrasonic Sensor (HC-SR04) and observed output values from objects at different distances
        • The sensor outputs an echo pulse width that can be converted to the appropriate distance units. The picture below shows the output on the Arduino's Serial Monitor as an object's distance from the sensor changes.
Output from Ultrasonic Sensor
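The conversion mentioned above is just the speed of sound applied to the round-trip echo time. A sketch of the arithmetic, in Python for brevity (the actual read code is an Arduino sketch):

```python
SPEED_OF_SOUND_CM_PER_US = 0.0343  # ~343 m/s at room temperature

def echo_to_cm(pulse_width_us):
    """Convert the HC-SR04 echo pulse width (microseconds) to distance in cm.

    The pulse covers the round trip to the object and back, hence the /2.
    """
    return pulse_width_us * SPEED_OF_SOUND_CM_PER_US / 2
```

For example, an echo of about 582 µs corresponds to roughly 10 cm.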


  • Revisiting the Human Presence Sensor, I ran an Arduino program that plots the sensor values on the Arduino's Serial Plotter. The spikes indicate someone moving close to the sensor, crossing a certain threshold. Further analysis of the output will be done in the future.
Human Presence Sensor Plotted
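The spikes in the plot can be picked out programmatically with a simple check on the change between consecutive samples. A minimal sketch, with an illustrative jump threshold rather than one derived from our data:

```python
def find_spikes(samples, jump_threshold=200):
    """Return indices where the signal jumps by more than jump_threshold
    relative to the previous sample -- a crude proxy for "someone just
    moved close to the sensor". Real use would need calibration and
    debouncing.
    """
    return [i for i in range(1, len(samples))
            if abs(samples[i] - samples[i - 1]) > jump_threshold]
```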



Jacob

  • Wrote the script to run commands against OpenChirp devices
  • Ran a small Arduino program to read from the PIR motion sensor and observed its output (binary)

03.23 - 03.29

Team

Discussed and narrowed down possible sensors for the final sensor configuration. If the cost isn't too high, one of the best configurations is two GridEyes, possibly with a PIR motion sensor for additional data. This conclusion is based on the consistency and ease of processing of the output values from preliminary testing, and on the need to expand beyond the GridEye's 60-degree field of view.

Raghav

  • Investigated LoRaBug + OpenChirp compatibility.


Zeleena

  • Brought up the remaining sensors we wanted to try, including the GridEye IR Thermal Camera and the Triple-Axis Gyroscope.
        • The GridEye was determined to be one of the best sensors for accurately detecting a human and outputting consistent values that can be easily processed; example output below.
Homeostasis: Output from GridEye at Room Temperature


Showing Directionality: Output from GridEye when at Top of Sensor (Matrix Values on the Left are Higher)


At a Distance: Output from GridEye at an Arm's Length Away (Higher Values Highlighted)
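The properties shown above (a warm region standing out against room temperature, with position information in the 8x8 grid) are what make the GridEye output easy to process. A sketch of the kind of processing involved, in Python with an illustrative detection margin in °C:

```python
def detect_person(grid, margin=2.0):
    """Given an 8x8 grid of GridEye pixel temperatures (list of lists),
    report whether any pixel stands out from the ambient background by
    `margin` degrees, and where the hottest pixel sits as (row, col).
    The margin value is illustrative, not calibrated.
    """
    pixels = [t for row in grid for t in row]
    ambient = sum(pixels) / len(pixels)
    hottest = max(pixels)
    if hottest - ambient < margin:
        return None  # homeostasis: nothing warmer than the room
    idx = pixels.index(hottest)
    return divmod(idx, 8)  # (row, col) hints at direction
```

The (row, col) result captures the directionality seen above: a person toward the top of the sensor shifts the hot pixels to one side of the matrix.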




Jacob

  • Investigated LoRaBug + OpenChirp compatibility.

03.30 - 04.05

Team

Our midpoint demo consisted of a proposed sensor configuration, an end-to-end system, and preliminary sensor data processing. The end-to-end system took PIR motion sensor values, processed them on the LoRaBug, and updated them on OpenChirp.

Raghav

  • Wrote a real-time task that reads GPIO input on the LoRaBug and sends it to the OpenChirp server.
  • Interfaced with the LoRaBug's buttons and LEDs


Zeleena

  • Began initial sensor data processing with a combination of the Grid-Eye and PIR motion sensor.
  • Reported sensor testing results and researched characteristics of sensors for our sensor comparison.


Jacob

  • Researched characteristics of sensors for our sensor comparison.

04.06 - 04.12

Team

Developed a plan for finalizing the sensor configuration given the feedback from the midpoint demo, and split up concurrent tasks moving forward.

Raghav

  • Acquired a SensorBug and began exploring its capabilities (assessing whether the accelerometer is an option) and developing with it


Zeleena

  • Began developing the web user interface including integration with OpenChirp API
  • Began first draft of the prototype casing using CAD


Jacob

  • Designed PCB
  • Sent order for more parts including more sensors

04.13 - 04.19

Team

Waiting on sensors and other parts to arrive in order to make progress on the sensor configuration.

Raghav

  • Explored GridEye integration with LoRaBug and SensorBug


Zeleena

  • Drafted and designed appearance of the web interface
  • Worked on the UI in HTML/JavaScript
  • Began integrating Python scripts with UI


Jacob

  • Worked on PCB design including accelerometer

04.20 - 04.26

Team

Finalized sensor configuration

Raghav

  • Worked on the embedded systems task structure to read the number of interrupts from the PIR and transmit the data to the LoRaBug
  • Worked on using the accelerometer
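The interrupt task itself is embedded code on the LoRaBug, but the counting logic is simple enough to sketch here in Python (keeping to one language for these notes). Assuming the PIR line is sampled as a binary stream, the interrupt count corresponds to the number of rising edges in a window:

```python
def count_rising_edges(samples):
    """Count low-to-high transitions in a binary PIR sample stream --
    the same quantity an interrupt-driven GPIO handler would accumulate
    before handing the count off for transmission.
    """
    return sum(1 for prev, cur in zip(samples, samples[1:])
               if prev == 0 and cur == 1)
```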


Zeleena

  • Worked with Jacob to convert the Python polling script to JavaScript, to be run on the client side
  • Finished the UI to the point where table colors on the webpage change based on the values polled every x seconds from OpenChirp. The map is currently just our demo table, since the system is not deployed yet
  • Helped Jacob create prototype casing


Jacob

  • Created prototype casing
  • Helped convert the Python OpenChirp polling script to JavaScript for compatibility with the UI
  • Worked with Raghav on making interrupts work with the LoRaBug

04.27 - 05.03

Team

Ordered and received parts for the final demo

Raghav

  • Working with LoRaBug
  • Working on interrupt code


Zeleena

  • Soldered PCB
  • Polished UI


Jacob

  • Completed casing
  • Added sensors to the final product