Status Update 11/3/2018

Aayush:

  • I developed the first few screens of the mobile application.
  • I set up the Raspberry Pi, installed the necessary software, and connected it to the network.
  • Downloaded OpenCV on the Pi and loaded the eye detection algorithm (see the sketch after this list).
  • Made sure that the Raspberry Pi camera works.
  • Also took pictures of the doll and tested the eye detection algorithm on them to ensure things are set for the midpoint demo.
  • Going forward, the plan is to make the required tweaks to the eye detection algorithm so that it works with the Raspberry Pi camera.
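A minimal sketch of the eye-detection step in Python, assuming OpenCV's stock Haar eye cascade and a saved test picture of the doll (doll.jpg is a hypothetical filename); the actual detector and parameters used with the Pi camera may differ.

    import cv2

    # Load OpenCV's bundled Haar cascade for eyes (an assumption; the tuned
    # detector on the Pi may use a different cascade or parameters).
    eye_cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_eye.xml")

    def detect_eyes(frame):
        """Return bounding boxes (x, y, w, h) of detected eyes in a BGR frame."""
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        return eye_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

    if __name__ == "__main__":
        # Run the detector on a saved picture of the doll and mark the hits.
        img = cv2.imread("doll.jpg")  # hypothetical test image path
        for (x, y, w, h) in detect_eyes(img):
            cv2.rectangle(img, (x, y), (x + w, y + h), (0, 255, 0), 2)
        cv2.imwrite("doll_eyes.jpg", img)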

Angela:

  • I worked on fine-tuning the parameters of the sleep/wake algorithm using my sleep and wake accelerometer data. Currently, the sleep/wake algorithm is able to correctly classify my data using a majority vote over acc_x, acc_y, and acc_z (a sketch of this follows the list). There is occasional misclassification, especially at the beginning and end of sleep cycles, but since I am not sure exactly when I fall asleep and wake up, this does not seem too significant.
  • I worked on an audio detection algorithm that is similar to the sleep/wake one: a convolution with a box signal. If a certain number of samples are higher than some threshold, the result is classified as crying (or talking) rather than sleeping (see the second sketch below). This will be used in conjunction with the accelerometer data; the goal is for these two components to be the main decision-making sources when the lights are off at night. I am still looking for audio of babies crying to test it on.
  • I worked with Aayush to get the Raspberry Pi running. I registered it with the CMU wifi system, so it should be able to use CMU wifi. We will be working on running our code on it before the midpoint demo.
  • Goal: Integrate the programs onto the Raspberry Pi and make sure they work.
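A minimal sketch of the sleep/wake classification in Python with NumPy. The per-axis step here (rectify the motion, smooth it with a box signal, threshold it) is an assumption about the exact form of the algorithm, and the window length and motion threshold are placeholders rather than the tuned values; the majority vote over acc_x, acc_y, and acc_z matches the description above.

    import numpy as np

    WINDOW = 50             # box-signal length in samples (placeholder value)
    MOVE_THRESHOLD = 0.15   # smoothed-motion level above which an axis votes "wake"

    def axis_votes_wake(acc_axis, window=WINDOW, threshold=MOVE_THRESHOLD):
        """One axis votes "wake" if its smoothed motion is mostly above the threshold."""
        motion = np.abs(acc_axis - np.mean(acc_axis))     # rectified deviation from rest
        box = np.ones(window) / window                    # box signal for the convolution
        smoothed = np.convolve(motion, box, mode="same")  # smoothed motion envelope
        return np.mean(smoothed > threshold) > 0.5

    def classify_sleep_wake(acc_x, acc_y, acc_z):
        """Majority vote across the three axes: "wake" if at least two vote wake."""
        votes = sum(axis_votes_wake(a) for a in (acc_x, acc_y, acc_z))
        return "wake" if votes >= 2 else "sleep"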
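The audio check uses the same idea: convolve the rectified signal with a box signal and count how many samples of the resulting envelope sit above a threshold. The box length, amplitude threshold, and required sample count below are illustrative placeholders, not the tuned values.

    import numpy as np

    BOX_LEN = 1024           # box-signal length (placeholder)
    AMP_THRESHOLD = 0.2      # normalized loudness threshold (placeholder)
    MIN_LOUD_SAMPLES = 4000  # samples above threshold needed to call it crying (placeholder)

    def classify_audio(samples):
        """Classify an audio window as "crying" (or talking) vs "sleeping"."""
        box = np.ones(BOX_LEN) / BOX_LEN
        envelope = np.convolve(np.abs(samples), box, mode="same")  # smoothed loudness
        loud = np.sum(envelope > AMP_THRESHOLD)
        return "crying" if loud > MIN_LOUD_SAMPLES else "sleeping"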

Priyanka:

  • I was able to get the Bluetooth module to work. The problem seemed to be that the Teensy was not connected to ground, so the Bluetooth module was able to receive data but not send it.
  • I am able to connect the accelerometer to the Teensy and read the acceleration values g_x, g_y, and g_z.
  • I am able to connect the pulse monitor and temperature sensor to the Teensy.
  • As of now, the Teensy takes the data from the accelerometer and pulse sensor and builds a string with starting and ending bits to send via Bluetooth.
  • The parsing of this info on the computer side is not correct yet. While the information being sent is correct and the computer is able to identify the starting and ending bits of the information array (and hence separate out the different pieces of information correctly), the values themselves are not being parsed correctly: instead of integers representing acceleration in the different directions and the heart rate, we see garbage characters. A parsing sketch follows this list.
  • Goal: Correct the parsing of the info on the computer side.
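A minimal computer-side parsing sketch using pyserial. The frame format here (a '<' start marker, comma-separated decimal fields, a '>' end marker) and the port name /dev/rfcomm0 are assumptions for illustration; the real markers and field order are whatever the Teensy actually sends. The point of the sketch is to collect the bytes between the markers and decode them into integers explicitly, rather than printing raw bytes, which is one common cause of "weird characters".

    import serial  # pyserial

    PORT = "/dev/rfcomm0"  # hypothetical Bluetooth serial port name
    BAUD = 9600            # placeholder baud rate

    def read_frame(ser):
        """Read one <...> frame and return its fields as a tuple of ints."""
        # Skip bytes until the start marker.
        while ser.read(1) != b"<":
            pass
        # Collect payload bytes until the end marker.
        payload = bytearray()
        while True:
            b = ser.read(1)
            if b == b">":
                break
            payload.extend(b)
        # Decode the payload as text and convert each field to an integer,
        # e.g. b"12,-3,98,140" -> (12, -3, 98, 140).
        return tuple(int(field) for field in payload.decode("ascii").split(","))

    if __name__ == "__main__":
        with serial.Serial(PORT, BAUD) as ser:
            g_x, g_y, g_z, heart_rate = read_frame(ser)
            print(g_x, g_y, g_z, heart_rate)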
