Echo Gao’s Status Update for 11/21

This week I mainly worked on the hardware, namely how to mount the camera onto the robot so that it keeps the person in its view. I used SolidWorks to design the basic structure of the camera bracket and then exported the file for 3D printing. The 3D printer I used at home, a Longer 3D printing transparent resin, is not as accurate as the one at school, but due to COVID-19 I decided to work from home. The main constraint of this printer is that the shaft diameter and attachment points cannot be adjusted based on their locations on the object. As a result, on my first test print the shaft diameter was too small and the entire structure got distorted and shifted. On the second test print I raised this parameter, but then the supports became harder to peel off. Each test print takes around 1-3 hours depending on the support settings. After peeling off the supports, I used sandpaper and various other tools to shape the two handles on the bracket so that they fit the holes on our USB camera.

Yuhan Xiao’s Status Update for 11/14

This week I helped Echo and Peizhi connect with and expand on the code snippet I wrote last week. The snippet receives and deletes messages containing alarm data from the web app server in real time. For now it is live at http://3.137.164.67:3000/ (only when I spin up the server). Echo and Peizhi expanded on the snippet so it can initiate the robot sequence when a message is received.

I helped debug authentication and connection issues with AWS SQS when running the snippet on the Pi. I also helped modify the API endpoints to match the requirements of the Create 2 Open Interface, so the data received on the Pi can be used directly for ingestion.

I set up IAM users for both Echo and Peizhi, just in case they need to access web server or SQS console when they are debugging in the future.

I tried to move the web server from an Amazon Linux machine to Ubuntu in an attempt to deploy the website so it is always live, using Nginx, PM2, etc. After a few hours of debugging it still did not work; it might have something to do with my project file structure. I may look into other services that help deploy a MERN stack or Node.js application, or readjust my overall file structure.

I am on schedule.

For system design changes: I am considering switching MongoDB out for Redshift/RDS/Aurora. Right now I have a fully working implementation using MongoDB, but I am concerned it might be overkill given the simple structure and light load of the data we are working with. Hence I am considering switching from a NoSQL to a SQL database, and using a database from the AWS family like Redshift might make integration easier. I could even just use a CSV file to store the data.
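To illustrate how simple the flat-file option could be, here is a minimal sketch of a CSV-backed alarm store using only the Python standard library. The file path and field names are hypothetical, not the app's actual schema:

```python
import csv, os

# Hypothetical field names for one alarm record; illustrative only.
FIELDS = ["day", "time", "ringtone"]

def append_alarm(path, alarm):
    # Append one alarm record, writing a header row if the file is new.
    new_file = not os.path.exists(path) or os.path.getsize(path) == 0
    with open(path, "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if new_file:
            writer.writeheader()
        writer.writerow(alarm)

def load_alarms(path):
    # Read all alarm records back as a list of dicts.
    with open(path, newline="") as f:
        return list(csv.DictReader(f))
```

This would trade away concurrent access and querying, which is probably fine at our data volume but worth keeping in mind.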

This week (week of 11/9 – 11/15):

  • prepared for interim demo on Wednesday
    • cleared and prepared sample alarm and schedule data
  • deployed current implementation to EC2 (Ubuntu)
  • tested the code snippet on the Pi
  • supported Echo and Peizhi in the integration of the code snippet into the main program running on Pi
  • set up access for Echo and Peizhi for easier debugging

Next week (week of 11/16 – 11/22):

  • continue testing communication between Pi and web app
  • modify the communication code snippet on the RPi so it writes to files instead of printing to stdout (to keep a record of messages and when they are received, to validate the reliability of the alarm scheduling)
  • implement alarm and ringtone deletion
  • research and implement the database change
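The stdout-to-file change mentioned above could be a small helper like the following sketch; the record format and log path are assumptions, not the actual snippet's design:

```python
import json, time

def record_message(body, path="messages.log"):
    # Append the raw message body with a receive timestamp, one JSON
    # object per line, so delivery times can be audited afterwards.
    entry = {"received_at": time.strftime("%Y-%m-%d %H:%M:%S"), "body": body}
    with open(path, "a") as f:
        f.write(json.dumps(entry) + "\n")
```

One-line-per-message JSON keeps the log trivially parseable when validating the alarm schedule later.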

Week of 11/23 – 11/29:

  • continue testing communication between Pi and web app
  • implement ringtone creation by importing MIDI files

Team Status Update for 11/14

Echo & Peizhi

Implemented a script on the Raspberry Pi and achieved communication between the back end of the web application and the Raspberry Pi. The Raspberry Pi will now receive a message from the back end at the user-specified alarm time, and will also receive and play the user-specified ringtone after activation.

Implemented wall detection in our final script. The robot will halt and get ready for the next request from the server after detecting a wall based on our own algorithms.

The portable power supply has been tested and works, so we can now put everything onto the robot and start final optimization and testing.

We are on schedule.

Yuhan

This week (week of 11/9 – 11/15):

  • prepared for interim demo on Wednesday
    • cleared and prepared sample alarm and schedule data
  • deployed current implementation to EC2 (Ubuntu)
  • tested the code snippet on the Pi
  • supported Echo and Peizhi in the integration of the code snippet into the main program running on Pi
  • set up access for Echo and Peizhi for easier debugging

Next week (week of 11/16 – 11/22):

  • continue testing communication between Pi and web app
  • modify the communication code snippet on the RPi so it writes to files instead of printing to stdout (to keep a record of messages and when they are received, to validate the reliability of the alarm scheduling)
  • implement alarm and ringtone deletion
  • research and implement the database change

I am on schedule.

Problems: refer to personal update.

Echo Gao’s Status Update for 11/14

This week we finished: 1. integrating the Raspberry Pi with the web app, 2. adding the final stop motion (wall encountered) to our program, and 3. the in-class demo.

We coordinated with Yuhan on how the Raspberry Pi and her website would interact. The Raspberry Pi is now always ready to accept messages from the web. Users can set an alarm time and ringtone/song on our AWS website. At the scheduled time, the server sends the specified ringtone to our Raspberry Pi. The Pi parses the message, executes a shell command to wake the robot up, plays the song, and runs.
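The parse-then-execute step described above might look roughly like this sketch; the message field names and the "aplay" player command are assumptions, not our exact code:

```python
import json, subprocess

def build_play_command(raw_message):
    # Parse the alarm message from the server; the field names here are
    # assumptions, not the project's actual message schema.
    data = json.loads(raw_message)
    # "aplay" stands in for whatever player command the Pi actually invokes.
    return ["aplay", data["ringtone"]]

def handle_alarm(raw_message):
    # On the Pi this launches the ringtone player as a subprocess.
    subprocess.run(build_play_command(raw_message), check=True)
```

Splitting command construction from execution also makes the parsing logic easy to unit-test without a speaker attached.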

(set up time on web)

(received message from server)

This week we also integrated the “stop action” algorithm we developed last week into our main algorithm. Now the robot is fully functional. On the hardware side, the power bank we ordered for the Raspberry Pi was also delivered, so the Pi can now be fully mounted on top of the robot. We have not yet found a way to fix the camera in a better and more elegant manner.

Next week, we will start testing, optimizing the algorithms, and measuring success rates.

Yuhan Xiao’s Status Update for 11/7

As planned last week, this week I

  1. implemented the backend of the web app, using
    1. MongoDB – for storing ringtones and alarm schedules permanently
    2. SQS – for sending data to the Pi when an alarm is created
    3. node-cron library (Node.js) – for scheduling the data sending mentioned above at the alarm time (and repeating it every week)
  2. for the frontend, switched the data store from local storage to an actual cloud database, using axios;
  3. wrote a code snippet that receives simple data from the web app backend in real time, to be incorporated into the main program deployed on the Pi.

Code for the web app is here and the code snippet deployed on the Pi is here (private repository, let me know if you need access).

I tested my implementation by scheduling an alarm through my frontend about one minute later than the current time, e.g. at 10pm. I verified that the alarm was stored permanently by checking MongoDB Atlas; by the alarm, I mean the day, time and ringtone associated with it. I also verified that the “alarm schedule page” reflects an updated schedule with the new alarm. After about one minute, a message with the alarm data was sent from the web app backend to the SQS queue at exactly 10pm, and was received a few seconds later by the code snippet, also running on my machine.

One change to my system design is that I will be scheduling alarms on the web app server instead of the Pi, since I noticed the node-cron library supports repeating jobs every week, which is exactly the frequency of a defined communication between the web app and the Pi. The library also seems to have a solid user base and well-written documentation.

The coding turned out to be a bit more intensive than planned, so I did not get to test the snippet on the Pi, or to verify that the same alarm reliably fires at the same time one week later. But overall I am mostly on schedule.

In the next few weeks before the final demo, I plan to set up a testing/measuring workflow (i.e. reliability/robustness/latency of communication between the web app and the Pi), and to add more operations supported by the web app (e.g. alarm/ringtone deletion, ringtone creation by importing MIDI files).
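For the latency part of that workflow, one simple approach is to timestamp each message on send and compare on receive. A sketch, with an assumed message format (this presumes sender and receiver clocks are in sync, which holds when testing on one machine):

```python
import json, time

def make_timed_message(payload):
    # Sender side: embed a wall-clock send time alongside the payload.
    return json.dumps({"sent_at": time.time(), "payload": payload})

def measure_latency(raw_message):
    # Receiver side: latency is receive time minus the embedded send time.
    msg = json.loads(raw_message)
    return time.time() - msg["sent_at"]
```

Logging these deltas over many alarms would give the reliability/latency numbers needed for the final report.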

This week:

  • implemented backend with MongoDB
    • web app data pulled from & stored to the cloud
  • implemented communication between web app and Pi with SQS (message queue)
    • send messages on the web app backend (JavaScript, scheduled by node-cron), receive and delete messages on the Pi (Python)
    • did not test it on the Pi, used the local machine instead

Next week (week of 11/9 – 11/15):

  • prepare for interim demo on Wednesday
    • deploy current implementation to EC2
    • clear and prepare sample alarm and schedule data
  • continue testing communication between Pi and web app
    • test the code snippet on the Pi
    • handle connection errors gracefully, especially for receiving messages
  • support Echo and Peizhi in the integration of the code snippet into the main program running on Pi
    • figure out a way to host the web app with minimum downtime and maximally secure access to EC2, MongoDB and SQS for Echo and Peizhi, for easier development and debugging (e.g. maybe setting up separate user accounts for them)

Week of 11/16 – 11/22:

  • continue testing communication between Pi and web app
  • implement alarm and ringtone deletion

Week of 11/23 – 11/29:

  • continue testing communication between Pi and web app
  • implement ringtone creation by importing MIDI files

Team Status Update for 11/7

Echo & Peizhi

This Week:

  • Installed heatsinks and a 3.5-inch screen mounted in a shroud for our Raspberry Pi 3
  • Implemented wall detection using sensor data and a self-developed algorithm
  • Optimized the obstacle detection and avoidance algorithm

Problems:

  • Purchased the wrong product: expected a power bank, turned out to be a power adaptor
  • Captured video does not fit on the 3.5-inch screen, even with a 100×150 display window

Next week:

  • Integration of the web application and hardware platform
  • Further optimization & mounting the Pi and camera onto the base
  • Purchase a power bank

We are on schedule.

Yuhan

This week:

  • implemented backend with MongoDB
    • web app data pulled from & stored to the cloud
  • implemented communication between web app and Pi with SQS (message queue)
    • send messages on the web app backend (JavaScript, scheduled by node-cron), receive and delete messages on the Pi (Python)
    • did not test it on the Pi, used the local machine instead

Problems: Refer to my personal update.

Next week:

  • prepare for interim demo on Wednesday
    • deploy current implementation to EC2
    • clear and prepare sample alarm and schedule data
  • continue testing communication between Pi and web app
    • test the code snippet on the Pi
    • handle connection errors gracefully, especially for receiving messages
  • support Echo and Peizhi in the integration of the code snippet into the main program running on Pi
    • figure out a way to host the web app with minimum downtime and maximally secure access to EC2, MongoDB and SQS for Echo and Peizhi, for easier development and debugging

I am on schedule. For the plans for the next few weeks, refer to my personal update.

Peizhi Yu’s Status Update for 11/7

This week we mainly dealt with: 1. mounting the hardware pieces together, 2. implementing the stop action for the robot, 3. adjusting the size of the camera display on the LCD screen, and 4. fixing the sensor data error we encountered last week.

We started by adding heat sinks to the Raspberry Pi and putting it into a case. One mistake: the power pack we ordered was not what we expected, so we needed to reorder a new one, which dragged us a bit behind our intended schedule. (Without the power pack, the Raspberry Pi cannot be fully mounted on top of the robot, but everything else is now in place.) Next, we spent an unplanned amount of time implementing the final stop action for the robot: when the robot hits a wall, it should turn off. There is a “wall detection signal” in its sensor packet, which we thought could be used to detect the wall accurately, but that built-in function does not work as expected. So we needed a way for the robot to distinguish between an obstacle and a wall: if an obstacle is detected, the robot should rotate until the obstacle is out of view and continue moving; if a wall is encountered, the robot should shut down. By looking at the sensor values returned from the robot, we found that when a wall is detected, 4 out of the 6 sensor values are greater than 100. The best approach was to take the median of all 6 sensors: if this number is greater than 50, we say a wall is detected.
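The median heuristic described above can be sketched as follows; the sensor access is stubbed out, and the threshold is the one quoted in the text:

```python
from statistics import median

WALL_THRESHOLD = 50  # median cutoff quoted above

def is_wall(sensor_values):
    # A high median across all six readings indicates a wall; a single
    # spiked reading (an isolated obstacle) leaves the median low.
    return median(sensor_values) > WALL_THRESHOLD
```

The median makes the check robust to one or two noisy readings, which is exactly why it separates a wall (most sensors high) from a lone obstacle (one sensor high).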

Last week, the problem we encountered was that if we called the get_sensor function excessively, the robot sometimes blew up and returned unintended values that messed up our entire program. We solved this by contacting iRobot Create 2’s technical support; we followed the instructions provided, and our robot now seems to be working fine.

 

Next week, we will have the robot fully assembled with the Raspberry Pi mounted on top, giving our product a cleaner and more elegant look. We will also coordinate with Yuhan on connecting the Raspberry Pi with her website to set the alarm time and download the user-specified ringtone.

Echo Gao’s Status Update for 11/7

This week we mainly dealt with: 1. mounting the hardware pieces together, 2. implementing the stop action for the robot, 3. adjusting the size of the camera display on the LCD screen, and 4. fixing the sensor data error we encountered last week.

We started by adding heat sinks to the Raspberry Pi and putting it into a case. One mistake: the power pack we ordered was not what we expected, so we needed to reorder a new one, which dragged us a bit behind our intended schedule. (Without the power pack, the Raspberry Pi cannot be fully mounted on top of the robot, but everything else is now in place.) Next, we spent an unplanned amount of time implementing the final stop action for the robot: when the robot hits a wall, it should turn off. There is a “wall detection signal” in its sensor packet, which we thought could be used to detect the wall accurately, but that built-in function does not work as expected. So we needed a way for the robot to distinguish between an obstacle and a wall: if an obstacle is detected, the robot should rotate until the obstacle is out of view and continue moving; if a wall is encountered, the robot should shut down. By looking at the sensor values returned from the robot, we found that when a wall is detected, 4 out of the 6 sensor values are greater than 100. The best approach was to take the median of all 6 sensors: if this number is greater than 50, we say a wall is detected.

Last week, the problem we encountered was that if we called the get_sensor function excessively, the robot sometimes blew up and returned unintended values that messed up our entire program. We solved this by contacting iRobot Create 2’s technical support; we followed the instructions provided, and our robot now seems to be working fine.

 

Next week, we will have the robot fully assembled with the Raspberry Pi mounted on top, giving our product a cleaner and more elegant look. We will also coordinate with Yuhan on connecting the Raspberry Pi with her website to set the alarm time and download the user-specified ringtone.

Yuhan Xiao’s Status Update for 10/31

This week I did some research on (1) how our remote web app server can communicate with the Pi (how alarm data is passed from one to the other), and (2) how alarm data is stored. As a result, I decided to make some adjustments to our current system design: (1) AWS SQS is used to send data from the remote server to the Pi, and (2) cron jobs will be scheduled on the Pi instead of the remote web app server.

My reason for change (1) is that MongoDB alone is not sufficient for real-time communication between the Pi and the remote server; sure, when a user creates a new alarm on the frontend, the data can be sent to the backend and stored in MongoDB immediately, but MongoDB does not know if and which alarm was created in real time, and hence cannot pass the newly created alarm data to the Pi in real time. This means that instead of the Python Requests library on the Pi, we will use the AWS SQS Python SDK, and on the remote server, the AWS SQS JavaScript (Node.js) SDK.

While change (1) is absolutely necessary, change (2) is more up for debate and can be reverted later. I made this change because it eliminates the potential latency between the web app and the Pi whenever an alarm starts. We did not initially plan to schedule cron jobs on the Pi because we wanted to make sure the Pi had sufficient computing power to run the CV code efficiently; I checked that a cron job does not use much in the way of resources (nor does the connection with SQS), and since only one cron job is scheduled per regular alarm, the Pi should do fine.

I have some pseudocode for this approach at the end of the update.

Since I spent more time on research and on the ethics readings, plus big assignments from other classes, I only wrote some pseudocode instead of actual code. As a result, I am running a little behind schedule this week, but I will make up for it next week.

Next week I plan to turn the pseudocode into actual code, with the exception of the addToCron() call on the last line of the Pi pseudocode.
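For reference, addToCron() might eventually build a standard crontab entry from the alarm data. A sketch, assuming the minute/hour/day-of-month/month/day-of-week crontab field order (the helper and its arguments are hypothetical):

```python
def crontab_line(alarm_time, day_of_week, command):
    # alarm_time like "07:30"; day_of_week 0-6 (Sunday = 0).
    # Leaving day-of-month and month as "*" gives the weekly repetition
    # the alarm needs.
    hour, minute = alarm_time.split(":")
    return "%s %s * * %d %s" % (minute, hour, day_of_week, command)
```

The returned line would then be appended to the Pi user's crontab.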

On Pi:

import boto3, json, logging

sqs = boto3.client('sqs')
queue_url = sqs.get_queue_url(QueueName='QUEUE_NAME')['QueueUrl']
while True:
    resp = sqs.receive_message(QueueUrl=queue_url, WaitTimeSeconds=20)
    for msg in resp.get('Messages', []):
        alarmData = json.loads(msg['Body'])
        logging.info(alarmData)
        sqs.delete_message(QueueUrl=queue_url, ReceiptHandle=msg['ReceiptHandle'])
        addToCron(alarmData["time"], alarmData["ringtone"])
...

On the remote server (web app backend):

// mongoDB init
var mongoose = require("mongoose");
mongoose.Promise = global.Promise;
mongoose.connect("mongodb:<LOCAL HOST>");
var alarmSchema = new mongoose.Schema({
  time: String,
  ringtone: [Number]
});
var Alarm = mongoose.model("Alarm", alarmSchema);

// SQS setup
var AWS = require('aws-sdk');
AWS.config.update({region: 'REGION'});
var sqs = new AWS.SQS({apiVersion: '2012-11-05'});

app.post("/add-alarm", (req, res) => {
  var alarmData = new Alarm(req.body);
  alarmData.save() // alarm saved for permanent storage
    .then(item => {
      var msg = buildMsg(alarmData);
      sqs.sendMessage(msg, function(err, data) { // alarm added to queue, which Pi can read from
        if (err) { throw new Error('Alarm data is not passed to Pi'); }
      });
      res.send("alarm created");
    })
    .catch(err => {
      res.status(400).send("unable to create the alarm");
    });
});

const buildMsg = (alarmData) => {
  var msg = {
    MessageBody: JSON.stringify(alarmData), // payload
    QueueUrl: "SQS_QUEUE_URL"
  };
  return msg;
};
...

Peizhi’s Status Report for 10/31

Hardware portion almost finished:

After replacing our broken webcam with a USB camera, we continued our testing. This week, we combined our obstacle-avoiding algorithm with our human-avoiding algorithm. Now our alarm clock robot is fully functional, with all of our requirements reached. The complete algorithm is as follows: at its idle stage, the robot rotates in place while playing the song, as long as no person is detected through the camera. As soon as a person is detected, the robot starts moving away from him/her while trying to keep the person centered in the camera frame. If obstacles are encountered, however, the robot immediately starts rotating in place and moves in a direction where its sensors detect no obstacles. From that point on, the robot looks at its camera again to find the person: if the person is still in view, it performs the above action; if not, it rotates in place until it finds a person again. As one might see, avoiding obstacles takes priority over avoiding the user. The “finish action/alarm turn-off action” will be done next week; that is, when the robot runs into a wall, the entire program finishes. A problem we encountered is that the distance information we receive from the iRobot Create 2’s sensors for detecting nearby obstacles sometimes blows up to completely inaccurate numbers, in which case our entire program runs into undefined behavior and crashes. We have not yet found a solution to this problem.
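The priority logic in the paragraph above (obstacles first, then person tracking, else idle spinning) can be sketched as a small decision function; the action names are stand-ins for our actual routines:

```python
def next_action(obstacle_detected, person_visible):
    # Obstacle avoidance takes priority over avoiding the user; with
    # neither detected, the robot spins in place searching for a person.
    if obstacle_detected:
        return "rotate_away_from_obstacle"
    if person_visible:
        return "move_away_keeping_person_centered"
    return "self_rotate_search"
```

Keeping the priority order in one function like this makes it easy to test the control policy separately from the sensor and camera code.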

(avoiding both human and obstacles demo)

 

(avoiding human only demo)

Integration with Webapp:

Now we are starting to work on sockets and on how the Raspberry Pi communicates with the web app from Yuhan’s website. Next week, we will work on letting the web app control the overall on/off of our program, and on how the web app sends the ringtone and alarm time to the Raspberry Pi.