This week, I spent a lot of time at our testing site (my teammate's house) and managed to accomplish a number of things related to the actual image-transfer protocol.
1) Capturing images at timed intervals and preprocessing them
This base code was written and tested. It takes an image of the clutter zone every 5 minutes; the interval can be changed according to preference. It then crops the image so that only the zones we care about are sent to the hub for processing.
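As a rough sketch of this behavior (assuming the picamera and Pillow libraries; the crop box, resolution, and file paths below are placeholders, not our actual values):

```python
import time
from picamera import PiCamera   # standard camera library on the RPi Zero W
from PIL import Image           # Pillow, used here for cropping

CAPTURE_INTERVAL = 5 * 60             # 5 minutes; change to preference
CROP_BOX = (100, 200, 900, 700)       # (left, upper, right, lower) -- placeholder zone

camera = PiCamera()
camera.resolution = (1280, 720)

while True:
    # Take a full-frame picture of the counter
    camera.capture('/home/pi/full_frame.jpg')

    # Keep only the zone we care about before sending anything to the hub
    with Image.open('/home/pi/full_frame.jpg') as img:
        img.crop(CROP_BOX).save('/home/pi/cropped_frame.jpg')

    time.sleep(CAPTURE_INTERVAL)
```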
2) Disconnecting the RPi Zero W from the internet and connecting it to the Hub's Wi-Fi network
For this step, I removed the SD card from the RPi Zero W and rewrote the wpa_supplicant configuration. It took about three tries to get the configuration right.
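For reference, the change boils down to editing the wpa_supplicant.conf file on the card so it points at the hub's network instead of the home router. The snippet below is a generic example with a placeholder SSID and passphrase, not our actual network details:

```
# Generic wpa_supplicant.conf -- SSID and passphrase are placeholders
country=US
ctrl_interface=DIR=/var/run/wpa_supplicant GROUP=netdev
update_config=1

# Network block pointing the Pi at the Hub's Wi-Fi
network={
    ssid="HubNetwork"
    psk="hub-passphrase"
    key_mgmt=WPA-PSK
}
```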
3) Sending the image to the Hub
This part took a while. Initially, I was unable to connect to the server my teammate had written on the hub. It took us a while to figure out, but the problem was that our hub's IP address was 10.0.0.0, and the RPi refused to connect because the address ended in '0.0.0'. Once a connection was established, I first tested sending string messages to an echo server my teammate wrote and was able to connect to it. After that, I started writing my own server and client to send and receive images and to send ACKs back to the camera. I can currently send the cropped image, but sending the full image corrupts it, and I am looking into that.
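One common cause of this kind of corruption is treating a single recv() call as if it returned the whole image. The sketch below is not our actual code (the hub address and port are placeholders), but it shows one way to do a length-prefixed transfer that loops until every byte has arrived and then returns an ACK:

```python
import socket
import struct

HUB_IP = "10.0.0.1"   # placeholder -- substitute the hub's real address
PORT = 5005           # placeholder port

def send_image(path):
    """Camera side: send a 4-byte length prefix, then the raw image bytes."""
    with open(path, "rb") as f:
        data = f.read()
    with socket.create_connection((HUB_IP, PORT)) as sock:
        sock.sendall(struct.pack("!I", len(data)))
        sock.sendall(data)
        return sock.recv(3) == b"ACK"   # wait for the hub's acknowledgment

def recv_exact(conn, n):
    """recv() may return fewer bytes than requested, so loop until we have n."""
    buf = b""
    while len(buf) < n:
        chunk = conn.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("socket closed before the full image arrived")
        buf += chunk
    return buf

def recv_image(conn, out_path):
    """Hub side: read the length, then keep reading until the image is complete."""
    (length,) = struct.unpack("!I", recv_exact(conn, 4))
    with open(out_path, "wb") as f:
        f.write(recv_exact(conn, length))
    conn.sendall(b"ACK")
```

The length prefix tells the receiver exactly how many bytes to expect, which also makes a truncated transfer easy to detect.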
4) Functions for image manipulation
Since image manipulation is going to be integrated into the Hub, I have been refactoring the code I already had into functions, so that Jeffrey can simply call them in the right places and doesn't have to do much.
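As a sketch of what that interface could look like (the function names and the exact set of operations here are only illustrative, not the final API), each operation becomes a small, self-contained function the hub code can call:

```python
from PIL import Image

def crop_to_zone(image_path, zone_box, out_path):
    """Crop a saved frame down to one clutter zone.

    zone_box is a (left, upper, right, lower) tuple in pixels.
    """
    with Image.open(image_path) as img:
        img.crop(zone_box).save(out_path)
    return out_path

def to_grayscale(image_path, out_path):
    """Convert a frame to grayscale before further processing on the hub."""
    with Image.open(image_path) as img:
        img.convert("L").save(out_path)
    return out_path
```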
Besides that, I worked on the ethics assignment. The readings were very interesting, and I had a fun time going through them.
This coming week, David and I will be looking into getting the exact position of the object on the counter and associating it with the user.