Emma Status Report (3/15)

This week, I spent my time developing the computer interface with Tkinter and the back-end connection between the GUI and the Raspberry Pi. I am using a socket server to connect the Raspberry Pi and the computer, which allows the computer to send commands to the GPIO pins that control the motor and to receive frames from the camera for the live video feed. I decided that the motors would be controlled by buttons in the Tkinter interface. When the user wants to go forward, they press the "Turn On" button, which turns on the motors. The button is tied to a Python function that sends a signal to the Raspberry Pi to turn on the GPIO pin connected to that motor. There is a "Stop" button that turns off the motor in a similar fashion. Right now, I have wired up only one motor to test the connection between the Pi and the computer. I implemented motor control before figuring out frame transmission.
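To give a sense of the approach, here is a minimal sketch of how a Tkinter button can send a GPIO command to the Pi over a socket. The hostname, port, pin number, and the text command format are illustrative assumptions, not our actual values:

```python
import socket

PI_HOST = "raspberrypi.local"  # assumed hostname of the Pi
PI_PORT = 5000                 # assumed command port
MOTOR_PIN = 17                 # assumed GPIO pin driving the test motor

def build_command(pin: int, state: str) -> bytes:
    """Encode a GPIO command as a simple newline-terminated text message."""
    return f"GPIO {pin} {state}\n".encode("utf-8")

def send_gpio_command(pin: int, state: str) -> None:
    """Open a short-lived connection to the Pi and send one command."""
    with socket.create_connection((PI_HOST, PI_PORT), timeout=2) as sock:
        sock.sendall(build_command(pin, state))

def make_gui():
    """Build a window with "Turn On"/"Stop" buttons wired to the motor pin."""
    import tkinter as tk  # imported here so the command helpers work headless
    root = tk.Tk()
    root.title("ROV Control")
    tk.Button(root, text="Turn On",
              command=lambda: send_gpio_command(MOTOR_PIN, "HIGH")).pack()
    tk.Button(root, text="Stop",
              command=lambda: send_gpio_command(MOTOR_PIN, "LOW")).pack()
    return root
```

On the laptop side you would call `make_gui().mainloop()`; a matching server on the Pi parses each `GPIO <pin> <state>` line and flips the pin with its GPIO library.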

For the live video stream, I have been working through several bugs. At first, the image frames were not reaching the computer. I realized that the client was not consistently sending frame requests to the Pi, so I created a "Start Live" button on the GUI for testing that sends frame requests to the Pi. Then I ran into an issue where the client would receive one image and the program would freeze. Debugging this, I realized that the Pi server was only sending a single frame, so I put the frame-sending functionality into a loop. Now there is live streaming from the camera to the computer. The frame rate is significantly better than when we controlled the camera using our earlier methods, which is a big improvement. We were concerned before that the frame rate would be too slow, but we are now confident it will meet our design requirements, though we have not formally tested it yet.
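The fix boils down to looping on the server side. A sketch of what the Pi-side loop might look like, assuming the camera hands us JPEG bytes via some `capture_jpeg` callable and that each frame is sent with a 4-byte length prefix (both assumptions for illustration):

```python
import socket
import struct

def pack_frame(jpeg_bytes: bytes) -> bytes:
    """Prefix a frame with its big-endian length so the client knows where it ends."""
    return struct.pack("!I", len(jpeg_bytes)) + jpeg_bytes

def serve_frames(conn: socket.socket, capture_jpeg) -> None:
    """Send frames in a loop -- the earlier bug was sending only one frame and returning."""
    while True:
        request = conn.recv(16)   # wait for the client's next frame request
        if not request:
            break                 # client disconnected
        conn.sendall(pack_frame(capture_jpeg()))
```

The length prefix is one common way to frame variable-size messages over TCP; without some framing, the client cannot tell where one JPEG ends and the next begins.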

There are still bugs in the interface that need to be fixed. Right now the Pi sends frames well, but it does not receive and process requests to change GPIO pin states. I am still investigating the cause and working on a solution; I may need to separate motor control and the live stream onto two port connections, or slow the frame transmission so other requests can be recognized. Also, I think the frames being sent are not arriving with all of their pixel data intact, because the colors come through wrong: reds look purple, and the image is fairly dark. I will have to edit the client code that receives the frames to make sure it reads the whole image without stopping short.
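One likely cause of the truncated-image symptom is that a single `recv()` call returns only whatever bytes have arrived so far, not the whole frame. A sketch of a client-side helper that loops until every byte is read, assuming the same hypothetical 4-byte length-prefix framing as above:

```python
import socket
import struct

def recv_exact(sock: socket.socket, n: int) -> bytes:
    """Read exactly n bytes, looping because recv() may return a short read."""
    chunks = []
    remaining = n
    while remaining > 0:
        chunk = sock.recv(remaining)
        if not chunk:
            raise ConnectionError("socket closed mid-frame")
        chunks.append(chunk)
        remaining -= len(chunk)
    return b"".join(chunks)

def recv_frame(sock: socket.socket) -> bytes:
    """Read one length-prefixed frame: 4-byte big-endian size, then the payload."""
    (size,) = struct.unpack("!I", recv_exact(sock, 4))
    return recv_exact(sock, size)
```

If the client currently decodes whatever a single `recv()` returned, a short read would hand the image decoder an incomplete buffer, which could plausibly explain the wrong colors and dark output.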

Next week, I will continue debugging the interface and work with Maddie to integrate the Z-direction movement. As a team, we are also planning to put together the above-water boat. I am still on schedule and do not anticipate any risks moving forward; the code will just take some more time to finalize and to implement the functionality we want.

Link to Video of Interface:

https://docs.google.com/videos/d/1ms-IpJHWIhEZ1yQMVdBUI3owfcwymd0b25iK6jW3Zs0/edit?usp=sharing


