WORK ACCOMPLISHED:
This week, I worked with Jason on the VESC, motors, and Raspberry Pi. We spent quite a bit of time debugging the connections between the Pi and the VESC. Eventually we abandoned the GPIO pins in favor of USB connections. We will write the motor scripting using a combination of the PyVESC and serial Python modules. Similarly, we plan to forgo the Qwiic connectors and instead use a USB cable with the UBlox_GPS and serial modules to control the GPS.
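As a rough sketch of the motor-side structure we have in mind: a small wrapper class per VESC, with the byte encoder injected so it can be exercised without hardware. The class and method names here are placeholders, not a final API; in practice the encoder would be PyVESC's `pyvesc.encode` wrapping a `SetRPM` message, and the port a `serial.Serial` instance on the USB device.

```python
class Motor:
    """Wraps one VESC reachable over a USB serial port (sketch only)."""

    def __init__(self, port, encode):
        self.port = port      # any object with .write(bytes), e.g. serial.Serial
        self.encode = encode  # e.g. lambda rpm: pyvesc.encode(SetRPM(rpm))
        self.last_rpm = 0

    def set_rpm(self, rpm):
        # Encode the command and push the raw bytes out the serial port.
        self.port.write(self.encode(rpm))
        self.last_rpm = rpm

    def stop(self):
        self.set_rpm(0)
```

Injecting the port and encoder lets us unit-test the command flow on a laptop with a fake port before touching the real VESCs.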
We’ve achieved individual control over each motor, but we’re still working on executing instructions on both motors simultaneously. I plan to use Python’s multithreading features to accomplish this.
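The threading approach could look something like the following sketch, where `send_rpm` is a stand-in for the real per-motor serial write (the real version would hold one serial port per motor):

```python
import threading

results = []  # stand-in for "commands actually sent to hardware"

def send_rpm(motor_id, rpm):
    # Placeholder for the real PyVESC encode + serial write for one motor.
    results.append((motor_id, rpm))

def drive_both(left_rpm, right_rpm):
    """Issue RPM commands to both motors at (roughly) the same time."""
    threads = [
        threading.Thread(target=send_rpm, args=("left", left_rpm)),
        threading.Thread(target=send_rpm, args=("right", right_rpm)),
    ]
    for t in threads:
        t.start()
    for t in threads:
        t.join()  # both commands have been issued once we return
```

Since each motor writes to its own USB serial device, the two threads shouldn't contend for the same port, which is what makes this simple pattern workable.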
The LiDAR is not yet fully operational on the Pi. Because the Pi runs a Linux/ARM environment, we have to build the SDK from source rather than installing a prebuilt package.
PROGRESS:
I am on track with my tasks for the week, which focused on defining a Python class for the motors containing all the methods we will use to control them. We initially planned to have the return functionality completed by this week, but that was an ambitious deadline set to leave extra cushion time toward the end. We can slightly relax our expectations, especially because the motor commands are taking much less time than anticipated.
In experimenting with the LiDAR, I’ve found that its depth camera performs very poorly outdoors. Its RGB camera is fine, however, and is already capable of identifying faces and outlining them with a red bounding box. This prompted me to propose relaxing our vision requirements to the recognition of a single object (e.g. a traffic cone) as a proof of concept. This is fairly easy to accomplish using OpenCV and a public dataset for recognition.
NEXT WEEK’S DELIVERABLES:
- Rewrite GPS data collection on the Raspberry Pi to use USB serial ports.
- Complete the migration of LiDAR code to the Raspberry Pi environment to ensure seamless operation
- Begin testing the return algorithms once the board is fully assembled, using LiDAR data to trigger braking.
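For the GPS rewrite, one likely shape is reading NMEA sentences off the USB serial port and parsing out fixes; the real code may lean on SparkFun's UBlox_GPS parser instead, so this is only a sketch. The field layout below follows the standard $GPGGA sentence.

```python
def parse_gga(sentence):
    """Return (lat, lon) in decimal degrees from a $GPGGA sentence, or None.

    A reader loop would feed this from the port, roughly:
        ser = serial.Serial("/dev/ttyACM0", 9600, timeout=1)  # port is a placeholder
        fix = parse_gga(ser.readline().decode("ascii", errors="ignore"))
    """
    fields = sentence.split(",")
    if not fields[0].endswith("GGA") or not fields[2] or not fields[4]:
        return None  # wrong sentence type or no fix yet

    def to_degrees(value, hemisphere, degree_digits):
        # NMEA packs coordinates as [d]ddmm.mmmm; split degrees from minutes.
        degrees = float(value[:degree_digits])
        minutes = float(value[degree_digits:])
        decimal = degrees + minutes / 60.0
        return -decimal if hemisphere in ("S", "W") else decimal

    lat = to_degrees(fields[2], fields[3], 2)   # ddmm.mmmm
    lon = to_degrees(fields[4], fields[5], 3)   # dddmm.mmmm
    return lat, lon
```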