Omkar’s Status Report for 12/10

This week, we focused on making our pickup and dropoff as robust as possible. Saral and I modified the electromagnet PCB to try to discharge the inductor faster by adding a resistor in parallel with it. That did not help much on its own, but adding electrical tape between the pallets and the electromagnet worked well: it disrupts the magnetic field just enough that the pallet does not stay magnetized to the electromagnet, but not so much that the electromagnet cannot pick the pallet up. I also modified the code that sends the enable/disable electromagnet command to the robots, so that they activate the electromagnet earlier and the system is more fault tolerant to dropped packets on the network. We also added a capacitor on the 5V rail to prevent the MCU from restarting when the electromagnet is enabled, which lets us extend our battery life. I helped Prithu debug and implement the new task planner, which uses the distance from pallets to robots as a heuristic. Finally, we filmed videos for our demo on Monday in case we have issues in Techspark.
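The distance heuristic in the new task planner amounts to a greedy nearest-pallet assignment. A minimal sketch of the idea (the function and data layout here are illustrative, not our actual planner code):

```python
import math

def assign_tasks(robot_poses, pallet_positions):
    """Greedily assign each robot the nearest unclaimed pallet.

    robot_poses: dict of robot_id -> (x, y)
    pallet_positions: dict of pallet_id -> (x, y)
    Returns dict of robot_id -> pallet_id; robots may go unassigned
    if there are fewer pallets than robots.
    """
    assignments = {}
    unclaimed = dict(pallet_positions)
    for robot_id, (rx, ry) in robot_poses.items():
        if not unclaimed:
            break
        # Pick the pallet with the smallest Euclidean distance to this robot.
        nearest = min(unclaimed,
                      key=lambda p: math.hypot(unclaimed[p][0] - rx,
                                               unclaimed[p][1] - ry))
        assignments[robot_id] = nearest
        del unclaimed[nearest]
    return assignments
```

A greedy pass like this is not globally optimal, but it is cheap enough to re-run every planning cycle.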

Omkar’s Status Report for 12/3

This past week, I worked on optimizing the visualizer by computing the arrow vectors across multiple processes. However, we ran into many issues with sharing variables across parallel processes in Python. We ended up using Python's multiprocessing shared-memory manager for the different processes, but we found that the bottleneck was actually rendering the image after computing the poses for the robots and the paths. We solved this by downsampling the visualization image and scaling all our arrows by the same ratio to match. I also made minor bug fixes in our software stack, like modifying our infinite control loop to exit once all the pallets have been dropped off. I also implemented a timeout in the robot firmware: if the robot has not received a servo command for over a second, it detaches the servos so that the robot stops.
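Sharing pose data between processes with Python's shared-memory support looks roughly like this (a simplified sketch, not our exact visualizer code; the slot layout is illustrative):

```python
import struct
from multiprocessing import shared_memory

POSE_FORMAT = "3d"  # x, y, theta as three doubles
POSE_SIZE = struct.calcsize(POSE_FORMAT)

def write_pose(shm, index, x, y, theta):
    # Pack one robot's pose into its fixed slot of the shared buffer.
    struct.pack_into(POSE_FORMAT, shm.buf, index * POSE_SIZE, x, y, theta)

def read_pose(shm, index):
    return struct.unpack_from(POSE_FORMAT, shm.buf, index * POSE_SIZE)

# The parent creates the block; child processes attach to it by name.
shm = shared_memory.SharedMemory(create=True, size=3 * POSE_SIZE)
write_pose(shm, 0, 1.0, 2.0, 0.5)
x, y, theta = read_pose(shm, 0)
shm.close()
shm.unlink()
```

Because the buffer holds raw bytes rather than pickled Python objects, readers avoid the serialization overhead that made naive variable sharing slow.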

The rescope from our original proposal is simply limiting the number of robots we have on the field to three.

Omkar’s Status Report for 11/19

This week, we got two robots working with pick-up and drop-off. I re-tuned the PID values on our robots so that they follow the path much more closely than before (we previously had issues with the robot being unable to correct its y-error). I fixed a bug in the visualizer where old target positions were persistently displayed. I am still working on a way for our robots to move backward after they have dropped off the pallets, since I programmed the controller to only drive the robots forward along the path. On the backup path, the robots therefore try to turn 180 degrees and then move forward, instead of backing up and then turning 180 degrees, which sometimes causes them to hit the pallet they just dropped off. Our progress seems to be on track, and we are working to robustify pickup and dropoff so that the pallets are consistently picked up and moved and there are no collisions between robots. We are very close to our MVP, and hopefully we will soon be able to start running verification and validation tests.
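One way to get backward motion without rewriting the whole controller is to command a negative linear velocity and measure the heading error from the robot's rear instead of its front. A rough sketch of that idea (a hypothetical helper, not our current controller):

```python
import math

def backup_command(pose, target, k_lin=0.5, k_ang=1.0):
    """Drive backward toward a target behind the robot.

    pose and target are (x, y, theta). Returns (v, w): a negative
    linear velocity plus a heading correction, so the robot backs
    up instead of turning 180 degrees first.
    """
    x, y, theta = pose
    tx, ty, _ = target
    dist = math.hypot(tx - x, ty - y)
    # Heading the robot should face so that its *rear* points at the target.
    rear_heading = math.atan2(y - ty, x - tx)
    # Wrap the error into (-pi, pi].
    heading_err = math.atan2(math.sin(rear_heading - theta),
                             math.cos(rear_heading - theta))
    return -k_lin * dist, k_ang * heading_err
```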

Omkar’s Status Report for 11/12

We did the interim demo this week and learned that we will probably need our own router to combat other people's traffic on the network. We also need to improve the PID gains to correct for the y-error. I worked on improving the reliability of the pickup and on fixing places in the code where we had hardcoded things for a single robot. These seem to be simple fixes, like dynamically changing the robot URL.
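The URL fix is essentially replacing one hardcoded address with a per-robot lookup; something like this (the addresses here are made up for illustration):

```python
# Map robot IDs to their network addresses instead of hardcoding one URL.
ROBOT_URLS = {  # hypothetical addresses on our own router
    1: "http://192.168.0.101",
    2: "http://192.168.0.102",
    3: "http://192.168.0.103",
}

def robot_url(robot_id):
    """Look up the base URL for a given robot."""
    return ROBOT_URLS[robot_id]
```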

Omkar’s Status Report for 11/5

This week, we met on Sunday to work on getting the robot to follow a path from the path planner. We were able to tune PID values such that the robot could follow the path with reasonable accuracy (though we may have to do more tuning later). We noticed that the communication latency was causing our frame rate to drop from the ~25fps of the CV to ~5fps. I prototyped socket communication for lower latency, but ran into out-of-memory issues on the ESP8266 due to malloc overwriting the heap. I switched to writing a multi-threaded application, where the main program does the CV, path planning, and controls generation, and separate child threads take the control commands and asynchronously send them to the robots. This brought our frame rate up to ~14fps. We also noticed that the cubic interpolation caused large changes in theta error at the waypoints due to an aggressive orientation change. Saral and I debugged this and switched to a cubic Hermite spline interpolation with a downsampled path on the straights to arrive at a more gradual interpolated trajectory.
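The multi-threaded sender is essentially a per-robot worker thread draining a queue, so the main CV/planning/controls loop never blocks on the network. A simplified sketch (the real version does an HTTP request where `send_fn` is a stub here):

```python
import queue
import threading

def make_sender(robot_url, send_fn):
    """Spawn a worker thread that asynchronously sends commands.

    send_fn(robot_url, command) does the actual network I/O (an HTTP
    request in our case). Returns the command queue and the thread;
    putting None on the queue shuts the worker down.
    """
    q = queue.Queue()

    def worker():
        while True:
            cmd = q.get()       # block until the main loop produces a command
            if cmd is None:     # sentinel: shut down cleanly
                return
            send_fn(robot_url, cmd)

    t = threading.Thread(target=worker, daemon=True)
    t.start()
    return q, t
```

The main loop just does `q.put(command)` and moves on; the slow network round-trip happens entirely on the worker thread.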

Omkar’s Status Report for 10/29

This week, the entire team met and worked on the integration between computer vision, path planning, and controls. We were able to command robots based on the trajectory, but we ran into issues with the different coordinate systems that the computer vision and controls used. We fixed those problems, but our controls code worked in simulation and not with the real robots. The issue seems to be how the PID values are tuned: finding a balance between driving too slowly and too fast. We seem to be on track per our Gantt chart. By next week, we want to have a robot follow a path as closely as possible and to implement pallet pickup and dropoff.
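The coordinate-system mismatch is the classic one between an image frame (origin at the top-left, y pointing down, units in pixels) and a world frame (y pointing up, units in meters). A conversion of the flavor we needed, with made-up calibration constants:

```python
PIXELS_PER_METER = 200.0   # hypothetical camera calibration
IMAGE_HEIGHT_PX = 1080     # hypothetical frame height

def image_to_world(u, v, theta_img):
    """Convert an image-frame pose to the world frame.

    Image y grows downward, so we flip it; the angle's sign flips
    for the same reason.
    """
    x = u / PIXELS_PER_METER
    y = (IMAGE_HEIGHT_PX - v) / PIXELS_PER_METER
    theta = -theta_img
    return x, y, theta
```

Keeping the conversion in one function at the CV/controls boundary is what let us fix the mismatch in one place.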

Omkar’s Status Report for 10/22

Since the last status update, I have worked primarily on the design report. I also refactored the controls software on the main computer to use a better interpolation between the waypoints (matching the design report). The interpolation computes the feedforward term for our controller as well as the target pose for the feedback term. Previously, I implemented the interpolation as a cubic spline between the waypoints, enforcing the orientation at each waypoint, but this did not account very well for the robot staying in place for a period of time. This led to refactoring the controls code to compute the trajectory between waypoints in real time rather than precomputing it when the controller is initialized.
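Computing the target on the fly means the controller just evaluates the segment containing the current time, rather than storing a dense precomputed trajectory. A bare-bones linear version of the idea (our actual code evaluates a spline here; names are illustrative):

```python
def target_pose(waypoints, t):
    """Interpolate the target (x, y) at time t from timestamped waypoints.

    waypoints: list of (t_i, x_i, y_i) sorted by time. Clamps to the
    endpoints outside the time range, which naturally handles the robot
    holding position at a waypoint.
    """
    if t <= waypoints[0][0]:
        return waypoints[0][1:]
    for (t0, x0, y0), (t1, x1, y1) in zip(waypoints, waypoints[1:]):
        if t0 <= t <= t1:
            a = (t - t0) / (t1 - t0)
            return (x0 + a * (x1 - x0), y0 + a * (y1 - y0))
    return waypoints[-1][1:]
```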

Omkar’s Status Report for 10/8

This week, I presented our design proposal and explained how the robots interact with the different components in our software stack. We got the robot controls working with the computer vision code to have a robot move in a straight-line trajectory with both a feedforward term and a feedback term (only a proportional controller). This controller took in the desired next pose from the spoofed path planning and the current pose from the computer vision and output the speeds of the two servos. This control scheme was able to reject disturbances in the environment in the form of a person pushing the robot (Video is here). I worked on taking an arbitrary path from the path planning module and computing the feedforward term within our newly defined software interface design. My section seems to be ahead of schedule. By next week, we should be unit testing and integrating the controls, computer vision, and path planning modules to have a single robot follow a given path. I also need to work on implementing the controls for picking up and dropping off pallets.
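The controller combines a feedforward speed from the planned path with a proportional correction on the pose error, then maps the unicycle command (v, w) to the two servo speeds. A simplified sketch (the gains and wheel-geometry constant are placeholders, not our tuned values):

```python
import math

WHEEL_BASE = 0.1  # meters between the wheels; placeholder geometry

def p_controller(pose, target, v_ff, kx=1.0, ktheta=2.0):
    """Feedforward plus proportional feedback for a differential drive.

    pose, target: (x, y, theta). v_ff is the feedforward speed along
    the path. Returns (left, right) wheel speeds.
    """
    x, y, theta = pose
    tx, ty, ttheta = target
    # Project the position error onto the robot's forward axis.
    ex = math.cos(theta) * (tx - x) + math.sin(theta) * (ty - y)
    # Wrap the heading error into (-pi, pi].
    etheta = math.atan2(math.sin(ttheta - theta), math.cos(ttheta - theta))
    v = v_ff + kx * ex
    w = ktheta * etheta
    # Map unicycle (v, w) to left/right wheel speeds.
    left = v - w * WHEEL_BASE / 2
    right = v + w * WHEEL_BASE / 2
    return left, right
```

If someone pushes the robot off the line, the error terms grow and the wheel speeds differ until the pose converges back to the target, which is the disturbance rejection shown in the video.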

Omkar’s Status Report for 10/1

This week, I wrote the firmware for all of the different peripherals of the robots (screen, LEDs, servos, and electromagnet). I also started the communication firmware that allows a computer to send POST requests to the robot. I brought up one robot in its entirety. The other robots are working, but some of their NeoPixel LEDs are not soldered properly, so only a few of the LEDs turn on. I also worked on our design presentation slides, which I will present in the coming week. Our project is on schedule – we are planning on meeting tomorrow to determine the camera mounting and to interface the robot firmware with the computer vision for a better way of controlling the robots, since we found out that the servos have a hard time driving straight – either because the servos are not exactly the same or because the PWM on the ESP8266 is driven by software interrupts. By next week, I aim to have a more robust communication framework in place and to start on the controls software to get the robots to follow a straight line and hopefully a more complicated path.
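On the computer side, commanding a robot is just a small HTTP POST. A sketch of what that looks like (the endpoint path and JSON fields here are illustrative, not the actual firmware API):

```python
import json
import urllib.request

def build_command(left, right, magnet_on):
    """Serialize a drive/electromagnet command as JSON bytes."""
    return json.dumps({"left": left, "right": right,
                       "magnet": bool(magnet_on)}).encode()

def send_command(robot_url, left, right, magnet_on, timeout=0.2):
    """POST one command to the robot's firmware endpoint."""
    req = urllib.request.Request(robot_url + "/command",
                                 data=build_command(left, right, magnet_on),
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req, timeout=timeout) as resp:
        return resp.status
```

The short timeout matters on a robot network: a hung request should fail fast rather than stall the control loop.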