Team Status Report for 3/30/24

While we were all working together on Friday, one issue we noticed is that after switching to 5V drive signals from the Arduino Nano, the motors were not spinning at the same speed. This is a significant risk: because of our belt setup, any disparity between the motor speeds would affect the manner in which our robot turns, making it undependable and unreliable. To mitigate this risk, we have two potential avenues to pursue. The first is tuning the commands given by the microcontroller so that the robot can indeed drive straight, compensating for the speed mismatch in software (a rough sketch of this idea is below). The second is using rear wheel drive only and switching to casters on the front wheels. The belt tension is putting undue force on the motor shaft, causing it to spin slower; converting to rear wheel drive removes the need for a belt in the first place.
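To make the first mitigation concrete, here is a minimal sketch of per-motor trim applied before speed commands are sent to the Nano. Everything here is illustrative: the trim values are placeholders that would come from drive-straight testing, and the helper is not from our codebase.

```python
# Hypothetical per-motor trim to compensate for unequal motor speeds.
# Trim factors would be found empirically by commanding the rover to
# drive straight and adjusting until it tracks true; 1.0 = no correction.
LEFT_TRIM = 1.00   # placeholder value
RIGHT_TRIM = 0.93  # e.g., if the right motor runs fast, scale it down

def trimmed_speeds(speed: int) -> tuple[int, int]:
    """Return (left, right) motor commands for a desired PWM speed (0-255)."""
    left = min(255, round(speed * LEFT_TRIM))
    right = min(255, round(speed * RIGHT_TRIM))
    return left, right
```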

One change made to the existing design of the system is the switch from a Raspberry Pi Pico to an Arduino Nano. This is necessary because it allows us to drive 5V logic as opposed to 3.3V logic. The change does not incur any additional cost because the Arduino Nano was provided free of charge.

As for an updated schedule, we are targeting this week to be able to drive the rover and control the servos for the arm, even if only with a basic program to test functionality.

This video link showcases our rover as currently assembled (sans camera), with the motors successfully wired up and spinning!

https://drive.google.com/file/d/1zICyOJkQBSxv6ApgS1hE1o7wqdp9SjWX/view?usp=sharing

Nathan’s Status Report for 3/30/24

This week saw significant progress towards the main control loop of the pickup mechanism. After making the plan last week for how we align the arm to pick up items, I implemented it this week. Specifically, our system works like this: when I receive the array of detections produced by the camera's onboard inference, I check whether the z distance is within our desired pickup range, i.e., whether we are in range and in scope of an item. Then, I do a series of x-coordinate checks to ensure that the nearby object is within a threshold that is not yet decided. If the object is to the left of our frame, meaning our camera is leaning right, we print "Turn Right", and vice versa for the right-leaning case. This print statement can later be adapted to send a signal to the arm controller code. That connection hasn't been set up yet, but the underlying infrastructure is there, which will hopefully make it easier. Additionally, the threshold I mentioned will be calibrated once we do rover tests with the camera attached in its intended location. A sketch of this check is below.
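Here is a minimal sketch of that check, assuming depthai v2-style spatial detections; the range and threshold constants are placeholders until calibration, and the turn-direction convention follows the description above.

```python
PICKUP_Z_MM = 300    # assumed pickup range in mm; to be calibrated on the rover
X_THRESHOLD_MM = 40  # assumed centering threshold in mm; also to be calibrated

def check_alignment(detections):
    """Check each spatial detection against the pickup range and x threshold."""
    for det in detections:
        x = det.spatialCoordinates.x  # mm, left/right offset from camera center
        z = det.spatialCoordinates.z  # mm, forward distance to the object
        if z > PICKUP_Z_MM:
            continue  # not yet within pickup range
        if x < -X_THRESHOLD_MM:    # object left of center; camera leans right
            print("Turn Right")
        elif x > X_THRESHOLD_MM:   # object right of center; camera leans left
            print("Turn Left")
        else:
            print("Aligned")       # within threshold; hand off to the arm
```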

A video showcasing this functionality is linked here: https://drive.google.com/file/d/1XJyA2q35H8Kpg9TzOHVndv2On-Wu5Cji/view?usp=sharing

The photos that show this functionality can be seen here:

Additionally, I ran this code on the Raspberry Pi, and the program did not seem to suffer a performance hit after transitioning to the RPi. Thus, I am increasingly confident that our system will work successfully on the Raspberry Pi.
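To make that comparison quantitative later, here is a small sketch for measuring loop rate; `get_frame` is a hypothetical stand-in for one iteration of the detection loop, not a function from our codebase.

```python
import time

def measure_fps(get_frame, n: int = 100) -> float:
    """Time n iterations of the detection loop and return frames per second."""
    start = time.perf_counter()
    for _ in range(n):
        get_frame()
    return n / (time.perf_counter() - start)
```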

After this week's implementation, my progress is back on schedule. To further catch up to the project schedule in the coming week, I hope to integrate with the rover to determine the proper threshold for coordinating with the arm. In addition, I hope to write the communication protocol with the arm controller code and potentially receive inputs in an arm-controller program.

Varun’s Status Report for 3/30/24

This week was a lot of physical hardware work.

Firstly, the encoder connections on the motors, which are required to drive the motors properly, are raw 28-gauge wire. I needed to fix that in order to interface them with the microcontroller that controls the motors. As such, I soldered on some spare DuPont connector wire:

Once that was settled, the next task was to get the Raspberry Pi Pico to connect and drive the motors. Right? The Raspberry Pi Pico, right? Wrong. Turns out, one cannot use 3.3V logic to drive the H-bridge driver that runs the motors. Pi Picos, being modern microcontrollers, output 3.3V logic, which indeed is not compatible with driving the motors. So, we switched to an Arduino Nano. Unfortunately, the ones I could get my hands on quickly were the cheap off-brand ones that don't come with a pre-flashed bootloader and use a chip that does not interface out of the box with the Arduino software on Windows 11. After connecting an Arduino Uno to the Nano and flashing said bootloader onto the Nano, I was STILL unable to connect. The final culprit? Windows 11. I had to click the 'Upload' button in Arduino, then open up the Serial Monitor and make it a foreground task (for some reason??), and then, ONLY THEN, did the code upload. I found that neat little trick on a random Reddit thread. So, the motors moved, finally. It was around this time that a fellow RoboClub member saw my usage of the 2S LiPo and cautioned heavily against using it without a fuse as a battery-protection circuit. Luckily, we have those lying around in RoboClub, so I grabbed one and began work on the fuse box. Finally, here is what it looked like.

That pair of wires took an embarrassingly long time to solder. Lastly, I designed and printed the box that goes around these wires, so that something like a stray Allen key couldn't be accidentally dropped on the fuse and blow it unnecessarily.

With that all complete, I finally made the Rover wheels turn, safely this time. Link here:  https://drive.google.com/file/d/1zICyOJkQBSxv6ApgS1hE1o7wqdp9SjWX/view?usp=sharing

The code for the correct driving will be developed tomorrow, once we figure out an appropriate serial protocol to communicate actions across the entire control-to-Rover pipeline.
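As a sketch of what that protocol might look like, purely illustrative and not our finalized design: single-byte commands sent over serial with pyserial. The command set and port name are assumptions.

```python
import serial  # pyserial

# Hypothetical one-byte command set; the real protocol is still TBD.
COMMANDS = {
    "forward": b"F",
    "backward": b"B",
    "left": b"L",
    "right": b"R",
    "stop": b"S",
}

# The port name is an assumption; on the Pi the Nano might enumerate
# as /dev/ttyUSB0 or /dev/ttyACM0.
with serial.Serial("/dev/ttyUSB0", 115200, timeout=1) as nano:
    nano.write(COMMANDS["forward"])  # drive forward
    ack = nano.read(1)               # optional acknowledgment byte from the Nano
```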

With that done, it was time to figure out how the Rover's Raspberry Pi could receive commands with which to drive the Nano. Hayden had already figured out a way to use his existing Nano to send signals over serial to his laptop. We re-flashed the Raspberry Pi 4B with a lighter OS instead of Ubuntu, so it wouldn't lag all the time. Once we did that, we were able to set up communication across the CMU-DEVICE network. We realized that, as it stood, the communication was much too laggy to meet our defined latency target, so we wanted to determine whether it was a hardware or a comm-protocol issue. To test, I set up my laptop as the server-socket side and registered it on CMU-DEVICE, after which we tested communication from Hayden's laptop to mine. The latency instantly disappeared, so I think we'll need a better Raspberry Pi 4B. (We ordered one with 2GB RAM but got 1GB, so we'll need to figure that out.)
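For reference, the laptop-as-server test amounted to something like the following echo-and-time sketch; the port number and message are assumptions.

```python
import socket
import time

PORT = 5000  # assumed port

def run_server():
    """Server side (my laptop): echo whatever arrives, for round-trip timing."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.bind(("0.0.0.0", PORT))
        srv.listen(1)
        conn, _ = srv.accept()
        with conn:
            while data := conn.recv(64):
                conn.sendall(data)  # echo back

def time_round_trip(server_ip: str) -> float:
    """Client side (Hayden's laptop): return one round-trip time in ms."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
        cli.connect((server_ip, PORT))
        start = time.perf_counter()
        cli.sendall(b"ping")
        cli.recv(64)
        return (time.perf_counter() - start) * 1000
```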

All in all, a very productive week. We are much less behind, and at the rate we are finally able to go, we are targeting Wednesday to drive the Robot AND control the Servos. (We're targeting driving for Monday, which is definitely possible.) By next week, I want a clean implementation of all of the electronics for the Rover's drive and pickup capability. The next target after that is to finish encoder integration, after which we can finally finish the camera integration. I still want at least two weeks for full-on testing and tuning, and I think on this timeline we are still set to do that. Lots of work to be put in this week.

Hayden’s Status Report for 3/30/24

This week I accomplished a lot; I was able to interpret the serial USB data on my laptop and convert it into inputs for Python code. This is going to make the project a lot easier from here on out, now that we have access to the useful data we want to transmit. Unfortunately, the Raspberry Pi we requested is too slow because it only has 1 GB of RAM, so we had to demonstrate the communication laptop to laptop. On Friday, Varun and I were able to get the motors moving, and we worked on the electronics for the drivetrain; we also 3D printed a few more parts that are necessary for our rover. I am confident that we will be able to display full driving capabilities for the interim demo on Monday, which was our goal. Overall, I am in a significantly better place than I was at this time last week; I do not think I need to make any further changes to my required tasks and timeline.
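Conceptually, the serial-to-input conversion looks like this sketch; the port name and line format are assumptions for illustration, not our PCB firmware's actual output.

```python
import serial  # pyserial

# Assumed port and baud rate; the controller's Nano enumerates as a USB serial device.
with serial.Serial("COM3", 9600, timeout=1) as ctrl:
    while True:
        line = ctrl.readline().decode("ascii", errors="ignore").strip()
        if not line:
            continue  # read timed out with no data
        # Hypothetical format: comma-separated button states, e.g. "1,0,0,1"
        buttons = [bit == "1" for bit in line.split(",")]
        print("pressed:", [i for i, b in enumerate(buttons) if b])
```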

In the next week I will complete the full driving communication with the proper Raspberry Pis and have a prototype on the user side for entirely controlling the system. After I complete this, I will be able to look into monitors and hopefully assess the Raspberry Pi to make sure it has enough processing power to relay video information while keeping the latency low.

Below is a video of two laptops communicating using the test PCB.
VIDEO LINK

Here are some screenshots of the current state of the basic code.
Changes to the Arduino Code

Transmitter Side/User Side Code

Receiver/Server Side Code

Nathan’s Status Report for 3/23/24

This week, Varun and I finalized the control loop and main function for how the rover should detect and position itself for the kinematics to be successful. To reiterate, it involves a series of four steps:

1) The user drives the rover in front of the object so that it is in frame

2) The system detects whether an object is within the distance threshold

3) The rover tracks the object until its x distance to the camera matches the offset between the camera and the arm

4) The arm picks up the object

For step 2, this would involve either sweeping the frame to find a block that meets our minimum distance and adjusting the x-value from there, or using an object detection algorithm combined with depth measurements to automatically detect the object on screen. In the latter case, classification is not necessary, since there will only be one object in our immediate radius as defined by our scope. A rough sketch of the sweep option is below.
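Here is a minimal sketch of the frame-sweep option, assuming the depth frame arrives as a NumPy array of millimeter values; the distance cutoff is a placeholder.

```python
import numpy as np

MAX_PICKUP_MM = 500  # placeholder minimum-distance cutoff

def find_closest_object(depth_frame: np.ndarray):
    """Sweep a depth frame for the nearest valid pixel under the cutoff.

    Returns (row, col, depth_mm) of the closest point, or None if nothing
    is within pickup distance. Zero pixels are invalid stereo matches.
    """
    valid = np.where(depth_frame > 0, depth_frame, np.iinfo(np.uint16).max)
    row, col = np.unravel_index(np.argmin(valid), valid.shape)
    depth_mm = int(valid[row, col])
    if depth_mm > MAX_PICKUP_MM:
        return None
    return row, col, depth_mm
```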

The progress I have made on the code can be found at github.com/njzhu/HomeRover

My progress is aligned with the completion of the rover's fabrication and the development of the kinematics scheme. To keep to the project schedule, I will coordinate closely with my teammates, developing clear, explicit interfaces for passing information to ensure integration goes smoothly.

In the next week, I hope to get a preliminary version of the control loop working, outputting the necessary information that the other modules require.

Varun’s Status Report for 3/23/24

This week was quite busy!

First things first, I finalized the kinematics scheme for the Rover, with the help of a RoboClub member who knows kinematics better than I do. We figured out that when the user gets to the object, the Rover should rotate until the line that the arm sits on is in line with the object to be picked up. The next step is to use the Y and Z distances from the Rover to the object to manipulate the robot arm to pick it up.
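To illustrate the idea only: assuming, for the sake of a sketch, a two-link planar arm with made-up link lengths (not our actual arm geometry), the Y and Z distances determine the joint angles via standard two-link inverse kinematics.

```python
import math

L1, L2 = 0.20, 0.15  # assumed link lengths in meters, not our real dimensions

def two_link_ik(y: float, z: float):
    """Planar two-link IK: joint angles (radians) to reach (y, z) in the arm's plane."""
    d2 = y * y + z * z
    cos_elbow = (d2 - L1 * L1 - L2 * L2) / (2 * L1 * L2)
    if abs(cos_elbow) > 1:
        return None  # target out of reach
    elbow = math.acos(cos_elbow)  # elbow-down solution
    shoulder = math.atan2(z, y) - math.atan2(
        L2 * math.sin(elbow), L1 + L2 * math.cos(elbow)
    )
    return shoulder, elbow
```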

And finally, we got the wheels and the arm on the Rover. This is what I wanted done by the end of this week, which is great. I plan to add more electronics holders to the Rover, but I really wanted to make sure we could get the arm mounted to test it. With the motors mounted as well, we can get the circuits done and test soon.

Progress is looking better. As you know, we are essentially pulling software tasks from the future and doing them now so we can make up lost time. (Rover fabrication is definitely taking a while!) With this, I can finally start testing kinematics properly, and we can plan to get the tele-op portion of the Rover working by the midterm demo day. By next week, I hope to have all of the electronics plus tele-op working on the Rover, and that is where Hayden's and my areas will cross at last.

Hayden’s Status Report for 3/23/24

This week I received the PCB and debugged it; everything seems like it will work in our context. We do not like the buttons we chose, so our final design will use a different button footprint. I also figured out that we need to use an Arduino Nano ESP32 rather than an Arduino Nano Every; this is because of the USB protocol and because I want to use it with the Python code I will be writing on the user side. I also decided to write the user-side code in Python because it will be relatively basic and Python has libraries for capturing keyboard input. A sketch of that input capture is below.
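Here is a sketch of what the user-side input capture could look like, assuming the pynput library (one option among several); the key bindings are placeholders.

```python
from pynput import keyboard  # assumed library choice

# Placeholder key-to-command mapping for driving the rover.
KEYMAP = {"w": "forward", "s": "backward", "a": "left", "d": "right"}

def on_press(key):
    try:
        command = KEYMAP.get(key.char)
    except AttributeError:
        return  # special key (shift, arrow, etc.); ignore
    if command:
        print("send:", command)  # would be written to the serial/socket link

with keyboard.Listener(on_press=on_press) as listener:
    listener.join()  # block and dispatch key events until the listener stops
```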

We need to simplify some of our work because we are about a week and a half behind as of right now, and I decided that simplifying the code for sending signals to the rover is going to help us make up some time. We have also finished fabricating all the necessary parts for the drivetrain, and we are currently assembling the rover; it should be complete as soon as we receive the wheels.

Below is a video of the PCB working with the serial monitor, as well as a photo of the PCB. Sorry about the camera focus; I don't have a tripod to hold it. To see photos of the rover assembly, see Varun's post; however, I have also attached some of the print files.

Video of the PCB


Photo showing the USB setup of the PCB


Photo showing the print for the dead shaft mounting bracket and the live shaft gears.

This week I will be cutting out the washers, finalizing the rover construction, and writing Python code for the user side that will transmit from one Raspberry Pi to another. I also need to decide which language to write the driving code in so that it meets our design requirements for response time. I will order the second Raspberry Pi as well as a monitor so I can finalize the controller before the interim demo.

Team Status Report for 3/16/24

After doing a post-spring-break evaluation, the most significant risks that could jeopardize the success of the project revolve around the camera, both in terms of its function and the dependencies that require its outputs, such as the kinematics calculation module. Despite having a pipeline that can detect depth and display a live camera feed smoothly on my (Nathan's) laptop, when using X11 forwarding the resulting feed was extremely slow and laggy. Our plan to manage and mitigate this risk is to get our RPi monitor as soon as possible to test actual usage, as well as to look for any opportunities to lower bandwidth and latency. The Luxonis documentation has benchmarks for these values, so we can analyze whether we have any shortcomings. An additional risk stemming from the first is that we are behind on our timeline. However, we placed orders for PCBs this week, so for these fabrication tasks we have controlled what we can in terms of timing. This week saw a lot of parts fabrication, so the next weeks will see an abundance of integration and in-person meeting time.

No changes were made to the existing design of the system, and it remains consistent with the change made after spring break.

Although no official changes to the schedule have been made, we are systematically pulling tasks forward from later in the schedule and trying our best to gain days. Here are some parts Hayden and Varun made!

Nathan’s Status Report for 3/16/24

My progress this week was threefold. The first part involved trying to boot up the Raspberry Pi and SSH into it in a headless fashion, since that is how it will operate on the rover. However, I ran into some trouble earlier this week when I broke a microSD card while attempting the first boot, so progress on this front was delayed until the end of the week. Towards the end of the week, I successfully managed to SSH into the Pi and register our device's MAC address on CMU-DEVICE.

The middle of the week saw me spend pretty much all of Thursday on the ethics assignment, and I found both readings extremely insightful.

The last part of my week involved installing dependencies and running the depthai software on the Raspberry Pi remotely through SSH. After researching the ColorCamera issue described in previous status reports, I might have found the problem: when the device is connected to the Pi and reconnects back to the host after performing inference, it uses a USB2 connection instead of a USB3 connection. One potential way to solve this is to reduce the data stream, but as of now I am unsure how to do this.

Currently, I am working on two different pipelines: a MonoCamera -> object detection pipeline, and a depth pipeline that uses a colormap, from which I am trying to extract coordinate data and potentially link it with the object detection pipeline. The skeleton of the depth side is sketched below.
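For reference, here is a rough skeleton of the depth pipeline, assuming the depthai v2 API; the FPS value is a placeholder, and lowering it is one candidate for reducing the USB data rate mentioned above.

```python
import depthai as dai

pipeline = dai.Pipeline()

# Two mono cameras feed the stereo depth node.
left = pipeline.create(dai.node.MonoCamera)
right = pipeline.create(dai.node.MonoCamera)
left.setBoardSocket(dai.CameraBoardSocket.LEFT)
right.setBoardSocket(dai.CameraBoardSocket.RIGHT)
left.setFps(15)   # placeholder; lower FPS means lower bandwidth
right.setFps(15)

stereo = pipeline.create(dai.node.StereoDepth)
left.out.link(stereo.left)
right.out.link(stereo.right)

# Stream the depth map back to the host.
xout = pipeline.create(dai.node.XLinkOut)
xout.setStreamName("depth")
stereo.depth.link(xout.input)

with dai.Device(pipeline) as device:
    q = device.getOutputQueue(name="depth", maxSize=4, blocking=False)
    depth_frame = q.get().getFrame()  # uint16 depth map in millimeters
```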

Currently, progress is behind: the camera is still laggy on the Pi side, and my two pipelines need continued development. With no exams this coming week, I hope to catch up to the project schedule by using the week's free time effectively.

In the next week, I hope to develop a pipeline that extracts the closest depth value in an image, as well as find a method to reduce the data rate, potentially by editing a bandwidth or FPS setting.

Varun’s Status Report for 3/16/24

This week's theme is fabrication (and ethics)! I'll break down my week by day:

Wednesday (evening) – I spent the evening finalizing the internals of the servo box. On Tuesday morning, I had realized that my original plan for assembling the servo box, using a long M3 bolt with a captive nut on the other side of the box to hold it together, was inefficient. If I instead used a heat-set insert on the top of the larger half, I could reduce the bolt size and therefore the likelihood of shear stress snapping the bolts in half.

Thursday (evening) – Most of the day was spent on the ethics assignment, but I also found the time to print and verify the fit of the servo first stage with the servo.

Friday (whole day) – During the first part of the day, I went to TechSpark and waterjetted the parts. We had intended to waterjet on both Monday and Wednesday during class, but people were using the waterjet during class time, and the part would have taken too long and eaten into the next class. I began waterjetting the part on Friday morning, and much to my annoyance, there were multiple hurdles within TechSpark to get the actual board cut out; however, after an hour-and-a-half delay, I finished cutting what will serve as the base of the Rover!

Finally, Hayden and I went back to RoboClub, where our parts were 3D printing, and ironed out some imperfections in the board. (There were some holes that for some reason hadn't been cut out by the waterjet, and the nature of the material had created some imperfections that affected the way the board sat on the table.) The rest of the day was spent printing the rest of the parts and verifying, one by one, that they fit together. Here they are! (The third stage is printing as I write this.) I'll be assembling the arm tomorrow.

We are still behind, but if we finish the arm by tomorrow (as planned), we can make up time next week by expediting Rover manufacturing and doing integration earlier than anticipated. By next Sunday, the physical Rover needs to be done. In my downtime, I'll be working on verifying motor encoder functionality.