Esther Jang’s Status Report for 11/6

This week, I worked on mounting and motorizing the linear slides on the robot. Both slides were mounted on the chassis mirroring one another. The pulleys were strung with string attached to the motor, which winds the string to pull the slides up. I found that the motor alone was insufficient for lowering the slides, so I attached surgical tubing between adjacent slide stages to add tension that pulls the slides down as the string unwinds.

Overall, the slides were able to successfully extend and retract with a bar mounted between them, as shown in the following clip:

There are some slight issues with retracting fully, which I think are related to minor tension imbalances. Otherwise, the slides were able to retract individually.

At one point in the week, one set of slides was completely stuck and required significant disassembly to resolve. To fix this and help avoid the issue in the future, lubricating grease was applied to the slides.

Finally, I also tested the claws this week and found that one has significantly better grip range and torque than the other. Using the servo code written a few weeks ago, I was able to get the claw to grab a small, light box. The claw was also mounted onto the end of the slides.

Team Status Report for 11/6

This week we finished the mechanical assembly of the robot and are preparing for next week's interim demo. Bhumika has mainly worked on the computer vision area, with edge detection, laser detection, and April tag detection all working reasonably well. The linear slide system is motorized and can successfully extend and retract. The claw has been tested and can successfully grab objects within its size range. The navigation code for the robot is still being implemented; currently, the four motors can receive commands from the Jetson Xavier and spin in the commanded direction. We will continue to fine-tune our specific areas of focus and start working on integration later in the week.

Bhumika Kapur’s Status Report for 11/6

This week I worked on all three components of my part of the project: the laser detection algorithm, the April tag detection, and the edge detection.

For the edge detection, I was able to implement edge detection directly on the Intel RealSense stream, with very low latency. The edge detection also appears to be fairly accurate and recognizes most of the edges I tested it on. The results are shown below:
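The core of the pipeline looks roughly like the sketch below: pull color frames from the RealSense with pyrealsense2 and run OpenCV's Canny detector on each one. The resolution, blur kernel, and thresholds shown are placeholders rather than my tuned values.

```python
import cv2
import numpy as np
import pyrealsense2 as rs

# Start a color stream from the RealSense camera (settings are illustrative)
pipeline = rs.pipeline()
config = rs.config()
config.enable_stream(rs.stream.color, 640, 480, rs.format.bgr8, 30)
pipeline.start(config)

try:
    while True:
        frames = pipeline.wait_for_frames()
        color_frame = frames.get_color_frame()
        if not color_frame:
            continue
        img = np.asanyarray(color_frame.get_data())

        # Canny edge detection on the blurred grayscale frame
        gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
        blurred = cv2.GaussianBlur(gray, (5, 5), 0)
        edges = cv2.Canny(blurred, 50, 150)  # thresholds are placeholders

        cv2.imshow("edges", edges)
        if cv2.waitKey(1) & 0xFF == ord('q'):
            break
finally:
    pipeline.stop()
    cv2.destroyAllWindows()
```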

Next, I worked on the laser detection. I improved my algorithm so that it outputs an image that is white at the location of the laser dot (if it is in the frame) and black everywhere else. I did this by thresholding for pixels that are red and above a certain brightness. Currently, the algorithm works decently well but is not fully reliable in some lighting conditions. It is also fast enough that I was able to apply it to every frame of the stream. The results are shown below:
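As a rough sketch of the thresholding idea (the channel cutoffs below are placeholders, not the values I actually tuned), a pixel is kept only when its red channel is very bright while blue and green stay low:

```python
import cv2
import numpy as np

def detect_laser(frame_bgr, red_min=200, other_max=120):
    """Return a binary mask that is white where a bright red laser dot is likely.

    Thresholds are placeholders: keep a pixel if its red channel is very
    bright while the blue and green channels stay comparatively low.
    """
    b, g, r = cv2.split(frame_bgr)
    mask = (r > red_min) & (g < other_max) & (b < other_max)
    return mask.astype(np.uint8) * 255

# Example usage on a single saved frame
frame = cv2.imread("frame.png")
if frame is not None:
    laser_mask = detect_laser(frame)
    ys, xs = np.nonzero(laser_mask)
    if len(xs) > 0:
        print("laser centroid:", int(xs.mean()), int(ys.mean()))
```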


Finally, I worked on the April Tag detection code. I decided to use a different April Tag Python library than the one I was previously using, as the new library returns more specific information such as the homography matrix. I am still a bit unsure how to use this information to calculate the exact 3D position of the tag, but I plan to look into this more in the next few days. The results are below:
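For illustration, the detection step looks roughly like the following, written here against the pupil_apriltags package purely as an example of a library that exposes the homography and pose (not necessarily the one I am using); the camera intrinsics and tag size are placeholder values:

```python
import cv2
from pupil_apriltags import Detector  # example library choice

# Placeholder intrinsics (fx, fy, cx, cy) and tag size in meters; the real
# values would come from the RealSense calibration and our printed tags.
CAMERA_PARAMS = (615.0, 615.0, 320.0, 240.0)
TAG_SIZE = 0.05

detector = Detector(families="tag36h11")

img = cv2.imread("tag.png")
if img is not None:
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    detections = detector.detect(
        gray,
        estimate_tag_pose=True,
        camera_params=CAMERA_PARAMS,
        tag_size=TAG_SIZE,
    )
    for det in detections:
        print("tag id:", det.tag_id)
        print("homography:\n", det.homography)
        # pose_t is the tag's translation (x, y, z) in camera coordinates,
        # which is the 3D position we ultimately want for navigation.
        print("translation (m):", det.pose_t.ravel())
```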

During the upcoming week I plan to improve the April Tag detection so that I can get the tag's exact 3D location, and also work on integrating these algorithms with the navigation of the robot.

Ludi Cao’s Status Report for 10/30

The robotics parts arrived earlier this week. I assembled the drivetrain chassis shown in the pictures below.

We are short on screw nuts, so I ordered more, which should arrive in a day or two, to better secure the parts together. I also tested speed control with the BTS7960 motor shield. I wrote the program so that the wheel speed changes based on the number entered into the Serial monitor. Last week, I also tested passing information between the Jetson Xavier and the Arduino over serial, and it worked. Hence, integration between the modules looks promising.
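On the Jetson side, that serial test amounts to something like the pyserial sketch below; the port name, baud rate, and message format are placeholders that have to match whatever the Arduino sketch expects:

```python
import serial  # pyserial, running on the Jetson Xavier

# Port name and baud rate are placeholders; they must match the Arduino sketch.
with serial.Serial("/dev/ttyACM0", 9600, timeout=1) as arduino:
    # Send a speed value, just like typing it into the Serial monitor
    arduino.write(b"150\n")
    # Read back any acknowledgement the Arduino prints
    reply = arduino.readline().decode(errors="ignore").strip()
    print("arduino replied:", reply)
```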

https://www.youtube.com/watch?v=VdI3qAsY-3I

Esther Jang’s Status Report for 10/30

Our parts finally arrived this week, so I spent the week building. Unfortunately, I was unable to work in person on Wednesday and Thursday due to a potential COVID exposure, so most of my work took place on Monday, Friday, and Saturday.

I was able to finish building the mechanical portion of the linear slide system (not including mounting). We have not integrated it yet because we ran out of nuts, but we will receive more by tomorrow. The slides extend fairly well, although they were getting slightly stuck at times; I think this could easily be fixed with a bit of lubricant. The length of a single slide is 1/3 meter (a full one-meter extrusion cut into thirds), and each set of slides is a mirror of the other.

I also investigated the motor shields with Ludi and was able to control a motor so that it spins at a specified frequency for a specified amount of time, holds, and then spins in the opposite direction the same way. This should be approximately the operation we need for the linear slides.

Finally, I attempted to get the claw working with the Arduino code written last week, but found that the servo requires around 6-8 V to operate. We will likely need a buck converter to use the claw's servo.

Bhumika Kapur’s Status Report for 10/30

This week I was able to start working on the camera and Jetson Xavier setup. We got the Xavier set up earlier in the week, and I worked on getting the Intel RealSense Viewer running on it. Once that was set up, I worked on installing the Python libraries needed to use the Intel RealSense data on the Xavier. That took me some time, as I ran into many errors during the installation, but the library is now set up.

Once the viewer and the library were set up, I began working with the camera's stream. I ran many of the example scripts given on the Intel RealSense website. Two of the more useful scripts were one for detecting the depth of an object in the stream, and one that uses the depth image to remove the background of an image. The results are shown below:

The first image shows the result of running the depth script, which returns the depth of a pixel in meters; the x, y, and depth information is printed out. The second image shows the result of using depth information to remove the background and focus on a specific object.
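For reference, the depth query boils down to something like the sketch below, based on the RealSense Python examples; the stream settings and the pixel being queried are just illustrative:

```python
import pyrealsense2 as rs

# Start a depth stream (resolution and frame rate are illustrative)
pipeline = rs.pipeline()
config = rs.config()
config.enable_stream(rs.stream.depth, 640, 480, rs.format.z16, 30)
pipeline.start(config)

try:
    frames = pipeline.wait_for_frames()
    depth_frame = frames.get_depth_frame()
    if depth_frame:
        x, y = 320, 240  # query the center pixel as an example
        dist_m = depth_frame.get_distance(x, y)
        print(f"depth at ({x}, {y}): {dist_m:.3f} m")
finally:
    pipeline.stop()
```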

Next week I plan to use these scripts in conjunction with the edge and laser detection algorithms I have written, so that the robot can use that information to navigate.

Team Status Report for 10/30

This week we received all of the parts that we ordered and began building our robot. We set up the Jetson Xavier, and Bhumika has been working on the computer vision aspect of the project using it. Ludi was able to assemble most of the chassis, Esther built the linear slides, and both Ludi and Esther are working on getting the motor shields to work. Next week we will continue assembling the robot and integrating the different components, such as the chassis and the linear slides. Overall, we are satisfied with the progress we have made so far.

Ludi Cao’s Status Report for 10/23

This week I worked on the software portion of the motor controller. I completed the initial setup of the Jetson Xavier and installed the packages we will need for our project. Since the computer vision algorithms run in Python and the Arduino code is in C/C++, I wrote the communication channel between the Jetson Xavier and the Arduino.

The screenshot shows how the two communicate with each other when fed some data. I have also written the logic for the drivetrain, specifically the omnidirectional logic for driving the different motors to move in different directions. I also briefly looked into the software side of adding encoders to our motors. Since the Intel RealSense cannot detect an object at very short range, the drivetrain will need to travel a fixed distance after the computer vision detection is performed from farther away; adding encoders would provide closed-loop feedback to ensure high accuracy. Since the hardware parts have not arrived yet, I am not able to tune any parameters of the skeleton code. Hence, the project is behind schedule, but hopefully the parts will arrive early next week so that I can construct the robot and run an initial test of the motors and drivetrain accuracy.
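As a sketch of the omnidirectional mixing idea, a standard mecanum-style formula is shown below; the wheel ordering and sign conventions are assumptions that would need to match our actual chassis wiring, not necessarily the exact code I wrote:

```python
def omni_wheel_speeds(vx, vy, omega):
    """Mix desired body velocities into four wheel speeds.

    Standard mecanum-style mixing; wheel ordering and sign conventions here
    are assumptions. vx: forward, vy: strafe left, omega: counter-clockwise.
    """
    front_left = vx - vy - omega
    front_right = vx + vy + omega
    rear_left = vx + vy - omega
    rear_right = vx - vy + omega
    return front_left, front_right, rear_left, rear_right

# Example: strafe right at half speed with no rotation
print(omni_wheel_speeds(0.0, -0.5, 0.0))
```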

Esther Jang’s Status Report for 10/23

We placed an order for our parts early this week but unfortunately have not received any of them yet. As a result, I spent time preemptively investigating the software aspects of our hardware.

For the claw, I tested controlling a servo through a PWM output on an Arduino board. Although we intend to drive the servos from the Jetson board directly, I tested with the Arduino since I did not have access to the Jetson this week. I found that it was very straightforward to run a servo on the Arduino, so we should strongly consider running it there instead of the Jetson if possible.

I also looked into the encoders on the motors and found that they are incremental encoders with channel A and channel B pins that need to be connected to digital pins. To keep the encoder count, we need to compare the two channels' signals against each other, since they are out of phase. This also means we need 5 pins for each motor that uses an encoder: 3 for the motor and 2 for the encoder. As a result, we will likely need to use 3 Arduinos as opposed to our current plan of 2. Overall, using the encoders should not be too difficult as long as we can properly read the pins.
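The decode logic itself is simple. As a quick illustration of the idea (shown in Python here for readability; the real version would live in the Arduino sketch, and the direction convention is an assumption to be verified against the actual motor):

```python
def update_count(count, prev_a, a, b):
    """Increment or decrement the encoder count on a rising edge of channel A.

    If channel B is low when A rises, the shaft is turning one way; if B is
    high, it is turning the other. The direction convention is an assumption.
    """
    if prev_a == 0 and a == 1:  # rising edge on channel A
        count += 1 if b == 0 else -1
    return count

# Tiny simulated signal where A leads B, so the count should increase
samples = [(0, 0), (1, 0), (1, 1), (0, 1), (0, 0), (1, 0)]
count, prev_a = 0, samples[0][0]
for a, b in samples[1:]:
    count = update_count(count, prev_a, a, b)
    prev_a = a
print("count:", count)  # prints 2
```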

Team Status Report for 10/23

This week, our team finalized our parts order and placed it. We were hoping that at least some of our parts would arrive by the end of the week, but unfortunately did not receive them. We expect to have our parts by the start of the upcoming week and will invest a lot of time to build.

We also had discussions regarding ethics in class and learned a lot about the various ethical issues our project may cause. For example, we had not considered our laser pointer being misused, our robot damaging the objects it grabs, or whether recording others would be problematic.