Ryan’s Status Update for 10/24/2020

This week was rather busy with midterms, but nonetheless, I was able to make progress on the construction of the robot. For starters, I was able to cut two out of the three pieces necessary for the construction of the robot frame. Below are some pictures of the railing and motor mount:

Above is the railing for the BallBot runway used to guide the ball towards the back of the iRobot base. Below it is its reference 3D model.

I also was able to start construction on the motor mount. Below is a picture of the assembled motor mount along with its 3D design.

However, construction of the above pieces was very tricky, so I spent the rest of the week redesigning the parts to make assembly easier. The new design features fewer angled cuts and simpler geometry. Below is a rendering of the new BallBot design.

For next week, I plan on continuing construction of the robot. I hope that these new designs will make cutting and assembling the wooden frame much easier. If the frame’s construction goes well, I’ll switch over to cutting the acrylic pieces to make the ramp and robot top plate.

Team’s Status Update for 10/24/2020

This week we made a lot of progress on the software aspects of the project. Rashmi and Ishaan worked on integrating the computer vision algorithm, the iRobot Create 2 code, and the Intel RealSense depth camera code into one piece of software. With the RealSense camera and Create 2 robot connected to a laptop, we were able to successfully combine all three parts of the software. Depending on the location of the tennis ball in the RealSense camera's field of view, the Create 2 robot turns to bring the ball into the center of the camera's view. This is the behavior we want so that the ball will hit the center of the robot and be collected by the arms that we are working on building.
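The centering behavior can be sketched as a simple decision rule. This is a minimal illustration rather than our actual code; the function name, the deadband parameter, and its value are assumptions:

```python
def turn_command(ball_x, frame_width, deadband=0.1):
    """Decide which way to turn so the ball ends up centered.

    ball_x: horizontal pixel coordinate of the detected ball
    frame_width: width of the camera frame in pixels
    deadband: fraction of the frame width treated as "centered"
    Returns "left", "right", or "forward".
    """
    center = frame_width / 2
    offset = ball_x - center
    if abs(offset) <= deadband * frame_width:
        return "forward"  # ball is close enough to center; drive straight
    return "right" if offset > 0 else "left"
```

In the real loop this decision would be re-evaluated every frame, so the robot keeps correcting as the ball's position in view changes.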

However, we still need to get this integrated software running on the Jetson Nano, which will ultimately control everything. When trying to install everything on the Jetson Nano, we ran into the issue that the Intel RealSense depth camera's Python library could not be installed with pip, because prebuilt pip packages are not available for the Nano's architecture. This was an unplanned setback, but we plan on fixing it this week by building the RealSense library from source, which should not be too difficult.

In terms of building the robot, we had a few setbacks this week. As a team, we spent most of the day measuring and cutting the wood so that we could assemble the robot. We had tried to make 45-degree cuts in the wood so that we could join two pieces at a 90-degree angle. However, we realized that drilling into and connecting two angled pieces of wood was very difficult, especially since our cuts weren't perfect. To fix this, we decided to make flat cuts and connect the pieces face to face, which eliminates the issue of imperfect cuts and makes the wood much easier to join.

Schedule Update:

Currently, we are on schedule on the software side of the project, but slightly behind on the hardware side. To make up for this, we plan on spending more time in the Makerspace this coming week to get our wood cut and start connecting the pieces to build the structure that will support the tennis ball collection system. Then we can add the motor control software to our code and test whether all the software and hardware work together.

Ishaan’s Status Report for 10/24/2020

This week I spent some time with Rashmi working to integrate my OpenCV computer vision algorithm with her robot motion code. As we suspected, integrating the C++ code with the Python code was harder than expected. We tried using pybind, but for some reason it didn't quite work the way we wanted it to. So, to save time, we converted the C++ code into Python line by line. At first this felt like two steps back and one step forward, but we eventually got the Python version to behave exactly how the C++ did, and it was not much slower either. We performed some tests to make sure that the code did not contain any bottlenecks.

After rewriting the code, we were able to integrate the output from the camera with the iRobot controls. The robot can now move towards a tennis ball if one is in the field of view. Rashmi is going to work on improving this algorithm so that it can use the depth output from the camera and properly generate a route.

Additionally, I helped Ryan assemble the robot on Friday. Although we got the pieces cut, the assembly was more complicated than expected: many of the cuts were at 45-degree angles, and screwing those pieces together was not easy. Although we did our best, we think we should redesign the parts so that they fit together better.

This week, I plan to spend some time building the RealSense Python library from source on the Jetson Nano, since pip and some of the other tools I need are not supported on its architecture. Additionally, I might need to help Ryan cut and assemble the robot, since we had a setback this weekend and the Makerspace does not open until Monday, so we might be slightly behind.

Rashmi’s Status Report for 10/24/2020

This week I took some time to work with Ishaan on the software aspects of the project as well as worked with the entire team on the construction of the robot.

I worked with the team on the hardware this Friday. I was able to finish cutting the parts for the robot. However, as we assembled it, we realized that some of the angled cuts made the pieces really difficult to join. While we were able to piece some of the structure together, after a severed screw we decided that it would be best to redesign these parts with simpler cuts.

In terms of progress with the software, Ishaan and I were unable to integrate the computer vision code written in C++ with the robot motion algorithm written in Python. To resolve this, we spent some time rewriting the code from C++ to Python. This was a fairly easy process, as it mainly involved translating the code rather than changing the algorithm. Additionally, we discovered the pycreate2 library, which provides wrappers for the iRobot Create's Open Interface. We were able to replace some of the more complex motion commands I had previously written with simple calls to this library. One such example is the safe wrapper: a single call to this function makes the robot drive in safe mode, which means the robot will automatically halt if it senses it is about to crash. This can come in super handy when the robot is near the net or the walls of the tennis court.
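Under the hood, wrappers like the safe one just send small byte packets over the serial port. As a rough sketch of what the library is doing for us (opcode values and packet layout taken from the published Create 2 Open Interface spec; the helper functions below are illustrative, not pycreate2's actual API):

```python
import struct

# Opcodes from the iRobot Create 2 Open Interface spec.
OP_START = 128         # must be sent first to open the interface
OP_SAFE = 131          # safe mode: robot halts itself on safety faults
OP_DRIVE_DIRECT = 145  # independent left/right wheel velocities

def safe_mode_packets():
    """Byte packets that put the Create 2 into safe mode."""
    return bytes([OP_START]), bytes([OP_SAFE])

def drive_direct_packet(right_mm_s, left_mm_s):
    """Drive Direct packet: opcode, then right and left wheel
    velocities in mm/s as signed 16-bit big-endian integers."""
    return struct.pack(">Bhh", OP_DRIVE_DIRECT, right_mm_s, left_mm_s)
```

Having the library assemble and send these packets for us is exactly why pycreate2 simplified the motion code so much.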

Ishaan and I were ultimately able to get the robot to move towards a tennis ball! We have a base algorithm working where if the tennis ball exists in the camera view, then the robot will move towards the ball successfully. If the position of the tennis ball changes in between, then the robot will update its direction and follow the tennis ball.

In terms of my work for this upcoming week, I plan on making the algorithm more robust. Some things I plan on looking into this week are:

  • If there is no ball in the camera view, the robot needs to turn and search for a ball
    • Need to identify a good interval to do this check at since we can’t have the robot spinning aimlessly.
  • Start using the depth feature of the camera so that the algorithm can work for multiple balls. The simplest algorithm that I can do is:
    • Use the coordinates of the object and choose to go towards one with the smallest Euclidean distance.
    • If there are too many balls on the screen, then do this process for maybe 5 balls and pick one and go towards it.
    • Will doing this operation be too slow? How can this be mitigated?
  • Test this algorithm somewhere that is not my living room 🙂
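The nearest-ball heuristic from the list above can be sketched as follows. This is a sketch under assumptions, not our implementation: it assumes the camera gives us a camera-relative (x, y, z) coordinate per detected ball, and the candidate cap mirrors the "maybe 5 balls" idea:

```python
import math

def nearest_ball(balls, max_candidates=5):
    """Pick the ball closest to the robot.

    balls: list of (x, y, z) coordinates from the depth camera,
           relative to the camera (so the camera sits at the origin).
    Only the first max_candidates detections are considered, which
    bounds the work when the frame is crowded with balls.
    Returns the coordinate tuple of the nearest ball, or None.
    """
    candidates = balls[:max_candidates]
    if not candidates:
        return None
    # Smallest Euclidean distance from the camera wins.
    return min(candidates,
               key=lambda p: math.sqrt(p[0]**2 + p[1]**2 + p[2]**2))
```

Since this is just a min over at most a handful of points per frame, the selection itself should not be the slow part; the cost will be in detection, which is one thing the speed question above should confirm.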

Ryan’s Status Update for 10/17/2020

This week I focused on finalizing the designs for each component of our robot frame. This includes finalized designs for the robot's arms, motor mounts, ramp railings, and runway. Since we switched to wood, I had to simplify the design of certain parts to make the overall construction of the robot easier. The near-final design of the robot is shown below:

Additionally, we began construction of the robot this week. On Friday, we were able to use the maker space tools to make initial cuts in our wood. For next week, we will focus on the physical construction of the robot from the designs finished this week.

Team Status Update for 10/17/2020

This week we made a lot of progress as a team. Although nothing huge happened, we each advanced our own areas, which moved the whole project forward. In terms of building the robot, we were able to acquire the wood and find some time to cut it. Now that we have the cut pieces, all we need to do is assemble the frame for the ramp and wheel mechanism. In terms of the robot motion, we now have a robot that can move on command in the direction we want. The motion is very granular, so we should be able to drive the robot in a targeted direction. Finally, in terms of the computer vision, we were able to test the algorithm outside and tune the parameters so that ball detection in various lighting is no longer an issue.

As a team, we met up once this week to update each other on the progress that we had made individually and also work on the design report. We will continue to work on the design report this weekend and finish it up.

Schedule Update

We are now at the stage where we need to start integrating some of our individual contributions to this project. The computer vision and the iRobot motion are at a point where they can be combined. We are still mostly on track with our planned schedule. We hope to pick up the pace on building the arms of the robot to ensure we don't fall behind. One possible risk is that the ramp and wheel contraption fails to suck up tennis balls; hopefully we can get that part of the project to the testing phase soon so that we can fine-tune the ball pickup system.

Ishaan’s Status Report for 10/17/2020

This week I worked a lot on making the computer vision algorithm more robust. I took the camera outside and tested the algorithm in daylight. At first, even a small change in camera angle caused the ball to stop being detected. Furthermore, over the few hours I was outside, I found that as the daylight itself changed, I had to retune the algorithm. However, after rewriting some of the code and tuning the thresholds in the HSV color space, I believe I have finally found a happy medium where the tennis ball is detected consistently in daylight.
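For reference, the detection itself uses OpenCV (converting each frame to HSV and masking it with cv2.inRange); the per-pixel logic of that mask looks roughly like the sketch below. The bounds shown are placeholder values for a tennis ball's yellow-green, not the thresholds I actually tuned:

```python
# Illustrative HSV bounds for a tennis ball's yellow-green.
# These are placeholders, not the tuned values from outdoor testing.
LOWER = (25, 70, 80)    # (hue, saturation, value) lower bound
UPPER = (45, 255, 255)  # (hue, saturation, value) upper bound

def in_tennis_ball_range(pixel, lower=LOWER, upper=UPPER):
    """True if an HSV pixel falls inside the threshold box,
    mirroring what cv2.inRange does channel by channel."""
    return all(lo <= ch <= hi for ch, lo, hi in zip(pixel, lower, upper))
```

Working in HSV is what makes the retuning tractable: brightness changes mostly move the value channel, so the hue bounds that identify "tennis-ball yellow-green" can stay comparatively stable as the daylight shifts.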

I also spent some time this week helping my team write the design review document. We were hoping to have feedback from the professors regarding our design process by now but we have not heard anything.

Next week, I plan on working with Rashmi to integrate her work on the iRobot's motion planning with my computer vision algorithm. I'm not sure how this process will go, as neither of us has really worked in the other's domain. I anticipate that integrating the two, fine-tuning the motions, and fixing the issues that arise will take us this whole week and possibly some of next week.

Rashmi’s Status Update for 10/17/2020

This week, I spent some time fine-tuning the iRobot's movement mechanisms so that once we get the outputs from the computer vision, we will have more fine-grained control of the robot's motion. At this point, I believe most of the motion planning code for the robot has been written and tested.
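One way to picture that fine-grained control is the standard differential-drive mapping from a forward speed and a turn component to per-wheel speeds. The sketch below is illustrative only; it assumes the Create 2's 500 mm/s wheel-speed limit and is not the actual motion code:

```python
def wheel_speeds(forward_mm_s, turn_mm_s):
    """Map a forward speed plus a turn component to (left, right)
    wheel speeds for a differential-drive base like the Create 2.

    turn_mm_s > 0 turns right (left wheel faster), < 0 turns left.
    Speeds are clamped to the Create 2's +/-500 mm/s limit.
    """
    def clamp(v):
        return max(-500, min(500, v))

    left = clamp(forward_mm_s + turn_mm_s)
    right = clamp(forward_mm_s - turn_mm_s)
    return left, right
```

Varying the turn component smoothly, rather than switching between fixed turn and drive commands, is what gives the granular control over heading that the integration with the vision output will need.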

Additionally, this week, I deviated a little from my usual tasks and took some time to help Ryan build the frame for the robot. We dimensioned the pieces that make the ramp and hold the motors in place. Then, using the saws in the Makerspace woodshop, we cut the majority of the pieces. We now have all the pieces for the robot frame; it just needs to be assembled next week.

Finally, my team and I spent a lot of time writing the design report detailing the specifics of the BallBot.

Next week, I will be working more closely with Ishaan to integrate the computer vision with the robot. This might be a little challenging, considering the computer vision was written in C++ while the iRobot's motion code was written in Python. In the meantime, I am looking into pybind and other tools that can bridge C++ and Python code. This will also be the first time that we are combining two major components of this project, and we expect to run into a lot of difficulties. However, if we can get this to work, it will be a major milestone in our process of building BallBot.

Team Status Update for 10/10/2020

This week, we worked on many different individual parts of our project. We made progress on the hardware side by setting up a prototype tennis ball collector that allowed us to change the spacing of the motors that have the wheels that will collect the tennis balls. We were able to power the motors with a power supply from the Makerspace and control the motors using an Arduino. This allowed us to determine that our parts seem to be working and the wheel system will likely succeed in sucking up tennis balls.

We also continued to make progress on the software parts of our project. We further developed the computer vision algorithm to detect only objects that are the color of tennis balls, so now any tennis ball in view of the camera is marked with a target. We were also able to send commands to the iRobot Create 2 to make it move and rotate. We still need to work on integrating these two pieces of software so that the computer vision code can also control the iRobot. This may be a challenge we did not anticipate, because the computer vision code is in C++ while the iRobot code is in Python. However, we have found that pybind is a library that could bridge these two languages.

Schedule Update

We are still moving according to the planned schedule and do not have any significant changes to the design of our project. One change we did want to make was to construct the robot's ball-gathering arms out of wood, so that we can easily attach the motors to the arms and so that the arms will be sturdy. Our initial plan of using acrylic may not work out, since it would be very difficult to cut and glue the acrylic into a sturdy arm. However, wood should not be expensive and should be relatively easy to obtain.

Ishaan’s Status Update for 10/10/2020

This week, I continued to work on the computer vision part of our project. After interfacing with the Intel RealSense depth camera, I converted the input into the HSV color space and selected the range of colors in each channel that lets us find tennis balls. The algorithm can now locate multiple tennis balls in view of the camera and draw a target on each one to show what it is detecting. Currently, I am running the algorithm on my laptop but will work on building the code on the Jetson Nano in the upcoming week.

Also, I helped Ryan with the hardware portion of the project, and we created an initial prototype to test the motors and see if we could control them. We built a setup to find the best distance to keep between the motors and were able to power them with a power supply from the Makerspace. We also programmed the Arduino to control the speed of both motors, though the motors were very finicky at anything but their top speed.

We are not currently behind schedule. In the next week, I hope to improve the algorithm to work better in different lighting conditions and to move the code over to the Jetson Nano, where it will eventually have to run. In addition, I want to look into integrating the iRobot Create 2 API calls Rashmi is working on with the OpenCV code.