Rashmi’s Status Update for 12/05/2020

These past two weeks have been highly productive for our team. We spent a lot of time working together testing and improving the user interface of the BallBot. To improve the appearance and make BallBot look more like a product, I painted the sides of the robot and designed the top. The top of the robot now has a touch screen that can be used to interact with the robot, and above the screen sits a large 3D logo that I designed to be visible from any distance and angle. It gives our product a unique, branded look.

In terms of integration testing, I helped my team bring the robot to the tennis courts to test. One issue we noticed while testing was that the WiFi was unstable, which caused the program to randomly shut off whenever the network connection was lost. This was one of the primary reasons we decided to improve the user interface. With the screen, we programmed the robot so that once it is powered on, the program can be started by tapping the executable on the touch screen, so we no longer require WiFi at all. All of our testing was surprisingly successful, and I did not need to make any changes to the routing algorithm.

This next week, I will be working on putting together the final video and blog post.

Here is what BallBot looks like now:

Team’s Status Update for 11/21/2020

We are on the final stretch for our project now and BallBot is really coming together. While this week was an extremely busy one, with each of our team members having several midterms, we still managed to make some progress on the BallBot. This week we made subtle but important improvements to the software. Before, the BallBot stopped moving whenever there were no balls in its field of view. Now, when no balls are visible, the BallBot first spins in a full 360-degree circle. If it sees any tennis balls during this spin, it stops spinning and goes after them; otherwise, it finishes the rotation and stops until more balls come into view.

We also worked on the basket for the BallBot, constructing it out of plastic so that it is lightweight and weatherproof. We gave the basket a capacity of 30 balls so that it meets our requirements. We are still designing a mechanism for attaching the basket so that it remains detachable.

Schedule Update:

In the upcoming weeks, there are only a few things left for us to complete. We need to build a casing for the hardware components on the robot and do more testing to identify and improve upon the weaker aspects of the robot. We are fully on schedule and have pretty much finished the exterior of the robot. One thing we still really need to do is test the fully functioning BallBot on an actual tennis court. We wanted to do that this week; however, we didn't get a chance due to the weather.

Rashmi’s Status Report for 11/21/2020

This week I worked on making minor software improvements to the BallBot. We realized that if there is no ball immediately in the BallBot's sight, we want it to pan the tennis court and look for tennis balls. To do this, we added a feature so that if there is no tennis ball in sight, the robot spins in place and tries to locate one. If it sees a ball, it starts heading in that ball's direction; if it does not, it rotates once in a full circle and then stops moving.
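
Roughly, the logic looks like the sketch below, using the pycreate2 library we adopted earlier. This is a minimal sketch, not our exact code: the spin speed, the time assumed for one full rotation, and the detect_ball helper are all placeholders.

```python
# Minimal sketch of the spin-and-search behavior (placeholder values).
import time
from pycreate2 import Create2

SPIN_SPEED_MM_S = 50        # assumed wheel speed for an in-place turn
FULL_SPIN_SECONDS = 8.0     # assumed time for roughly one 360-degree rotation

def detect_ball(camera):
    """Hypothetical helper: returns True if a tennis ball is currently in view."""
    ...

def spin_and_search(bot, camera):
    """Spin in place up to one full circle; stop early if a ball appears."""
    bot.drive_direct(SPIN_SPEED_MM_S, -SPIN_SPEED_MM_S)  # opposite wheels -> spin in place
    start = time.time()
    found = False
    while time.time() - start < FULL_SPIN_SECONDS:
        if detect_ball(camera):
            found = True
            break
        time.sleep(0.05)
    bot.drive_direct(0, 0)  # stop spinning
    return found
```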

Additionally, this week I helped build a basket. Because the shape of the basket is highly specialized to the BallBot, we built it out of polypropylene. The basket can hold 30 tennis balls and meets the user requirements.

Since this upcoming week is Thanksgiving and our team will be going out of town for the rest of the semester, I plan to work with my team to test the BallBot on the outdoor tennis courts and figure out what other software improvements need to be made. Here is a picture of the basket that I helped make this week.

Team’s Status Update for 11/14/2020

This week, we spent a lot of time working together building the roof for the robot, attaching the hardware components, and painting the robot. Previously, we had the software running so that the BallBot moved towards tennis balls, and the tennis ball launching system was able to launch balls towards the back of the robot. To combine these systems, we first created an acrylic roof to go over the ramp in our launching system. We then attached all the hardware components, including the Intel RealSense camera, the Nvidia Jetson Nano, the buck converter, the motor controller, and the battery, to the top of the roof using 3M Dual Lock adhesive, which works similarly to Velcro. We also used a protoboard to solder the battery's output to both the motor controller and the buck converter, which is attached to the Jetson Nano.

Once everything was attached, we first tested the motors to make sure they were working and then turned on the software to see how it would work with the tennis ball launching system. Fortunately, things went as planned: the BallBot successfully chases down tennis balls within its field of view and launches them to the back of the robot.

After we tested the integration, we removed the arms and painted them to make the wood weatherproof, which is necessary for an outdoor device like the BallBot. We still need to build protection for the hardware on top of the robot.

Next week, we plan on improving the software so that the BallBot does not simply stop when it doesn't see more balls, but first turns in a circle so that it does not miss balls outside its field of view. We also hope to test the BallBot outside on a tennis court, which will require fine-tuning our computer vision.

Schedule Update:

We are currently on track with our schedule in terms of hardware and software. The one task that is slightly behind schedule is building the basket that collects the balls, since we treated the rest of the project as a higher priority. We hope to build the basket this coming week so that we are completely back on the planned schedule.

Rashmi’s Status Report for 11/14/2020

This week, we performed the integration of the hardware and software components of the robot. We wired together all of the electrical components and attached them to the robot. To do this, we designed a roof for the robot and attached it on top of the ramp. After attaching all the components and fixing a position for the camera, we were finally ready to do our integration test. After much difficulty, we were able to integrate the hardware with the software. One issue we ran into was the WiFi drawing too much power from the robot: when we ran the program, the robot kept losing its connection because of WiFi issues, which stopped the program. To mitigate this, we found that running the program in the background ensures that even if the WiFi disconnects, the BallBot continues executing the program. We also made sure that the Jetson no longer needs to be connected to a display in order to boot and start running. After resolving these issues, we were able to successfully test our robot. See video below!

One thing I noticed while doing this integration test is that we need a way to detect or set boundaries for the robot. For example, the robot needs to know where the fence of the tennis court is located so that it can turn accordingly. Ideally, we would have used the iRobot's sensors for this; unfortunately, the ramp covers any sensor we could have used, so we have to come up with another way to do this. I will be working on these software improvements this week.

Additionally, in order to brand our robot and give it a makeover, I designed and drew a logo as well as painted the robot. The paint not only makes the robot look good but, thanks to the finish we added on top, will also protect the wood from water damage.


Rashmi’s Status Update for 11/07/2020

This week I spent a lot of time working with the team constructing the robot. Together, we assembled the pieces that we cut last week and attached them to the Roomba: we mounted the arms and the ramp to the mounting plate and placed it on the Roomba. The exterior of the robot is now pretty much complete.

On the software side, I worked with Ishaan, and we were able to move the robot fully untethered using the Jetson Nano and the battery. We simply taped the camera onto the front of the robot and tried to get it to follow a tennis ball, which worked pretty well. I also helped set up the motors to be controlled by the power supply.

This week, I plan on creating a platform for the camera on the robot as well as testing the algorithm with static balls. This test will be performed both indoors and outdoors. Currently, we have not tested the robot with multiple static balls in the scene; we have only had one ball that we moved around to make sure the robot detects it well. Depending on the results of this testing, I will decide what the next steps for improvement are.

Rashmi’s Status Report for 10/31/2020

Much of this week was spent cutting wood again with my team because we weren't able to assemble the pieces properly last week. We had found it hard to assemble some of the wood pieces that surround the motor because we were screwing together pieces cut at 45-degree angles. This week, we recut them and decided to glue the pieces together instead, which made the process much smoother and easier. All of the wood pieces are now assembled; they just need to be placed onto the iRobot. We were not able to do that this week because we were waiting for the glue to dry, so this coming week we all plan on getting together and starting to assemble the individual pieces onto the iRobot Create 2.

In terms of software, Ishaan and I were able to get the pyrealsense2 library installed on the Jetson, after great difficulty. We were then able to run the code we wrote last week on the iRobot using the Jetson alone. The robot can now detect and move towards a tennis ball using the output from the RealSense camera and the base path planning algorithm I wrote.

This week, since all of the software parts are now integrated onto the robot, I plan on focusing on improving the planning algorithm. Specifically, I will try to start using the depth information so that the robot can track and create a path for multiple balls at once.
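
As a starting point for the depth work, here is a minimal sketch of how a detected ball's distance can be queried from the RealSense depth stream with pyrealsense2. The stream settings and pixel coordinates are placeholders; in practice the (x, y) would come from the ball detection step.

```python
# Minimal sketch of reading a ball's distance from the RealSense depth stream.
import pyrealsense2 as rs

pipeline = rs.pipeline()
config = rs.config()
config.enable_stream(rs.stream.depth, 640, 480, rs.format.z16, 30)   # illustrative settings
config.enable_stream(rs.stream.color, 640, 480, rs.format.bgr8, 30)
pipeline.start(config)

try:
    frames = pipeline.wait_for_frames()
    depth_frame = frames.get_depth_frame()
    ball_x, ball_y = 320, 240                          # placeholder pixel coordinates
    distance_m = depth_frame.get_distance(ball_x, ball_y)
    print(f"Ball is roughly {distance_m:.2f} m away")
finally:
    pipeline.stop()
```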


Team’s Status Update for 10/24/2020

This week we made a lot of progress on the software aspects of the project. Rashmi and Ishaan worked on integrating the computer vision algorithm, the iRobot Create 2 code, and the Intel RealSense depth camera's code into one piece of software. With the RealSense camera and the Create 2 robot connected to a laptop, we were able to successfully combine all three parts of the software. Depending on the location of the tennis ball in the RealSense camera's field of view, the Create 2 turns to bring the ball into the center of the camera's field of view. We want this so that the ball hits the center of the robot and is collected by the arms that we are working on building.
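
To make the turning behavior concrete, here is a minimal sketch of a simple proportional steering rule, assuming ball_x is the ball's horizontal pixel position from the detection step. The frame width, speeds, and gain are placeholders, not our tuned values.

```python
# Minimal sketch of steering so the detected ball ends up centered in the frame.
FRAME_WIDTH = 640          # assumed color frame width in pixels
FORWARD_SPEED = 150        # assumed forward wheel speed in mm/s
TURN_GAIN = 0.5            # assumed proportional gain, mm/s per pixel of error

def wheel_speeds_for(ball_x):
    """Return (left, right) wheel speeds that turn the robot toward the ball."""
    error = ball_x - FRAME_WIDTH // 2       # positive when the ball is right of center
    turn = int(TURN_GAIN * error)
    left = FORWARD_SPEED + turn             # speed up the left wheel to turn right
    right = FORWARD_SPEED - turn
    return left, right
```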

However, we still need to work on running this integrated software on the Jetson Nano, which will ultimately control everything. When trying to install everything on the Jetson Nano, we ran into the issue that the Intel RealSense depth camera's Python library (pyrealsense2) cannot be installed using pip, because pip installation isn't supported on the Jetson Nano's architecture. This was an unplanned setback, but we plan on fixing it this week by building the RealSense library from source, which should not be too difficult.

In terms of building the robot, we had a few setbacks this week. As a team, we spent most of the day measuring and cutting the wood so that we can assemble the robot. We had tried to make 45-degree cuts in the wood so that we could connect two pieces at a 90-degree angle. However, we realized that drilling into and connecting two pieces of wood that meet at an angle is very difficult, especially since our cuts weren't perfect. To fix this, we decided to make flat cuts in the wood and just connect the pieces face to face, which eliminates the issue of imperfect cuts and makes it much easier to connect the pieces of wood.

Schedule Update:

Currently, we are on schedule on the software side of the project but slightly behind on the hardware side. To make up for this, we plan on spending more time in the Makerspace this coming week to get our wood cut and start connecting the pieces into the structure that will support the tennis ball collection system. Then we can also add the motor control software to our code and test whether all the software and hardware work together.

Ishaan’s Status Report for 10/24/2020

This week I spent some time with Rashmi working to integrate my OpenCV computer vision algorithm with her robot motion code. As suspected, integrating the C++ with the Python was harder than expected. We tried using pybind, but for some reason that didn't quite work the way we wanted it to. So, to save some time, we just converted the C++ code into Python line by line. At first this felt like two steps back and one step forward, but we were eventually able to get the Python to behave exactly how the C++ did, and it was not much slower either. We performed some tests to make sure that the code did not contain any bottlenecks.
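
For anyone curious what the detection step looks like in Python, here is an illustrative sketch of one common HSV-thresholding approach to finding a tennis ball with OpenCV. The color bounds are rough guesses for tennis-ball yellow-green, and this is not necessarily exactly how our algorithm works.

```python
# Illustrative sketch of HSV-threshold tennis-ball detection with OpenCV.
import cv2
import numpy as np

LOWER_YELLOW_GREEN = np.array([25, 70, 70])     # rough HSV bounds, not tuned values
UPPER_YELLOW_GREEN = np.array([45, 255, 255])

def find_ball(frame_bgr):
    """Return (x, y, radius) of the largest ball-colored blob, or None."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, LOWER_YELLOW_GREEN, UPPER_YELLOW_GREEN)
    mask = cv2.erode(mask, None, iterations=2)   # clean up speckle noise
    mask = cv2.dilate(mask, None, iterations=2)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    (x, y), radius = cv2.minEnclosingCircle(largest)
    return int(x), int(y), int(radius)
```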

After rewriting the code, we were able to integrate the output from the camera to the iRobot controls. The robot can now move towards a tennis ball if the tennis ball is in the field of view. Rashmi is going to work on improving this algorithm so that it can use the depth output from the camera and properly generate a route.

Additionally, I helped Ryan assemble the robot on Friday. Although we got the pieces cut, the assembly was more complicated than expected: a lot of the cuts were at 45-degree angles, and screwing them together was not easy. Although we did our best, we think we should redesign the parts so that they fit together better.

This week, I plan to spend some time getting the Python library built on the Jetson Nano. This is currently not working because pip installation and some of the other tools I need aren't supported on the Jetson, so I will likely need to build the library from source. Additionally, I might also need to help Ryan cut and build the robot since, due to the setback this weekend and the Makerspace not opening until Monday, we might be slightly behind.

Rashmi’s Status Report for 10/24/2020

This week I took some time to work with Ishaan on the software aspects of the project, as well as with the entire team on the construction of the robot.

I worked with the team on the hardware this Friday and was able to finish cutting the parts for the robot. However, as we assembled them, we realized that some of the cuts made the pieces really difficult to fit together. While we were able to piece parts of the structure together, after a severed screw we decided it would be best to redesign these parts with simpler cuts.

In terms of progress with the software, Ishaan and I were unable to directly integrate the computer vision code written in C++ with the robot motion algorithm written in Python. To resolve this, we spent some time rewriting the code from C++ to Python. This was a fairly easy process, as it mainly involved translating the code rather than changing the algorithm. Additionally, we discovered the pycreate2 library, which has wrappers for the Open Interface of the iRobot Create. We were able to convert some of the more complex motion commands that I had previously written into simple wrapper calls from this library. One such example is the safe wrapper: a single call to this function makes the robot drive in safe mode, which means it will automatically halt if it thinks it's about to crash into a wall. This can come in super handy when the robot is near the net or the walls of the tennis court.
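
For illustration, here is a minimal sketch of what driving in safe mode with pycreate2 looks like; the serial port path is a placeholder for whatever the Jetson assigns.

```python
# Minimal sketch of driving the Create 2 in safe mode via pycreate2.
from pycreate2 import Create2

bot = Create2('/dev/ttyUSB0')   # placeholder serial port
bot.start()                     # open the Open Interface
bot.safe()                      # safe mode keeps the robot's built-in safety reflexes enabled
bot.drive_direct(100, 100)      # drive straight at 100 mm/s
```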

Ishaan and I were ultimately able to get the robot to move towards a tennis ball! We have a base algorithm working where, if a tennis ball is in the camera view, the robot successfully moves towards it. If the position of the tennis ball changes along the way, the robot updates its direction and follows the ball.

In terms of my work for this upcoming week, I plan on making the algorithm more robust. Some things I plan on looking into this week are:

  • If there is no ball in the camera view, the robot needs to turn and search for a ball
    • Need to identify a good interval for this check, since we can’t have the robot spinning aimlessly.
  • Start using the depth feature of the camera so that the algorithm can work for multiple balls. The simplest algorithm I can think of is (see the sketch after this list):
    • Use the coordinates of each detected ball and go towards the one with the smallest Euclidean distance.
    • If there are too many balls on screen, do this for maybe 5 of them, pick one, and go towards it.
    • Will doing this operation be too slow? How can this be mitigated?
  • Test this algorithm somewhere that is not my living room 🙂
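
Here is a minimal sketch of the nearest-ball selection mentioned above, assuming each ball's position is already in the same coordinates as the robot's (for example, metric x/y positions derived from the depth camera). The names and the candidate limit are placeholders.

```python
# Minimal sketch of picking the closest of several detected balls by Euclidean distance.
import math

def closest_ball(robot_xy, ball_positions, max_candidates=5):
    """Consider at most max_candidates balls and return the nearest one."""
    candidates = ball_positions[:max_candidates]
    return min(
        candidates,
        key=lambda ball: math.hypot(ball[0] - robot_xy[0], ball[1] - robot_xy[1]),
    )

# Example: robot at the origin with three balls in view
print(closest_ball((0.0, 0.0), [(2.0, 1.5), (0.5, 0.8), (3.0, 0.2)]))
```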