Rashmi’s Status Update for 12/05/2020

These past two weeks have been highly productive for our team. We spent a lot of time working together testing and improving the user interface of the BallBot. To improve the appearance and make BallBot look more like a finished product, I painted the sides of the robot and designed the top, which now has a touch screen that can be used to interact with the robot. Above the screen, we mounted a 3D logo that I designed to be large enough to be visible from any distance and angle. It gives our product a unique, branded look.

In terms of integration testing, I helped my team bring the robot to the tennis courts. One issue we noticed while testing was that the Wi-Fi was unstable, which caused the program to shut off at random when the network connection dropped. This was one of the primary reasons we decided to improve the user interface: with the screen, we programmed the robot so that once it is powered on, the program can be started by tapping the executable on the screen, meaning we no longer require Wi-Fi at all. All of our testing was surprisingly successful, and I did not need to make any changes to the routing algorithm.

This next week, I will be working on putting together the final video and blog post.

Here is what BallBot looks like now:

Rashmi’s Status Report for 11/21/2020

This week I worked on minor software improvements for the BallBot. We realized that if there is no ball immediately in the BallBot’s sight, we want it to pan the tennis court and look for tennis balls. To do this, we added a feature so that if no tennis ball is in view, the robot will spin in place and try to locate one. If it sees a ball, it will start moving in that direction. If it does not see a ball after rotating once in a full circle, it will stop moving.
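The search behavior above can be sketched as a simple loop. This is an illustrative sketch, not the actual BallBot code: the `ball_visible` callback and the `SPIN_STEP_DEG` increment are assumed names standing in for the real vision check and motor commands.

```python
SPIN_STEP_DEG = 15  # assumed: rotate in small increments between detection checks

def search_for_ball(ball_visible):
    """Spin in place, checking for a ball after each step.

    `ball_visible` is a callable returning True when the vision system
    reports a tennis ball in frame. Returns the degrees rotated before
    a ball was found, or None after one full circle (robot stops).
    """
    rotated = 0
    while rotated < 360:
        if ball_visible():
            return rotated          # ball found: caller drives toward it
        rotated += SPIN_STEP_DEG    # otherwise keep turning in place
    return None                     # full circle with no ball: stop moving
```

The step size trades off search speed against the chance of sweeping past a ball between detection checks.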

Additionally, this week I helped build a basket. Because the shape of the basket was highly specialized to the BallBot, we built it using polypropylene. This basket can fit 30 tennis balls and meets the user requirements.

Since this upcoming week is Thanksgiving and our team will be out of town for the rest of the semester, I plan to work with my team to test the BallBot on the outdoor tennis courts and figure out what other software improvements need to be made. Here is a picture of the basket that I helped make this week.

Rashmi’s Status Report for 11/14/2020

This week, we integrated the hardware and software components of the robot. We wired together all of the electrical components and attached them to the robot. To do this, we designed a roof for the robot and mounted it on top of the ramp. After attaching all the components and fixing a position for the camera, we were finally ready for our integration test. After much difficulty, we were able to integrate the hardware with the software. One issue we ran into was the Wi-Fi drawing too much power from the robot: when we ran the program, the robot kept getting disconnected because of Wi-Fi issues. To mitigate this, we found that running the program in the background ensures that even if the Wi-Fi disconnects, the BallBot will continue executing the program. We also made sure the Jetson no longer needs to be connected to a display in order to boot and start running. After resolving these issues, we were able to test our robot successfully. See video below!

One thing I noticed during this integration test is that we need a way to detect or set boundaries for the robot. For example, the robot needs to know where the fence of the tennis court is located so that it can turn accordingly. Ideally, we would have used the iRobot’s sensors for this; unfortunately, the ramp covers any sensor we could have used, so we have to come up with another approach. I will be working on these software improvements this week.

Additionally, to brand our robot and give it a makeover, I designed and drew a logo and painted the robot. The paint not only makes the robot look good; with the wood finish we added on top, it also protects the wood from water damage.

 

Rashmi’s Status Update for 11/07/2020

This week I spent a lot of time with the team constructing the robot. Together, we assembled the pieces that we cut last week onto the Roomba: we mounted the arms and the ramp to the mounting plate and placed the assembly on the Roomba. The exterior of the robot is now pretty much complete.

On the software side, I worked with Ishaan and we were able to move the robot using the Jetson Nano and the battery, fully untethered. We simply taped the camera onto the front of the robot and tried to get it to follow a tennis ball, which worked pretty well. I also helped set up the motors to be controlled by the power supply.

This week, I plan on creating a platform for the camera on the robot as well as testing the algorithm with static balls, both indoors and outdoors. So far, we have not tested the robot with multiple static balls in the scene; we have only used a single ball that we moved around to make sure the robot detects it well. Depending on the results of this testing, I will decide on the next steps for improvement.

Rashmi’s Status Report for 10/31/2020

Much of this week was spent cutting wood again with my team because we weren’t able to assemble the pieces properly last week. We had found it hard to join some of the wood pieces that surround the motor because we were screwing them together at 45-degree angles. This week, we recut them and decided to glue the pieces together instead, which made the process much smoother and easier. All of the wooden pieces are now assembled; they just need to be placed onto the iRobot. We could not do that this week because we were waiting for the glue to dry, so this coming week we all plan on getting together and assembling the individual pieces onto the iRobot Create 2.

In terms of software, Ishaan and I were able to get the pyrealsense2 library installed on the Jetson after great difficulty. We were able to run the code we wrote last week on the iRobot using the Jetson alone. The robot can now detect and move towards a tennis ball using the output from the RealSense camera and the base path-planning algorithm I wrote.

This week, since all of the software parts are now integrated onto the robot, I plan on focusing on improving the planning algorithm. Specifically, I will start using the depth information so that the robot can track and plan a path for multiple balls at once.

 

Rashmi’s Status Report for 10/24/2020

This week I spent some time working with Ishaan on the software aspects of the project, as well as with the entire team on the construction of the robot.

I worked with the team on the hardware this Friday. I was able to finish cutting the parts for the robot. However, as we put them together, we realized that some of the cuts made the pieces very difficult to assemble. While we were able to piece some parts of the structure together, after severing a screw, we decided it would be best to redesign these pieces with simpler cuts.

In terms of progress with the software, Ishaan and I were unable to integrate the computer vision code written in C++ with the robot motion algorithm written in Python. To resolve this, we spent some time rewriting the C++ code in Python. This was a fairly easy process, as it mainly involved translating the code rather than changing the algorithm. Additionally, we discovered the pycreate2 library, which provides wrappers for the iRobot Create’s Open Interface. We were able to replace some of the more complex motion commands I had previously written with simple calls to wrappers from this library. One example is the safe wrapper: a single call to this function makes the robot drive in safe mode, meaning it will automatically halt if it detects it is about to crash into a wall. This can come in very handy when the robot is near the net or the walls of the tennis court.

Ishaan and I were ultimately able to get the robot to move towards a tennis ball! We have a base algorithm working: if a tennis ball exists in the camera view, the robot will move towards it successfully. If the position of the tennis ball changes along the way, the robot will update its direction and follow the ball.
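The core of "move towards the ball and correct as it moves" is steering by the ball's horizontal offset in the camera frame. Here is a minimal sketch of that idea; the function name, base speed, and gain are my illustrative assumptions, not the project's actual values.

```python
BASE_SPEED = 150   # assumed forward speed in mm/s (iRobot drive units)
TURN_GAIN = 0.5    # assumed gain: how strongly the offset adjusts the wheels

def wheel_speeds(ball_x, frame_width):
    """Return (left, right) wheel velocities that turn toward the ball.

    `ball_x` is the ball's horizontal pixel position in the camera frame.
    """
    # Normalized offset in [-1, 1]: negative when the ball is left of center.
    offset = (ball_x - frame_width / 2) / (frame_width / 2)
    correction = int(TURN_GAIN * BASE_SPEED * offset)
    # Ball to the right (positive offset): speed up the left wheel to turn right.
    return BASE_SPEED + correction, BASE_SPEED - correction
```

Recomputing this on every camera frame is what makes the robot track a ball that moves mid-approach.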

In terms of my work for this upcoming week, I plan on making the algorithm more robust. Some things I plan on looking into this week are:

  • If there is no ball in the camera view, the robot needs to turn and search for a ball
    • Need to identify a good interval to do this check at since we can’t have the robot spinning aimlessly.
  • Start using the depth feature of the camera so that the algorithm can work for multiple balls. The simplest algorithm that I can do is:
    • Use the coordinates of the objects and go towards the one with the smallest Euclidean distance.
    • If there are too many balls on the screen, then do this process for maybe 5 balls and pick one and go towards it.
    • Will doing this operation be too slow? How can this be mitigated?
  • Test this algorithm somewhere that is not my living room 🙂
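The simplest multi-ball policy in the list above can be sketched as follows. This is only a sketch of the idea, assuming each detection arrives as camera-space `(x, y, depth)` coordinates; the names and the candidate cap are placeholders.

```python
import math

MAX_CANDIDATES = 5  # bound the per-frame work, per the "maybe 5 balls" idea

def pick_nearest_ball(balls):
    """Given (x, y, depth) coordinates for each detected ball, return
    the one with the smallest Euclidean distance from the camera.

    Returns None when no balls are in view, so the caller can fall
    back to the spin-and-search behavior.
    """
    if not balls:
        return None
    candidates = balls[:MAX_CANDIDATES]
    return min(candidates, key=lambda b: math.sqrt(b[0]**2 + b[1]**2 + b[2]**2))
```

Since ranking is linear in the number of candidates, capping them keeps the per-frame cost small, which speaks to the "will this be too slow?" question.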

Rashmi’s Status Update for 10/17/2020

This week, I spent some time fine-tuning the iRobot’s moving mechanisms so that once we get the outputs from the computer vision, we will have more fine-grained control of the robot’s motion. At this point, I believe most of the motion planning code for the robot has been written and tested.

Additionally, this week, I deviated a little from my usual tasks and took some time to help Ryan build the frame for the robot. We dimensioned the pieces to make the ramp and hold the motors in place. Then, using the saws in the woodshop in the Makerspace, we cut the majority of the pieces. Now we have all the pieces for the robot frame; they just need to be assembled next week.

Finally, my team and I spent a lot of time writing the design report detailing the specifics of the BallBot.

Next week, I will be working more closely with Ishaan to integrate the computer vision with the robot. This might be a little challenging, considering the computer vision was done in C++ and the iRobot’s motion was coded in Python. In the meantime, I am looking into pybind and other tools that can bridge C++ and Python code. This will also be the first time we are combining two major components of this project, and we expect to run into a lot of difficulties. However, if we can get this to work, it will be a major milestone in our process of building BallBot.

Rashmi’s Status Report for 10/10/2020

As part of my individual work for this project, this week I spent some time applying the knowledge I acquired last week on programming the iRobot Create using opcodes. I wrote some code in Python with Tkinter that uses keyboard input to control the robot. This was challenging initially, as I didn’t know how to use my laptop to interface with the robot. However, after figuring out how to establish a connection with the iRobot, the Open Interface API [Open Interface Link] was not too difficult to use. The part that took the most time was figuring out how to change the velocities of the left and right wheels when the robot’s direction of movement was changing.

Now, using the keyboard, I can command the iRobot to go forward, backward, turn, and beep! Running my code creates a popup that shows all the opcodes being sent when using the hotkeys. This feature will make it easier to debug when interfacing with the RealSense camera and the Jetson Nano. Next week, I plan on using the Jetson Nano to interface with the iRobot instead of keeping the robot tethered to my laptop.
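To give a flavor of what those opcodes look like, here is a sketch of how keyboard commands might map to Create 2 Open Interface packets. Per the Open Interface spec, the Drive command is opcode 137 followed by a 16-bit velocity (mm/s) and a 16-bit radius (mm), both big-endian two's complement; the key bindings and speeds below are illustrative, not my actual code.

```python
import struct

STRAIGHT = 32767   # special radius value meaning "drive straight"
SPIN_CW = -1       # special radius value meaning "turn in place clockwise"

def drive_packet(velocity, radius):
    """Serialize a Drive command: opcode byte + two signed 16-bit arguments."""
    return struct.pack(">Bhh", 137, velocity, radius)

# Hypothetical hotkey bindings: each key maps to the byte string that
# would be written to the robot's serial port.
KEY_BINDINGS = {
    "Up":    drive_packet(200, STRAIGHT),    # forward at 200 mm/s
    "Down":  drive_packet(-200, STRAIGHT),   # backward at 200 mm/s
    "Right": drive_packet(100, SPIN_CW),     # spin in place clockwise
}
```

Printing these packets as they are sent is essentially what the debug popup shows.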

Rashmi’s Status Report for 10/3/2020

This week, most of my time was spent finalizing and ordering parts with my teammates. While most of this process was straightforward, we did run into some issues ordering the iRobot: it turns out this is a popular item this year and was sold out pretty much everywhere. However, we were able to find a vendor, and the issue was resolved.

While waiting for the parts to arrive, I also spent some time learning about the iRobot Create 2.

https://cdn-shop.adafruit.com/datasheets/create_2_Open_Interface_Spec.pdf

This link is the iRobot programming guide I have been reading this past week. I learned how to control the driving direction, speed, and turning angle of the robot. The guide also covers how to read input from the sensors; however, we do not plan to use that feature in our final design.

Additionally, I helped my teammates with the CV aspect of this project. It took a while to get OpenCV set up on my local machine and to learn the OpenCV API for C++. However, we were able to create a basic program that masks out objects within a specified color range.
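The masking step can be illustrated with plain NumPy: keep only the pixels whose channels all fall inside a lower/upper bound (the same operation our C++ OpenCV program performed with its range-masking call). The bounds below are placeholders, not our tuned tennis-ball range.

```python
import numpy as np

def mask_color(image, lower, upper):
    """Return a binary mask (255/0) where every channel of a pixel
    lies within [lower, upper], channel-wise."""
    lower = np.asarray(lower)
    upper = np.asarray(upper)
    in_range = (image >= lower) & (image <= upper)   # per-channel tests
    return in_range.all(axis=-1).astype(np.uint8) * 255
```

In practice this is run on an HSV-converted frame so the hue bound isolates the ball's yellow-green regardless of lighting.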

The iRobot arrived today!! So this week I will be testing out my newly acquired knowledge and trying to get the robot to move around.