Team Status Update for 10/17/2020

This week we made steady progress as a team. While nothing major happened, each of us advanced our own area, which moved the whole project forward. On the build side, we acquired the wood and found time to cut it; with the pieces cut, all that remains is assembling the frame for the ramp and wheel mechanism. On the motion side, we now have a robot that moves on command in the direction we want. The motion is fine-grained, so we should be able to steer the robot toward a target. Finally, on the computer vision side, we tested the algorithm outdoors and tuned the parameters so that ball detection under varying lighting is no longer an issue.

As a team, we met once this week to update each other on our individual progress and to work on the design report. We will continue working on the report this weekend and finish it up.

Schedule Update

We are now at the stage where we need to start integrating some of our individual contributions to this project. The computer vision and the iRobot motion are at a point where they can be combined. We are still mostly on track with our planned schedule, though we hope to pick up the pace on building the arms of the robot to ensure we don't fall behind. One possible risk is that the ramp-and-wheel contraption fails to suck up tennis balls reliably. We hope to reach the testing phase for that part of the project soon so that we can fine-tune the ball pickup system.

Rashmi’s Status Update for 10/17/2020

This week, I spent some time fine-tuning the iRobot's movement routines so that once we get outputs from the computer vision, we will have finer-grained control of the robot's motion. At this point, I believe most of the motion planning code for the robot has been written and tested.

Additionally, this week I deviated a little from my usual tasks and took some time to help Ryan build the frame for the robot. We dimensioned the pieces for the ramp and the motor mounts, then cut most of them using the saws in the Makerspace woodshop. We now have all the pieces for the robot frame; it just needs to be assembled next week.

Finally, my team and I spent a lot of time writing the design report detailing the specifics of the BallBot.

Next week, I will be working more closely with Ishaan to integrate the computer vision with the robot. This might be challenging, since the computer vision was written in C++ while the iRobot's motion was coded in Python. In the meantime, I am looking into pybind11 and other tools for bridging C++ and Python. This will also be the first time we combine two major components of this project, and we expect to run into plenty of difficulties. However, if we can get this working, it will be a major milestone in our process of building BallBot.
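As a first integration step, the plan is roughly to turn the detected ball's horizontal offset in the camera frame into differential wheel velocities. A minimal Python sketch of that mapping (the frame width, base speed, and gain here are illustrative placeholders, not our tuned values):

```python
# Hypothetical sketch: steer toward a detected ball by mapping its
# horizontal pixel offset to differential wheel velocities (mm/s).
# All constants below are illustrative assumptions, not tuned values.

FRAME_WIDTH = 640   # assumed RealSense frame width in pixels
BASE_SPEED = 150    # assumed forward speed, mm/s
TURN_GAIN = 0.5     # assumed proportional gain, mm/s per pixel of error
MAX_SPEED = 500     # Create 2 wheel velocity limit, mm/s

def wheel_velocities(ball_x):
    """Return (left, right) wheel velocities steering toward the ball."""
    error = ball_x - FRAME_WIDTH // 2   # positive => ball is to the right
    turn = TURN_GAIN * error
    left = max(-MAX_SPEED, min(MAX_SPEED, BASE_SPEED + turn))
    right = max(-MAX_SPEED, min(MAX_SPEED, BASE_SPEED - turn))
    return int(left), int(right)
```

A ball centered in the frame yields equal wheel speeds; a ball off to one side slows the near wheel and speeds the far one.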

Rashmi’s Status Report for 10/10/2020

As part of my individual work for this project, this week I spent some time applying the knowledge I acquired last week on programming the iRobot Create using opcodes. I wrote some Python and Tkinter code that uses keyboard input to control the robot. This was challenging at first because I didn't know how to interface my laptop with the robot. However, after figuring out how to establish a connection with the iRobot, the Open Interface API [Open Interface Link] was not too difficult to figure out. The part that took the most time was working out how to change the velocities of the left and right wheels as the robot's direction of movement changed.

Now, using the keyboard, I can command the iRobot to go forward, go backward, turn, and beep! Running my code opens a popup that shows every opcode being sent as the hotkeys are pressed. This feature will make it easier to debug when interfacing with the RealSense camera and the Jetson Nano. Next week, I plan to drive the iRobot from the Jetson Nano instead of keeping the robot tethered to my laptop.
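For reference, the Open Interface's Drive Direct command is opcode 145 followed by the right and left wheel velocities as signed 16-bit big-endian values in mm/s. A minimal sketch of packing that packet (the serial port name in the comment is an assumption about our setup):

```python
import struct

DRIVE_DIRECT = 145  # Create 2 Open Interface opcode

def drive_direct_packet(right_mm_s, left_mm_s):
    """Pack a Drive Direct command: opcode 145, then the right and left
    wheel velocities as signed 16-bit big-endian values (mm/s), clamped
    to the spec's -500..500 range."""
    clamp = lambda v: max(-500, min(500, v))
    return struct.pack('>Bhh', DRIVE_DIRECT,
                       clamp(right_mm_s), clamp(left_mm_s))

# The bytes then go over the serial link, e.g. with pyserial
# (the port name is an assumption):
#   import serial
#   ser = serial.Serial('/dev/ttyUSB0', 115200)
#   ser.write(bytes([128, 132]))              # Start, then Full mode
#   ser.write(drive_direct_packet(200, 200))  # forward at 200 mm/s
```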

Rashmi’s Status Report for 10/3/2020

This week, most of my time was spent finalizing and ordering parts along with my teammates. While most of this process was straightforward, we ran into some issues ordering the iRobot: it turns out to be a popular item this year and was sold out almost everywhere. However, we were able to find a vendor and the issue was resolved.

While waiting for the parts to arrive, I also spent some time learning about the iRobot Create 2.

https://cdn-shop.adafruit.com/datasheets/create_2_Open_Interface_Spec.pdf

This is the iRobot programming guide I have been reading this past week. I learned how to control the robot's driving direction, speed, and turning radius. The guide also covers how to read input from the sensors, though we do not plan to use that feature in our final design.
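For example, the guide's Drive command (opcode 137) takes a velocity and a turning radius, each as a signed 16-bit big-endian value, with a few special radius codes for driving straight or spinning in place. A minimal sketch of packing it:

```python
import struct

DRIVE = 137  # Create 2 Open Interface opcode

# Special radius values from the Open Interface spec:
STRAIGHT = 0x7FFF   # drive straight
TURN_CW = -1        # spin clockwise in place
TURN_CCW = 1        # spin counter-clockwise in place

def drive_packet(velocity_mm_s, radius_mm):
    """Pack a Drive command: opcode 137, then the velocity (mm/s) and
    turning radius (mm) as signed 16-bit big-endian values."""
    return struct.pack('>Bhh', DRIVE, velocity_mm_s, radius_mm)
```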

Additionally, I helped my teammates with the CV aspect of this project. It took a while to get OpenCV set up on my local machine and learn the OpenCV C++ API, but we were able to create a basic program that masks out objects within a specified color range.

The iRobot arrived today!! So this week I will be testing my newly acquired knowledge and trying to get the robot to move around.

Team Status Update for 10/3/2020

This week, we received feedback on our project proposal. Our design, technical, and user requirements were deemed very detailed, and our project was declared feasible. Luckily, we did not have to make any changes to the existing design of the system. All the parts have been ordered and should arrive in the next few days.

While waiting for parts, we worked on the computer vision aspects of the project this week. We started learning the OpenCV C++ interface and created a basic algorithm that masks out pixels falling outside a given color range in HSV space. We were also able to connect to the Intel RealSense camera and read frames from it for use in the computer vision algorithm. As we continue to build this algorithm in the coming weeks, we anticipate the challenge of making the computer vision work in different lighting conditions. While we anticipated this issue, we did not realize just how sensitive the image colors would be to lighting: the same objects under different lighting had very different colors. In the upcoming week, we hope to test the algorithm in outdoor lighting to see whether the sensitivity is just as severe. If it is, we may have to tweak the color parameters in the algorithm so that lighting does not have much of an effect on the detection of tennis balls.
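The core of the masking step is just a per-pixel range check in HSV space. Our implementation is in C++ with OpenCV (whose inRange() applies this test to a whole frame at once), but the idea can be sketched in plain Python; the HSV bounds below are placeholder values, not our tuned tennis-ball range:

```python
# Illustrative sketch of HSV color-range masking. The bounds are
# placeholder assumptions, not our tuned parameters; in the real
# pipeline OpenCV's inRange() does this vectorized over the frame.

LOWER = (25, 80, 80)    # assumed (H, S, V) lower bound
UPPER = (45, 255, 255)  # assumed (H, S, V) upper bound

def in_range(pixel, lower=LOWER, upper=UPPER):
    """Return True if an (H, S, V) pixel falls inside the color range."""
    return all(lo <= p <= hi for p, lo, hi in zip(pixel, lower, upper))

def mask(image):
    """Binary mask: 255 where the pixel is in range, 0 elsewhere."""
    return [[255 if in_range(px) else 0 for px in row] for row in image]
```

Tuning for lighting then amounts to widening or shifting these bounds, since the value (brightness) channel in particular swings with illumination.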

Schedule Update

Even though the parts were ordered slightly later than anticipated, thanks to hard work from the entire team we are right on track with, or even slightly ahead of, the planned schedule. This gives us a little more slack in the upcoming weeks in case our parts arrive later than expected.