Team Status Report for 10/9

This week our team discussed and finalized the hardware components. We composed a bill of materials. With the Intel RealSense taking up almost half the budget, our current bill of materials would exceed the limit. We communicated this challenge to the professors and greatly appreciated their understanding, as well as Tao’s kindness in lending us his camera to experiment with.

To satisfy the current draw requirements, we chose the BTS7960 H-bridge motor driver (https://www.amazon.com/BTS7960-Stepper-H-Bridge-Compatible-Raspberry/dp/B098X4SJR8/ref=sr_1_1?dchild=1&keywords=BTS7960%20DC&qid=1633569262&sr=8-1), since the motors draw up to 8.5 A at maximum. Unlike the SparkFun L298N motor driver, only one motor can be connected to each driver instead of two, so we would need six motor drivers in total. We analyzed the pin layout of the Arduino and the motor drivers and realized that each motor driver requires 4 digital pins and 2 PWM pins connected to the Arduino. Since each Arduino has 6 PWM pins and 14 digital pins, we would need at least 2 Arduino boards to connect all of our components. Conveniently, the Jetson Xavier has two 5 V USB outputs, which can connect to at most two Arduino boards. We finalized our selection of the battery as well. Our design should meet the technical and physical requirements, and we are ready to compose our design report, which is due next week.
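For reference, here is a minimal sketch of the pin budget using the counts above; the per-driver pin breakdown is still an assumption until we wire up the BTS7960 boards:

```python
import math

# Pin-budget check based on the counts quoted above (assumed, not yet verified on hardware).
drivers = 6                 # one BTS7960 driver per motor
pwm_per_driver = 2          # PWM inputs per driver
digital_per_driver = 4      # other digital control pins per driver

arduino_pwm_pins = 6        # PWM-capable pins on an Arduino Uno
arduino_digital_pins = 14   # digital pins on an Arduino Uno

pwm_needed = drivers * pwm_per_driver          # 12
digital_needed = drivers * digital_per_driver  # 24

# PWM is the binding constraint: ceil(12 / 6) = 2 boards.
boards_needed = math.ceil(pwm_needed / arduino_pwm_pins)
print(f"PWM pins needed: {pwm_needed}, digital pins needed: {digital_needed}")
print(f"Arduino boards required (by PWM count): {boards_needed}")
```

Two boards also line up with the two USB ports available on the Jetson Xavier.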

Bhumika Kapur’s Status Report for 10/9

This week I worked on a few tasks. First, I worked with my teammates to complete the design presentation. We had to reevaluate some of our requirements and solution approaches for the presentation. We also discussed our solution approach in detail and researched how each component would connect to other components.

I also received the Intel RealSense camera from Tao this week and spent some time trying to set it up. I was able to connect it to my laptop, and, using a VM, I can access the camera. The next step is to set up the camera in the Intel RealSense Viewer. I have been attempting to get the camera working in the Viewer, but it crashes whenever I open it, so I will need to debug that this week.
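If the Viewer keeps crashing, one fallback I may try is a short pyrealsense2 script (assuming the librealsense Python bindings install cleanly in the VM) to confirm that the camera actually streams:

```python
import pyrealsense2 as rs

# Minimal streaming check: start the depth stream and read a few frames.
pipeline = rs.pipeline()
config = rs.config()
config.enable_stream(rs.stream.depth, 640, 480, rs.format.z16, 30)

pipeline.start(config)
try:
    for _ in range(30):
        frames = pipeline.wait_for_frames()
        depth = frames.get_depth_frame()
        if depth:
            # Distance (in meters) at the center pixel.
            print("Center depth:", depth.get_distance(320, 240))
finally:
    pipeline.stop()
```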

I also worked on the AprilTag detection code. As I mentioned last week, the AprilTag code is originally written in Java/C, so I followed some online tutorials in Python, the language I am most familiar with, to get the software set up. Currently I am able to draw a bounding box around the AprilTag, detect the specific family the tag is from, and output the exact coordinates of the tag in the image. My next steps are to combine this with the data I would receive from the camera to output the exact location of the AprilTag so the robot can navigate to it. I hope to do that next week.
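For reference, a minimal version of the detection step looks roughly like the sketch below; this assumes the pupil-apriltags package and OpenCV, which may differ from the exact tutorial I followed, and the test image name is hypothetical:

```python
import cv2
import numpy as np
from pupil_apriltags import Detector

# Load a test image and convert to grayscale for detection.
image = cv2.imread("shelf.jpg")  # hypothetical test image
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)

detector = Detector(families="tag36h11")
detections = detector.detect(gray)

for det in detections:
    # Draw the bounding box around the tag corners.
    corners = det.corners.astype(np.int32).reshape(-1, 1, 2)
    cv2.polylines(image, [corners], isClosed=True, color=(0, 255, 0), thickness=2)
    # Report the tag family, ID, and center coordinates in the image.
    cx, cy = det.center
    print(f"family={det.tag_family}, id={det.tag_id}, center=({cx:.1f}, {cy:.1f})")

cv2.imwrite("detected.jpg", image)
```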

Example image/output: [AprilTag detection screenshot]

Esther Jang’s Status Report for 10/2

This week, I spent most of my time doing CAD in SolidWorks and researching design options.

I first considered options for our linear actuation system. Given the height requirement of our robot, I thought it would be best to use a linear motion kit and decided on the pulley system described here: https://docs.revrobotics.com/15mm/linear-motion-kit. The primary motivation behind this selection was that it is built to sustain a multi-stage cascading lift that can reach our height requirement of 3 ft. Furthermore, it is very mounting-friendly for other parts (such as our claw or drivetrain) and reasonably priced. This is better than the alternatives, which are often significantly more expensive or cannot meet our height requirement (the most common problem). I assembled the CAD for this part using the standard part library from the vendor. The process was time-consuming, as I had to relearn how to assemble parts in CAD, but I believe it was valuable experience that will help me CAD the rest of my part of the project.

Linear slide system CAD render (based on https://docs.revrobotics.com/15mm/linear-motion-kit/three-stage-cascading-lift)

I also invested several hours into checking the viability of a vacuum suction gripper, following feedback from a recent meeting to consider it. A vacuum suction gripper was an appealing idea because it reduces room for error and emulates current industry solutions to our problem (e.g. https://www.youtube.com/watch?v=zeKfvUVbO3g&t=4s&ab_channel=IAMRobotics). The main issues I ran into in my research were the following:

  • Limited information about the performance of DIY suction gripper kits: Most vendors make claims about performance (e.g. how much load the system can carry), and this was the only basis we could make a decision on. I was unable to find solid, unbiased research to support these numbers quantitatively, aside from anecdotal videos of performance.
  • Limited availability of DIY suction gripper kits: Almost all DIY builds I could find used some version of a standardized suction gripper kit that seems to be sold only by Chinese vendors, with an estimated shipping time of one month, arriving in early November. There was also a single vendor on RobotShop selling a potentially promising suction gripper system that uses a syringe+servo mechanism instead of an air pump. The description claimed a load capacity of 3 oz (about 0.19 lb), which could be too weak for our needs.

Overall, the vacuum suction idea ended up being too high risk given its limited availability, so we decided to play it safe and abandon the idea in favor of a claw.

Currently, I plan to use a servo-powered claw system similar to off-the-shelf standard grippers (such as https://www.robotshop.com/en/actobotics-perpendicular-standard-gripper-kit-b.html). However, the reasonably priced off-the-shelf parts do not seem to meet our requirements for the objects to grab (a maximum grip size of 4 in and unknown, but likely very light, weights). My current research indicates that CADing and laser cutting the claw ourselves is likely the right direction, so I will continue to pursue it (e.g. https://imgur.com/gallery/LpyW3).

I believe I am on track, as my design research was thorough in making sure the chosen designs meet our requirements and are reasonably priced. I will finalize the claw idea and expand upon last week’s electronic hardware evaluation by early tomorrow (in time for the design presentation).

Ludi Cao’s Status Report for 10/2

This week I finalized the main hardware structure for our robot. Esther and I agreed to use components from the REV Robotics vendor, because the components are compatible with one another and we would not have to worry about ordering inconsistent sizes or models from different suppliers. In addition, REV Robotics has well-maintained documentation and CAD models for each part, which makes designing the chassis much more convenient. Since our robot would be 3 ft tall, I plan to design the chassis with a frame of roughly 60 cm x 60 cm, along with 90 mm diameter omni wheels. However, I haven’t done an extensive weight/force calculation yet to gauge the appropriate frame size. I will work out the details next week leading up to the design report, so this may well change.

Most of my progress this week has been prototyping and CADing the chassis frame. At first, I decided to use motors from other vendors, but realized that finding an adequate motor mount that fits the other components in the REV Robotics kit is very difficult. I also learned the importance of designing before building. While assembling the parts in CAD, I realized that the screw heads on the motor mount would interfere with the wheels, because the motor shaft is too short for multiple components to fit on. This is something I might not have caught if I had gone straight into implementation, especially since I was following the steps listed in the documentation. After choosing a new pair of mounts and avoiding the use of some pins, the issue was fortunately resolved.


Attached is the CAD for the wheel chassis so far. I haven’t included some of the screws yet, but the overall design is not affected. I am using the Jetson Nano here in place of the Jetson Xavier since I couldn’t find CAD models for the latter, but since the overall dimensions of the two are the same, it works as a reference. The back of the robot holds a counterweight that helps balance the robot, since the claw will extend forward to reach an object. I haven’t decided what weights to use yet; this will be determined through calculations of the force relationships as well as experiments. I still need to include the motor drivers, the H-bridges, and the connections to the motors for this prototype to be finished, which I will continue to work on next week. I will start ordering components next week and do more quantitative analysis of our system design, especially the force/torque and power requirements/limits; these results will feed into our design report. If I find time, I will start researching the programming side of motor control, or start building the chassis if parts come in early enough and the design feedback is positive so that not much needs to change.
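As a placeholder for the force calculation mentioned above, a simple static torque balance about the front wheel contact could look like the sketch below; all numbers are hypothetical until we weigh the real parts:

```python
# Static tipping check about the front wheel contact point.
# All values are hypothetical placeholders until we weigh the actual parts.
g = 9.81                     # m/s^2

payload_mass = 0.5           # kg, object held by the claw (assumed light)
claw_reach = 0.40            # m, horizontal distance of claw ahead of the front wheels

counterweight_mass = 2.0     # kg, weight mounted at the back of the chassis
counterweight_arm = 0.30     # m, distance of counterweight behind the front wheels

tipping_torque = payload_mass * g * claw_reach            # torque pulling the robot forward
restoring_torque = counterweight_mass * g * counterweight_arm

print(f"Tipping torque:   {tipping_torque:.2f} N*m")
print(f"Restoring torque: {restoring_torque:.2f} N*m")
# Conservative check that ignores the chassis's own weight, which also helps stability.
print("Stable (ignoring chassis weight):", restoring_torque > tipping_torque)
```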

Bhumika Kapur’s Status Report for 10/2

This week I continued my research into the computer vision component of our project. After speaking with Tao this week, our team decided to use the Jetson Xavier NX rather than the Jetson Nano, because the Xavier NX has 8 GB of memory compared to the Jetson Nano’s 2 GB. So, I researched cameras that would be compatible with this new component. I looked into the camera in the ECE inventory list that is made specifically for the Xavier NX, but it has a very low FOV (only 75 degrees). So, I did more research into the SainSmart IMX219; it appears to be compatible with the Xavier NX as well and has a higher FOV (160 degrees), so we will most likely go ahead with this camera.

I also talked about depth sensing with Tao, and he suggested that we look into stereo cameras if we want a camera with depth sensing capabilities. Unfortunately, many of the stereo cameras compatible with the Xavier NX are quite expensive or sold out. So, I looked into the idea of using a distance sensor, such as an ultrasonic sensor from Adafruit, along with a camera to properly navigate the robot.
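If we do go with an ultrasonic sensor, a rough distance-reading sketch (assuming an HC-SR04-style sensor on the Jetson’s GPIO header with the Jetson.GPIO library; the pin numbers are placeholders) might look like:

```python
import time
import Jetson.GPIO as GPIO

TRIG = 11  # placeholder board pin for the trigger line
ECHO = 13  # placeholder board pin for the echo line
# Note: an HC-SR04 echo pin outputs 5 V, so a voltage divider or level shifter is assumed.

GPIO.setmode(GPIO.BOARD)
GPIO.setup(TRIG, GPIO.OUT)
GPIO.setup(ECHO, GPIO.IN)

def read_distance_cm():
    # Send a ~10 microsecond trigger pulse.
    GPIO.output(TRIG, GPIO.HIGH)
    time.sleep(10e-6)
    GPIO.output(TRIG, GPIO.LOW)

    # Measure how long the echo line stays high.
    start = end = time.time()
    while GPIO.input(ECHO) == GPIO.LOW:
        start = time.time()
    while GPIO.input(ECHO) == GPIO.HIGH:
        end = time.time()

    # Sound travels ~34300 cm/s; divide by 2 for the round trip.
    return (end - start) * 34300 / 2

try:
    while True:
        print(f"Distance: {read_distance_cm():.1f} cm")
        time.sleep(0.5)
finally:
    GPIO.cleanup()
```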

This week I also looked into the AprilTag detection software, which is originally written in C and Java, but I found a Python tutorial that I followed. I don’t have the software fully operating yet, but I hope it will be functional by next week. I also worked on the design presentation with my team, and we discussed our progress and project updates.

Next week I plan to continue working on the AprilTag detection code and try to set up the camera from the ECE inventory with the Jetson Xavier NX.

Team Status Report for 10/2

This week each team member individually researched the tasks they were assigned. Ludi and Esther also worked on CADing the design for the chassis and the claw of our robot. We met with Tamal and Tao on Wednesday and received feedback on our proposal presentation. A concept that we discussed in depth was using a vacuum grip system rather than a claw for retrieving the object. Esther did a lot of research into this idea, and the team decided that we will proceed with the claw, as we are still very unsure about the vacuum grip design.

We also worked on the design presentation and used the feedback we received last week to update some of our requirements. We discussed our current requirements, such as the distance between objects, height of the shelf, and accuracy measurements, and updated those to reflect the needs that arise from our use case.