Bhumika Kapur’s Status Report for 10/2

This week I continued my research into the computer vision component of our project. After speaking with Tao this week, our team decided to use the Jetson Xavier NX rather than the Jetson Nano, because the Xavier NX has 8GB of memory compared to the Jetson Nano's 2GB. I then researched cameras that would be compatible with this new component. I looked into the camera in the ECE inventory list that is made specifically for the Xavier NX, but it has a very narrow FOV (only 75 degrees). So, I did more research into the SainSmart IMX219, which appears to be compatible with the Xavier NX as well and has a wider FOV (160 degrees), so we will most likely go ahead with that camera.

I also talked about depth sensing with Tao, and he suggested that we look into stereo cameras if we want a camera with depth-sensing capabilities. Unfortunately, many of the stereo cameras compatible with the Xavier NX are quite expensive or sold out. So, I looked into using a distance sensor, such as an ultrasonic sensor from Adafruit, alongside a camera to properly navigate the robot.
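If we do go the ultrasonic route, the math for turning an echo pulse into a distance is simple. A minimal sketch in Python (the function name and sensor model are illustrative on my part, not tied to a specific Adafruit part):

```python
SPEED_OF_SOUND_CM_S = 34300  # speed of sound in air at ~20 C, in cm/s


def echo_to_distance_cm(echo_seconds: float) -> float:
    """Convert a round-trip echo pulse duration into a one-way distance in cm.

    An ultrasonic sensor reports the time for the ping to travel to the
    obstacle and back, so we halve the total distance traveled.
    """
    return echo_seconds * SPEED_OF_SOUND_CM_S / 2
```

For example, a 2 ms round-trip echo corresponds to roughly 34 cm of clearance, which is the scale of distance we would care about when approaching the shelf.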

This week I also looked into the AprilTag detection software, which was originally written in C and Java, but I found and followed a Python tutorial. I don't have the software fully working yet, but I hope it will be functional by next week. I also worked on the design presentation with my team, and we discussed our progress and project updates.
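Even before the detector is fully working, the way we would likely use a detection is straightforward: compare the detected tag's pixel center to the image center and steer to reduce the offset. A hypothetical sketch (the function name and deadband value are my own illustrative choices; the actual AprilTag library's output format may differ):

```python
def steering_from_tag(center_x: float, image_width: int, deadband: float = 0.05) -> str:
    """Map a detected tag's horizontal pixel center to a coarse steering command.

    center_x: x coordinate of the tag center in pixels.
    image_width: frame width in pixels.
    deadband: fraction of the half-width treated as "centered enough".
    """
    # Normalized horizontal offset in [-1, 1]: negative means tag is left of center.
    offset = (center_x - image_width / 2) / (image_width / 2)
    if offset < -deadband:
        return "left"
    if offset > deadband:
        return "right"
    return "forward"
```

A real controller would use a proportional correction rather than three discrete commands, but this captures the navigation idea.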

Next week I plan to continue working on the AprilTag detection code, and to try to set up the camera from the ECE inventory with the Jetson Xavier NX.

Team Status Report for 10/2

This week each team member individually researched the tasks they were assigned. Ludi and Esther also worked on CADing the design for the chassis and the claw of our robot. We met with Tamal and Tao on Wednesday and received feedback on our proposal presentation. A concept we discussed in depth was using a vacuum grip system rather than a claw for retrieving the object. Esther did a lot of research into this idea, and the team decided to proceed with the claw, as we are still very unsure about the vacuum grip design.

We also worked on the design presentation and used the feedback we received last week to update some of our requirements. We discussed our current requirements, such as the distance between objects, height of the shelf, and accuracy measurements, and updated those to reflect the needs that arise from our use case.

Ludi Cao’s Status Report for 09/25

This past week I did more research into the chassis design and motor requirements of the robot. I looked into the energy and cost tradeoffs between various stepper and servo motors, and decided that the NEMA 17HS4401 bipolar stepper motor seems to be an efficient option. Given that our robot should not carry much weight (around 7-8 kg), its holding torque of 40 N·cm should be sufficient, since motor accuracy is not as critical as computer vision detection and recognition. The proposal feedback mentioned the possibility of eliminating omnidirectional wheels from the design. The team still thinks that omnidirectional wheels are the easier option, since our robot needs lateral movement to scan along the shelf as well as forward and backward movement to approach the shelf and retrieve an item. After some research, I found that most similar projects use two L298N motor drivers, where each driver connects to two motors and is controlled by the Arduino board. Since the focus of our project is to accurately detect items on the shelf, I optimistically believe for now that if there is high latency in the communication between the Jetson Nano and the Arduino, it would not be of great concern.
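For reference, turning a desired travel distance into a step count for a 1.8-degree stepper like the NEMA 17 is simple arithmetic. A small sketch (this assumes full stepping and a direct-drive wheel, both of which are assumptions on my part):

```python
import math

STEPS_PER_REV = 200  # a 1.8-degree stepper takes 200 full steps per revolution


def steps_for_distance(distance_cm: float, wheel_diameter_cm: float) -> int:
    """Number of full steps needed to roll a wheel a given linear distance.

    Ignores microstepping and any gearing between motor and wheel.
    """
    circumference = math.pi * wheel_diameter_cm
    return round(distance_cm / circumference * STEPS_PER_REV)
```

With microstepping on the driver, the effective steps per revolution would scale up accordingly.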

I am on schedule for this week. After discussing my research with my teammates, I will order parts on Monday and start CADing the chassis. I will also begin to look into the battery requirements once the other significant components of our robot are finalized.

Bhumika Kapur’s Status Report for 9/25

This week I worked on the proposal presentation. I worked with my group members to complete the proposal slides, and we did some research together into the components and algorithms we are going to use in order to quantify some of our testing metrics and requirements. I presented for my team this week, so I spent a few days preparing for the presentation and getting familiar with the content.

I believe that I and the team are following our schedule well. Based on the Gantt chart we created last week, our team is in the research phase for the components and software we plan to use in our project. This past week I did some research into cameras, starting with those already in the ECE inventory. Some of the cameras in the inventory seem to be satisfactory for our project, such as the SainSmart IMX219, which has a 160-degree wide-angle lens. However, we may have to purchase a more expensive camera, such as the Intel RealSense, for depth sensing, as I could not find any depth-sensing information for the cameras in the inventory. I also did research into edge detection algorithms, and found that Canny edge detection seems to be the most effective (around 95% in the sources I read) and thus the one we should use for our project. We also tested some images with Canny edge detection and found that it ran quite fast, in only a few seconds (see below). This week I plan to continue researching cameras and algorithms, and, if I have time, to start writing the code for some of the computer vision algorithms we plan to use in our project.
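For intuition, the core of any edge detector is the image gradient; Canny then adds Gaussian smoothing, non-maximum suppression, and hysteresis thresholding on top. A toy sketch of just the gradient-magnitude step in plain Python (in practice we would call OpenCV's `cv2.Canny` rather than anything hand-rolled):

```python
def gradient_magnitude(img):
    """Central-difference gradient magnitude for a 2D grayscale image.

    img is a list of equal-length rows of pixel intensities. Border pixels
    are left at zero. This is only the gradient step that edge detectors
    like Canny build on, not a full edge detector.
    """
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = (img[y][x + 1] - img[y][x - 1]) / 2  # horizontal gradient
            gy = (img[y + 1][x] - img[y - 1][x]) / 2  # vertical gradient
            out[y][x] = (gx * gx + gy * gy) ** 0.5
    return out
```

On a vertical step edge (a row like 0, 0, 255, 255), the magnitude peaks at the pixels straddling the intensity jump, which is exactly where Canny would mark an edge.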

Team Status Report for 9/25

This past week, we had project proposal presentations during class time. The team finalized the presentation slides on Sunday, and Bhumika presented for the team on Wednesday. It was very interesting to hear other groups' projects, especially as a reference for improving some parts of our proposal that we could have done better.

We also received some feedback and realized that we need to put more consideration into our requirements. The team mostly did separate research on their respective components this week and will reconvene sometime this upcoming week with our findings.

Esther Jang’s Status Report for 9/25

During class this week, I watched other groups' project proposals and found them really interesting. I spent some time on Sunday helping the group finish up our presentation.

One point that was brought up during the discussion after our presentation was why we were using a claw as our end effector rather than a vacuum gripper. I had previously given the vacuum-powered gripper some consideration, but wanted to re-evaluate that choice just in case. In my research, I found that the most reasonable vacuum-based methods for our use case were either a squishy universal gripper or a suction gripper. A universal gripper typically molds around the object and holds the mold by vacuuming out the remaining air, so it is better suited to pressing against a small object. The objects we are picking up will not be small and will not offer much normal force opposing the force we apply (since they are freestanding on a shelf), so the universal gripper will not work. The suction gripper would require a very strong and expensive vacuum pump that is not practical for us to purchase; typical off-the-shelf vacuum systems seem to be able to carry only about 0.22 lbs, so it would not perform well enough.
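The suction numbers can be sanity-checked with the basic relation F = ΔP·A (holding force equals pressure differential times cup area). A small sketch (the cup diameter and vacuum level in the example are illustrative guesses, not specs of any particular pump):

```python
import math


def suction_lift_kg(cup_diameter_m: float, vacuum_pa: float, g: float = 9.81) -> float:
    """Theoretical maximum mass (kg) a single suction cup can hold vertically.

    vacuum_pa is the pressure differential the pump can sustain, in pascals.
    Real systems need a large safety factor for leaks, porous surfaces,
    and shear loading, so the practical limit is well below this.
    """
    area = math.pi * (cup_diameter_m / 2) ** 2
    return vacuum_pa * area / g
```

For a 2 cm cup at a modest 10 kPa of vacuum this comes out to only about a third of a kilogram even in theory, which is consistent with the ~0.22 lb figure for small off-the-shelf systems.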

After the above research, I decided to proceed with the servo-powered claw end effector. I have yet to finalize the linear actuation method, but I will likely use a DC motor-powered linear slide system. With this in mind, I sketched out the electronics required for my end effector and linear actuation systems. I chose a standard PWM servo and an I2C servo driver (specifically the PCA9685 in the diagram) for the claw, and an H-bridge motor driver and DC motor for the linear actuation system. Both choices were inspired by the many other Jetson Nano projects I saw using a servo or DC motor, and I wanted to focus on feasibility before refining my choice of hardware.
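With the PCA9685, commanding a servo angle comes down to converting a pulse width into the chip's 12-bit on-time count. A sketch of that conversion under the common 1-2 ms pulse convention (many servos actually use 0.5-2.5 ms, so the datasheet of whatever servo we buy should be checked):

```python
def servo_to_pca9685_ticks(angle_deg: float, freq_hz: float = 50.0,
                           min_pulse_ms: float = 1.0,
                           max_pulse_ms: float = 2.0) -> int:
    """Convert a servo angle (0-180 degrees) to a 12-bit PCA9685 on-time count.

    The PCA9685 divides each PWM period into 4096 ticks, so the on-time count
    is the pulse width as a fraction of the period, scaled by 4096.
    """
    pulse_ms = min_pulse_ms + (max_pulse_ms - min_pulse_ms) * angle_deg / 180.0
    period_ms = 1000.0 / freq_hz  # 20 ms at the usual 50 Hz servo rate
    return round(pulse_ms / period_ms * 4096)
```

The actual register writes would go through a driver library over I2C; this only captures the timing math.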

However, I realized that the battery needs significantly more consideration than I had previously given it. There are many ways to power the Jetson Nano, which requires 5V/2A (10W), while the DC motors will require 12V and the servos 5V (sharing the same power source as the Jetson Nano). We can either use separate power sources for the motors and the Jetson Nano or use a buck converter. I have been weighing the tradeoffs carefully, as the Jetson Nano seems to be sensitive to its power source (e.g., I read many articles indicating that USB power banks should be avoided). For now, I left the power source for the Jetson Nano out of the diagram, but I have been focusing a lot of my research in this area.
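One way to compare the battery options is a back-of-the-envelope runtime estimate: usable battery energy divided by the total draw, inflated by converter losses. A sketch (the efficiency figure and load numbers in the example are placeholders, not measurements):

```python
def battery_runtime_hours(capacity_wh: float, loads_w: list,
                          converter_efficiency: float = 0.9) -> float:
    """Rough runtime estimate for a shared battery.

    capacity_wh: battery capacity in watt-hours.
    loads_w: average power draw of each subsystem, in watts.
    converter_efficiency: lumped efficiency of any buck/boost stages.

    Ignores battery derating and peak currents, so treat this as an
    optimistic upper bound rather than a guarantee.
    """
    total_draw_w = sum(loads_w) / converter_efficiency
    return capacity_wh / total_draw_w
```

For instance, a 50 Wh pack feeding a 10 W Jetson, ~12 W of motors, and ~3 W of servos would last under two hours once converter losses are included, which suggests sizing the pack with headroom.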

I believe I am currently on track with researching design considerations. After finishing up my research on electronic components, I will move onto CADing the mechanical components.

Ludi’s Status Report for 09/18

This week I discussed with my team members the feedback we received from the professor and TA. We worked on the presentation slides together, confirmed the division of labor, and wrote up the proposed schedule for designing and building the robot. I also did some individual research on the hardware specifications and physical design of the wheelbase. I looked up the technical documentation for the Jetson Nano, Raspberry Pi, and Arduino boards, related electronic components, and design reports from past teams' projects. Omniwheels would be the most efficient way for the robot to move, considering the forward and backward movement of approaching the shelf and the lateral movement of scanning along the items; however, a potential issue is that only Arduino boards have enough PWM pins to connect two motor drivers and control four wheels separately. There are many successful robot projects with omniwheels controlled through an Arduino communicating with another computer board, so I remain positive that the latency between a computer board and the Arduino will not be a significant issue. Since designing and building a wheel chassis is a relatively new experience for me, I also looked at various existing designs and posts on how to build a wheel chassis from scratch.
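For reference, driving four omnidirectional wheels from body-frame velocity commands is a standard mixing computation. A sketch using one common mecanum-style sign convention (conventions vary with wheel orientation, so the signs would need to be checked against our actual layout):

```python
def omni_wheel_speeds(vx: float, vy: float, omega: float):
    """Per-wheel speed commands from desired body velocities.

    vx: sideways (strafe) velocity, vy: forward velocity, omega: rotation rate.
    Returns speeds in the order front-left, front-right, rear-left, rear-right.
    Units are arbitrary but consistent; the Arduino would scale these into
    PWM duty cycles for the motor drivers.
    """
    return (vy + vx + omega,   # front-left
            vy - vx - omega,   # front-right
            vy - vx + omega,   # rear-left
            vy + vx - omega)   # rear-right
```

Pure forward motion commands all four wheels equally, while a pure strafe drives diagonal pairs in opposite directions, which is the behavior that makes omniwheels attractive for shelf scanning.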


Next week, I will work on the CAD of the wheelbase and order the physical components after the presentation. I hope to complete the majority of the design next week.

Bhumika Kapur’s Status Report for 9/18

This week the entire team worked on our proposal presentation. We used the feedback we received during our meetings with Tamal and Tao to reconsider some aspects of our project.

I also did some research on the computer vision algorithms that I will be working on this semester. I read some published research articles on detecting a laser pointer, as I am planning to write a program that can detect a laser dot on a box. I also read about edge detection algorithms, as we will need edge detection in our project. Finally, I looked into AprilTags and cameras, as we are planning to use both in our project. I looked at the specs and reviews of different cameras, and went through some AprilTag tutorials to get a sense of how we will use these components.
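As a starting point for laser detection, a common heuristic in the literature is to look for pixels that are both very bright and strongly dominated by the laser's color. A toy per-pixel check for a red laser (the thresholds here are illustrative guesses; a real detector would more likely threshold in HSV with OpenCV and then look for a small, compact blob):

```python
def is_laser_pixel(r: int, g: int, b: int,
                   brightness_min: int = 220, dominance: int = 60) -> bool:
    """Heuristic check for a red laser dot pixel.

    A candidate pixel must have a very high red value and be strongly
    redder than either of the other two channels. Saturated white centers
    of a bright dot would fail this test, which is one reason real
    pipelines also examine the halo around the dot.
    """
    return r >= brightness_min and r - max(g, b) >= dominance
```

Running this over a frame and clustering the hits would give candidate dot locations to match against the expected box region.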

So far, I believe I am on schedule. We have just divided up the tasks within the group, and I have begun working on mine. In the next week, I plan to finalize which edge and laser detection techniques I will use, and to select a camera for our robot.

Esther Jang’s Status Report for 9/18

This past week, I primarily worked with the team on the project proposal submission that was due on Sunday and the presentation slides due tomorrow. I will be working on the linear actuation system (movement along the z-axis) and the claw grabbing system of the robot. The feedback we received during this week's meeting (as mentioned in the team update) did not change my side of the project much, but I have been continuing to research these parts.

Currently, I am planning to use linear slides with a motorized pulley system for the z-axis actuation. I have some concerns about the stability of this system, especially given that we will have to take snapshots at the extended, less stable heights. There are many standardized servo-actuated claw systems we could purchase online, but I may also consider 3D printing one. Most of my research has been on these two components with these considerations in mind.
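For the pulley-driven slide, the linear travel per motor revolution is just the pulley circumference, which is what ties the motor choice to the slide's speed and resolution. A tiny sketch (this assumes the motor drives the pulley directly, with no gearing; the numbers are illustrative):

```python
import math


def slide_travel_cm(pulley_diameter_cm: float, motor_revs: float) -> float:
    """Linear travel of a belt- or string-driven slide, in cm.

    Assumes the pulley is mounted directly on the motor shaft; any gear
    reduction would divide the travel per motor revolution accordingly.
    """
    return math.pi * pulley_diameter_cm * motor_revs
```

For example, a 2 cm pulley moves the carriage a little over 6 cm per revolution, so reaching a high shelf position in reasonable time constrains the minimum motor speed.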

My progress is on schedule, but my next step is to start CADing and to seriously consider what the design of the grabbing system will look like.

Team Status Report for 9/18

Early this past week, on Sunday, the team created the project abstract. We met with Professor Tamal and our TA Tao on Wednesday to discuss it. During this meeting, we decided to add an element of the robot autonomously navigating to and from the shelf. We also learned about AprilTags and plan to use them to make this navigation process easier. Aside from this change, the other elements of our project will primarily stay the same. We also have more considerations to make regarding implementation details and what hardware to use.

Aside from our project proposal discussion, we have been working on putting together our presentation slides that are due tomorrow.