Team Status Report for 10/1

This week, the team primarily focused on getting the design presentation ready and ironing out the software architecture. The computer vision and firmware sides of the robot are well under way, and the motion-planning algorithm has been designed on paper and now needs to be implemented.

By the next status report, we hope to have at least one robot moving accurately on the field and reliably picking up pallets using hard-coded goal-pose vectors. That target involves further work on camera calibration, field construction, the robot motion controller, and the robot-computer interface.

Saral’s Status Report for 10/1

This week I worked on making the computer vision algorithm more robust to one or more of the NeoPixels being occluded. This matters because someone moving a hand over the playing field, or an odd camera angle, could throw off the algorithm's localization. Additionally, I worked on the design presentation with the rest of the team and helped solidify some of our software architecture. I have also started building a visualizer for our demo, which will show exactly where the robots are on the field and what paths they are following; this will be an invaluable tool for debugging system issues later in the project. Lastly, tomorrow (Sunday), Omkar and I will work on the closed-loop controller to make the robot motions more accurate.
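One way to make localization tolerate an occluded LED is to fit a rigid 2D transform from whichever LED centroids are still visible. The sketch below is illustrative only: the LED names, body-frame coordinates, and function names are hypothetical stand-ins for our actual PCB geometry, and the fit shown is a standard Kabsch/Procrustes least-squares alignment.

```python
import numpy as np

# Hypothetical body-frame LED positions (meters); the real layout comes from the PCB.
LED_LAYOUT = {
    "front": np.array([0.03, 0.0]),
    "left":  np.array([-0.02, 0.02]),
    "right": np.array([-0.02, -0.02]),
}

def estimate_pose(detections):
    """Fit a 2D rigid transform (theta, t) mapping body-frame LED points to
    detected field-frame centroids.  Works with any subset of >= 2 LEDs, so a
    single occluded LED does not break localization."""
    names = [n for n in detections if n in LED_LAYOUT]
    if len(names) < 2:
        raise ValueError("need at least two visible LEDs")
    body = np.array([LED_LAYOUT[n] for n in names])
    field = np.array([detections[n] for n in names])
    # Kabsch: center both point sets, then take the SVD of the cross-covariance.
    bc, fc = body.mean(axis=0), field.mean(axis=0)
    H = (body - bc).T @ (field - fc)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:      # force a proper rotation (no reflection)
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = fc - R @ bc
    theta = np.arctan2(R[1, 0], R[0, 0])
    return theta, t
```

Because the fit uses whatever subset of centroids is available, the same code path handles both the fully visible and partially occluded cases.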

Omkar’s Status Report for 10/1

This week, I wrote the firmware for all of the robots' peripherals (screen, LEDs, servos, and electromagnet). I also started the communication firmware that allows a computer to send POST requests to the robot, and I brought up one robot in its entirety. The other robots are working, but some of their NeoPixel LEDs are not soldered properly, so only a few of the LEDs turn on. I also worked on our design presentation slides, which I will present in the coming week. Our project is on schedule. We are planning to meet tomorrow to determine the camera mounting and to interface the robot firmware with the computer vision for better control of the robots, since we found that the servos have a hard time driving straight, either because the servos are not exactly identical or because the PWM on the ESP8266 is driven by software interrupts. By next week, I aim to have a more robust communication framework in place and to start on the controls software to get the robots to follow a straight line, and hopefully a more complicated path.
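A simple starting point for the straight-line controls software is a proportional trim on the two wheel commands, so that mismatched servos still track a heading. This is a sketch under stated assumptions, not our implementation: the function name, normalized command range, and gain are all hypothetical, and the heading error would eventually come from the computer vision localization.

```python
def clamp(x, lo=-1.0, hi=1.0):
    """Limit a normalized wheel command to the valid range."""
    return max(lo, min(hi, x))

def wheel_commands(v, heading_error, k_p=1.5):
    """Proportional heading correction for a two-wheel differential drive.

    heading_error = desired - current heading (radians, CCW positive); a
    positive error steers the robot left by speeding up the right wheel.
    v and the returned commands are normalized to [-1, 1]; the gain k_p is
    a placeholder to be tuned on the real robot.
    """
    turn = k_p * heading_error
    left = clamp(v - turn)
    right = clamp(v + turn)
    return left, right
```

With zero heading error both wheels get the same command, and the correction saturates gracefully at the command limits rather than commanding the servos past full speed.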

Team Status Report for 9/24

Looking ahead, our most significant risks are the electromagnet not working initially and the camera lens being distorted at the edges. We plan to mitigate the first by spending more time debugging the firmware and hardware on the MCU, the robot PCB, and the electromagnet itself, and by trying to understand why our electromagnet could not attract a paperclip. We found resources from the manufacturers, namely Seeed and Keyestudio, on how to use their products with an Arduino, so we are confident it should work soon. For the camera, we will calibrate by moving a robot (with lit NeoPixel LEDs) relative to a fixed camera and measuring how the distances between the LED centers change based on where the robot is. From there, we will have an idea of how the camera lens is curved at the edges compared to the center of view.
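The calibration idea above can be sketched with a deliberately simplified one-coefficient radial model (a full calibration would use something like OpenCV's cv2.calibrateCamera; the function names and the model here are illustrative assumptions): since the physical LED spacing is fixed, any change in its apparent pixel length as the robot moves toward the edge of the frame can be attributed to lens distortion and fitted by least squares.

```python
import numpy as np

def fit_radial_scale(radii_px, measured_px, nominal_px):
    """Fit the one-coefficient radial model
        measured = nominal * (1 + k * r^2)
    where r is the distance of each measurement from the image center.
    Returns k; k < 0 indicates barrel distortion (LED spacing appears to
    shrink toward the edges of the frame)."""
    r2 = np.asarray(radii_px, dtype=float) ** 2
    y = np.asarray(measured_px, dtype=float) / nominal_px - 1.0
    # One-parameter linear least squares: minimize sum (k*r^2 - y)^2.
    return (r2 @ y) / (r2 @ r2)

def undistort_length(measured_px, radius_px, k):
    """Invert the fitted model to recover the undistorted pixel length."""
    return measured_px / (1.0 + k * radius_px ** 2)
```

Collecting (radius, apparent spacing) samples across the field, as described above, is exactly the data this fit needs.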

We are moving the robot field construction to the coming week, but everything else on the schedule should stay the same. As of now, we do not have any changes to the design or requirements, but we are reviewing and discussing the feedback that we got from the proposal presentation.

We got PCBs back this week and did an initial fit test: 

Then, we reflowed and assembled three robots:

We also got the screen working:

Omkar’s Status Report 09/24

I helped assemble the robots since the PCBs arrived in the middle of this week. Three robots were reflowed, and I started writing initial code to test that each of the individual components on the robots was working. I wrote code to test the servos, the neopixel LEDs, and the screen. I am still in the middle of debugging why we can’t turn on the electromagnet. Some of the robots are in different stages of bring-up. I think we should create a spreadsheet to track what hardware/firmware is working on which robot. We were unable to finish creating the field, but we discussed the plan for setting that up. Other than that, we seem to be on track for our schedule. We are going to build the field in the coming week. In the next week, I want to finish writing the firmware for all of the robots and ensure that all components are working on all robots.

Prithu’s Status Report 09/24

The first part of this week was dedicated to our proposal presentation, which I gave on Wednesday. Toward the middle of the week, our fabricated PCBs arrived, and we began building the robots Wednesday night. We were able to reflow and solder three robot boards and attach all of the components (i.e., servos, wheels, electromagnet). We also designed and 3D-printed a support for the front of the robot, which is needed since the robot has only two wheels. We got one robot fully working, but are still waiting to pick up the NeoPixels (which I think arrived yesterday) to finish the other two. Our plan for this weekend is to start development on the firmware, CV, and motion-planning code.

Saral’s Status Report 09/24

This week, I helped assemble the robots and got most of them working! Assembly took significantly longer than expected because a couple of parts were finicky and we were missing a few of the longer header pins we needed. Currently, all three robots are finished from a hardware perspective, but we have a few electromagnet issues to fix, and robots 2 and 3 have some NeoPixel hardware issues. These are fairly minor things that we will get done in the next week.


Additionally, I got the computer vision stack for localization started. I can successfully mask the NeoPixels and find their centroids, and I am currently working on the inverse pose-transform code to enable our localization. The computer vision work is ahead of schedule, which leaves room to put more work into the robot hardware issues!
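The mask-then-centroid step can be illustrated with a pure-NumPy stand-in (the real stack presumably uses OpenCV's thresholding and contour/moment functions; the function name and threshold here are hypothetical): threshold the frame, flood-fill each bright connected blob, and average its pixel coordinates to get a centroid per NeoPixel.

```python
import numpy as np
from collections import deque

def led_centroids(gray, thresh=200):
    """Threshold a grayscale frame and return the centroid (row, col) of each
    bright connected blob -- a minimal stand-in for an OpenCV
    mask -> findContours -> moments pipeline."""
    mask = gray >= thresh
    seen = np.zeros_like(mask, dtype=bool)
    centroids = []
    H, W = mask.shape
    for r0 in range(H):
        for c0 in range(W):
            if mask[r0, c0] and not seen[r0, c0]:
                # BFS flood fill over the 4-connected blob.
                q = deque([(r0, c0)])
                seen[r0, c0] = True
                pts = []
                while q:
                    r, c = q.popleft()
                    pts.append((r, c))
                    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        rr, cc = r + dr, c + dc
                        if 0 <= rr < H and 0 <= cc < W and mask[rr, cc] and not seen[rr, cc]:
                            seen[rr, cc] = True
                            q.append((rr, cc))
                pts = np.array(pts, dtype=float)
                centroids.append(tuple(pts.mean(axis=0)))
    return centroids
```

These per-LED centroids are exactly the input the inverse pose transform needs to localize each robot on the field.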