Chris’ Status Report for 04/27/2024

What I did last week

Worked on kinematics and system infrastructure.

What I did this week

This week I gave our final presentation.  We spent time preparing the slides and polishing our project, and I practiced the presentation and wrote speaker notes.

I finished the rewrite of forward kinematics and I am very close to being done with inverse kinematics.  I also took some time to read through our HLS RRT kernel, as Matt asked for some clarification during debugging.  I noticed some discrepancies between the kernel and the software implementation, and we will be meeting tomorrow to debug and optimize the kernel.

We are on the final stretch and I am optimistic about completing the project in time for the demo.

What I plan to do next week

Finish system testing.  Prepare poster and demo video.

Chris’ Status Report for 04/20/2024

What I did last week

Worked on kinematics and set up testing environment.

What I did this week

My work this week was centered around finishing the kinematics module and installing Linux on an old laptop I have.

Due to some roadblocks, we have had to switch which FPGA we are using in our system.  Originally we were using the AMD Kria, which can run ROS and has reasonably powerful embedded scalar cores.  We have transitioned to the AMD Ultra96v2, which has weaker scalar cores and does not have the same ROS functionality.  To make the transition we therefore need additional scalar cores, and we have chosen to use a laptop for this purpose.

The laptop we are using is an x86 Mac, which we needed to install Linux on in order to run ROS.  We had previously attempted to use a KVM virtual machine, but the hypervisor calls add too much overhead when we are handling perception data.  Installing Linux on Macs can be non-trivial because Apple includes a T2 security chip that prevents naive attempts at installing another OS.  Luckily there exist open-source Linux distros that have been tweaked to circumvent this.  After some debugging of the Wi-Fi and Bluetooth drivers, I was able to install Linux and the packages we need for our system onto the laptop.  This laptop will be the center of our system integration in the following week.

This week I also continued work on kinematics.  This involved a partial rewrite of the forward kinematics module to use a more robust version of the rotation and translation matrices.  The inverse kinematics problem is solved analytically by computing the angles for a two-link system and then computing the difference in angles to find the angle of the third link.  This module will be calibrated beginning tomorrow.
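
The two-link step can be sketched in a few lines.  This is a minimal illustration using the Law of Cosines, not our actual module's API; the function name, link-length parameters, and the elbow-down convention are all assumptions for the sketch.

```python
import math

def two_link_ik(x, y, l1, l2):
    """Analytic IK for a planar two-link arm (illustrative sketch).

    Returns (shoulder, elbow) angles in radians that place the tip of
    links l1, l2 at target (x, y), picking the elbow-down solution.
    """
    r2 = x * x + y * y
    r = math.sqrt(r2)
    if r > l1 + l2 or r < abs(l1 - l2):
        raise ValueError("target out of reach")
    # Elbow angle from the Law of Cosines; clamp to guard acos's domain.
    cos_elbow = (r2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    elbow = math.acos(max(-1.0, min(1.0, cos_elbow)))
    # Shoulder angle: direction to the target minus the offset the
    # bent second link contributes.
    shoulder = math.atan2(y, x) - math.atan2(l2 * math.sin(elbow),
                                             l1 + l2 * math.cos(elbow))
    return shoulder, elbow
```

The third joint's angle then falls out as a difference of angles once the first two are fixed, as described above.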

What I plan to do next week

System integration, calibration, and debug.

Learning Reflection

Before this project I knew some basic principles of robotics, but not to the level of depth that I do now.  To learn, I have read many academic papers on robotics in general as well as in more specific areas like motion planning and kinematics.  To implement RRT and A*, I drew on what I learned from the research papers as well as pseudocode that is publicly available on Wikipedia.

I have also found Wikipedia to be an extremely helpful tool for reviewing the mathematical and geometric equations that are used in kinematics.  Luckily I am currently taking a course on Discrete Differential Geometry so my general knowledge is up to date.  Textbooks and literature from this course have proved to be invaluable general knowledge and significantly reduced the learning curve.

Kinematics is the area in which I have learned the most.  This was necessary as I have implemented a custom forward kinematics solver, analytic inverse kinematics solver, and a graphical simulation environment.  In order to do so I watched a series of lectures at MIT as well as reviewed the available course notes.  I also found the lecture notes from a robotics class at CMU and this robotics textbook helpful.

Finally I was able to review the lecture notes and other resources from 18-643 in order to get up to speed on FPGAs.

Chris’ Status Report for 04/06/2024

What I did last week

Wrote script to control and communicate with robotic arm

Worked on kinematics

What I did this week

This week I implemented forward kinematics for the arm as well as a simulator that allows us to visualize the arm in 3D space.  This involved calculating rotation matrices and a lot of hand drawings and geometry to ensure correctness.
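
The simulator itself is too long to include here, but the rotation-matrix idea can be sketched as follows.  This is a minimal sketch assuming one base yaw joint plus planar links; the function names are illustrative, not our module's API.

```python
import math

def rot_z(theta):
    """3x3 rotation matrix about the Z axis, as nested lists."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]

def mat_vec(m, v):
    """Multiply a 3x3 matrix by a 3-vector."""
    return [sum(m[i][j] * v[j] for j in range(3)) for i in range(3)]

def forward_kinematics(base_yaw, joint_angles, link_lengths):
    """End-effector position for a base-yaw + planar-links arm.

    The planar joints are accumulated in the vertical plane, and the
    whole plane is then rotated about Z by the base yaw.
    """
    r = z = 0.0
    pitch = 0.0
    for theta, length in zip(joint_angles, link_lengths):
        pitch += theta
        r += length * math.cos(pitch)
        z += length * math.sin(pitch)
    # Rotate the in-plane point (r, 0, z) about Z by the base yaw.
    x, y, _ = mat_vec(rot_z(base_yaw), [r, 0.0, z])
    return x, y, z
```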

I went to Home Depot and bought two 1/4” thick MDF boards.  These boards are about 2′ x 4′, and we were able to use some wood clamps I had to hold them together, creating a 4′ x 4′ state space.  We mounted the robotic arm in the middle and are done with our baseline setup.

Using this setup, I was able to correlate the arm in the simulator with the actual arm.  This allowed me to verify that the axes, angles, and measurements in the simulator align with what we have in reality.

Next steps include transitioning from forward kinematics to inverse kinematics.  I believe this transition should be relatively smooth because I have already built the simulator and ensured its correctness.  I have also already done a lot of the math for the inverse kinematics on paper.

What I plan to do next week

Complete inverse kinematics.

Matt has taken over SMA* but has run into some bottlenecks.  When I am done with kinematics I will transition to code optimization.

Testing and Validation

The majority of the work I have done so far has been with regards to the software implementation of our algorithms, the robotic arm control, and the kinematics.

The RRT and A* algorithms are implemented in C, but I have created Python bindings for them that make them significantly easier to run and test.  Because of this we have been able to run these algorithms on different state spaces and check their ability to converge and to find a correct path.  Now that we have created our actual setup, we have decided on a state space that is 128 * 128 * 128 voxels (~2 million total).  Further testing has shown that we must optimize the software implementation more if we want to converge on paths in a reasonable time.  The number of iterations needed for RRT to converge increases as we increase the size of the state space.  With a 64 * 64 * 64 voxel state space we found paths in 10,000 iterations, but the 128 * 128 * 128 state space requires 100,000 (note that each iteration takes two steps and that not every iteration successfully adds a node to the RRT graph).  The first example completes in roughly 4 minutes while the second takes about 20 minutes.  We must increase the speed of a single iteration significantly.
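
For a sense of the scale involved, the dense backend treats the state space as one flat occupancy array.  Assuming a row-major layout (our C code's actual layout may differ, so treat the names and layout as illustrative), the indexing math is:

```python
N = 128  # voxels per axis; N ** 3 is roughly 2.1 million cells

def flat_index(x, y, z, n=N):
    """Row-major flat index into a dense n*n*n occupancy array."""
    assert 0 <= x < n and 0 <= y < n and 0 <= z < n
    return (x * n + y) * n + z

def unflatten(i, n=N):
    """Inverse of flat_index: recover (x, y, z) from a flat index."""
    x, rem = divmod(i, n * n)
    y, z = divmod(rem, n)
    return x, y, z
```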

Testing of the arm controller and the kinematics has been done via a graphical simulator I developed.  This simulator allows us to compare the expected angles of the arm in the state space with the positions the arm is actually in.  This testing was completed before the interim demo and was done to ensure the correctness of our implementation.  Things appear to be working for now, and there are minimal concerns about the speed of these operations.

Team’s Status Report for 03/30/2024

This week was spent working on the individual modules of our system.  We debugged the Kria setup and ported our RRT implementation to the Ultra96.  Progress has been made on the perception system, and we should be able to begin working with real perception data in the next week.  The inverse kinematics module is being calibrated and integrated with the arm controller and the perception simulator.

This coming week we have the interim demo, which means we will need to set up our test area.  We plan on doing so this weekend and on Monday.  We should be able to demonstrate significant portions of our modules, although we might not be able to finish integration in time.

Chris’ Status Report for 03/30/2024

What I did last week

Watched and read lectures on inverse kinematics

Did initial calculations for inverse kinematics implementation

Wrote a script to communicate with Arduino and send it commands over UART to control the arm

What I did this week

This week I worked on getting inverse kinematics working for the arm.  In the process I ran into some trouble because the arm we are working with has limited capabilities, which make it incompatible with many existing tools.  To handle this I have decided to implement the inverse kinematics module myself.  This will give us significantly more control over our setup and should increase our ease of use in the long term.

Our robotic arm has 6 servo motors.  One of these is used to open and close the gripper and another is used to rotate the gripper.  These degrees of freedom are not relevant to how the arm is actually moved from position A to position B in the state space, which leaves 4 servos responsible for the movement.  One of these servos is located at the base and rotates the arm around the Z axis.  This rotation around the base makes it trivial to align the arm into the 2D plane of the point we want to reach, reducing the dimensionality of the problem from 3D to 2D.  From here we can use the Law of Cosines to find the angles of the final three servo motors.
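
The base-rotation step can be sketched in a few lines (illustrative names; angles in radians; this is not our module's actual interface):

```python
import math

def reduce_to_plane(x, y, z):
    """Align the arm's plane with the target: returns (base_yaw, r, z).

    Rotating the base by base_yaw puts the target in the arm's 2D
    plane, so the remaining joints only need to reach the in-plane
    point (r, z).
    """
    base_yaw = math.atan2(y, x)       # yaw that faces the target
    r = math.hypot(x, y)              # horizontal distance to target
    return base_yaw, r, z
```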

Implementing this has been my main focus over the past week.  Doing this ourselves will give us significantly more control over the state space and will allow us to easily interface with the RRT accelerator.

What I plan to do next week

I will continue working on the inverse kinematics module and its calibration this next week.

SMA* is still in progress

Team’s Status Report for 03/23/2024

This week we had the Ethics Lecture during class on Monday.  On Wednesday we met as a team during class time.  This past week has mainly been spent troubleshooting tasks in parallel.  Yufei has been working on the perception system and was able to calibrate the camera sensor.  This means we are able to get the point cloud data we need from the camera, and he is now working on passing that data to OctoMap.  The end goal is to do some preprocessing of the data in OctoMap and then pass it to our RRT implementation.  Matt has been working on setting up the AMD Kria and has been in communication with an engineer from AMD.  Chris has made progress on the inverse kinematics module and will begin testing it next week.

Our goal is to complete our tasks within the next week and to begin integration as soon as possible.  This integration task will likely take a significant amount of troubleshooting and calibration.  That being said, we are close to having a baseline implementation of our full system and are still on track for the interim demo.

Chris’ Status Report for 03/23/2024

What I did last week

Wrote dense matrix A*

What I did this week

This week I worked on inverse kinematics and arm control.  I was previously unfamiliar with the intricacies of inverse kinematics, so the majority of my time this week was spent getting up to speed.  I found a course at MIT on Robotic Manipulation and watched a significant number of the lectures.  The course notes are also available, which helped me do some initial calculations on how to convert the path we generated into servo motor angles.  These calculations involve initializing the arm into a known state and then creating rotation matrices that correspond to the steps in the path.

I also wrote a Python script which allows us to communicate with the Arduino that controls the robotic arm.  This script, in conjunction with some code I wrote for the Arduino, should allow us to send the servo motor angles we generate during inverse kinematics from my laptop to the arm.
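
The exact framing between the script and the Arduino sketch is not reproduced here; as an illustrative sketch only, assuming a simple comma-separated ASCII protocol with newline framing (the function name and message format are assumptions, not our actual protocol):

```python
def format_servo_command(angles):
    """Pack servo angles (degrees) into a line-oriented serial message.

    Produces bytes like b"90,45,0\n", which an Arduino sketch could
    parse one line at a time on its end of the UART link.
    """
    if not all(0 <= a <= 180 for a in angles):
        raise ValueError("servo angles must be in [0, 180] degrees")
    return (",".join(str(int(round(a))) for a in angles) + "\n").encode("ascii")
```

On the laptop side the resulting bytes could be written out with pySerial's `serial.Serial(port, baudrate).write(...)`, with the port name depending on the machine.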

I plan on testing this script and the calculations I have done in class on Monday.  I anticipate a good amount of calibration and troubleshooting.

As we begin to port our code to the FPGA, I have started looking at writing an SMA* implementation.  This is a memory-bounded version of A*, which is necessary on the FPGA due to its inability to dynamically allocate memory.

What I plan to do next week

Test and calibrate inverse kinematics and arm control.

Implement SMA*

Chris’ Status Report for 03/16/2024

What I did last week

Wrote dense matrix RRT implementation.

What I did this week

This week I wrote the dense matrix A* implementation and its Python bindings.  This completes our baseline implementation of motion planning.  We are currently capable of generating simulated perception data, running RRT to search the state space for viable paths, and then running A* to find and return the shortest path.

The next step I will be working on is the inverse kinematics module, which takes the path returned by A* and generates control signals for the robotic arm.  I have already found an open source implementation and have begun to run some tests.  I do not yet have a complete understanding of inverse kinematics and have therefore found some resources to strengthen my knowledge.  The inverse kinematics module is currently under development, and I am targeting its completion by the end of next week.

What I plan to do next week

Work on inverse kinematics module

Setup serial communication to send commands to the arm

Chris’ Status Report for 03/09/2024

What I did last week

Built and tested the robotic arm

Worked on RRT octree implementation

What I did this week

This week I finished the RRT implementation on the octree.  Upon finishing this implementation, and after further discussions with my teammates, we decided that a dense matrix implementation would be ideal for the FPGA.  For this reason I developed functions to compress the octree into a dense matrix, and I then implemented RRT on the dense matrix.  These backend implementations are abstracted away in the perception simulator; swapping between the octree and dense matrix backends is as simple as toggling the “COMPRESSED” macro in the header file.
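
The conversion functions themselves live in our C code; as a rough Python sketch of flattening an octree into a dense grid, assuming a toy representation where a node is either a bool leaf or a list of 8 children keyed by octant bits (our actual structs differ, so treat this as illustrative):

```python
def octree_to_dense(node, n):
    """Expand an octree into a dense n*n*n occupancy grid (n a power of 2).

    A node is either a bool leaf (occupied / free) or a list of 8
    children, where child i covers the octant with x/y/z bits
    (i >> 2, i >> 1, i) & 1.
    """
    grid = [[[False] * n for _ in range(n)] for _ in range(n)]

    def fill(node, x0, y0, z0, size):
        if isinstance(node, bool):          # leaf: paint the whole cube
            if node:
                for x in range(x0, x0 + size):
                    for y in range(y0, y0 + size):
                        for z in range(z0, z0 + size):
                            grid[x][y][z] = True
            return
        half = size // 2
        for i, child in enumerate(node):    # descend into the 8 octants
            fill(child,
                 x0 + ((i >> 2) & 1) * half,
                 y0 + ((i >> 1) & 1) * half,
                 z0 + (i & 1) * half,
                 half)

    fill(node, 0, 0, 0, n)
    return grid
```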

While the current RRT implementations are operational, they are not optimized; my focus so far has been on getting a baseline implementation working.  I have added some notes to our code base and have developed some ideas for future optimizations.  I believe the largest speedup will come from optimizing the search for the nearest neighbor to a given voxel.  This computation is highly parallelizable, and there are heuristics about the order in which the state space should be traversed.
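
The baseline form of that nearest-neighbor step is just a linear scan, which is what makes it such a good candidate for parallel hardware.  A minimal sketch (illustrative names; squared distances avoid the square root):

```python
def nearest_neighbor(sample, nodes):
    """Brute-force nearest node to a sampled voxel.

    Every candidate's distance is independent of the others, so the
    scan parallelizes naturally.
    """
    def d2(a, b):
        # Squared Euclidean distance preserves the ordering of
        # distances, so no sqrt is needed for an argmin.
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    return min(nodes, key=lambda node: d2(node, sample))
```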

I have begun implementing A* for the dense matrix backend and am targeting completion by the end of the weekend.  When I finish A*, the baseline implementation of our motion planning module will be complete.  From there I will transition to the inverse kinematics and arm control modules.

What I plan to do next week

Finish implementing A* on dense matrix

Continue work on inverse kinematics and arm control

Chris’ Status Report for 02/24/2024

What I did last week

Last week I focused on designing the backend inverse-kinematics system.

What I did this week

The robotic arm came this week and we met to build it.

The arm has 6 servo motors and is controlled by an Arduino through a motor driver shield.  I did some reading on the API that controls the arm and was able to run a test script that moved it around.  The plan is to have paths generated by the RRT module converted into control signals by the inverse kinematics module.  These control signals will then be communicated to the Arduino over UART.

Now that we have access to the arm, I can begin to implement and test the inverse-kinematics system and arm control.  We had some discussions on how to set up our test scene and are planning on building a shelf.  This can be used to test our system’s ability to reach into confined spaces.  We believe this will be a good initial test of our system in real-world scenarios.

I continued work on the perception simulator and the baseline RRT implementation.

What I plan to do next week

Finish the initial baseline RRT implementation.

Finish the initial perception simulator.

Continue work on inverse kinematics and arm control.