David’s Status Report for 4/27/2024

Accomplished Tasks

This week was the week of the Final Presentation. I was the one presenting, and the presentation went well, mostly due to the preparation we did beforehand. I worked on putting the slides together and made sure to rehearse my delivery for the big finale!

For content, I worked hard on ensuring that the rover would work when fully assembled. This meant tuning the rover so that the laser pointing would be accurate. At first I ran into issues where, due to the latency of the CV, the rover was unable to converge on the correct point to aim the laser at; essentially, it kept overshooting the correction. Resolving this meant making the rover adjust very slowly so that it converges and fires with high accuracy! The trade-off is that it may now be overly slow, which is still under investigation.
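
To make the fix concrete, here is a minimal sketch of the small-gain correction loop, assuming hypothetical helper names (get_target_offset_px, pan_camera_deg) rather than our actual functions; the constants are illustrative, not our tuned values:

    import time

    # Hypothetical stand-ins for our CV feed and PTZ control hooks.
    def get_target_offset_px():
        """Horizontal pixel offset of the detected person from image center
        (the value we read is already stale by roughly the CV latency)."""
        return 0  # placeholder

    def pan_camera_deg(delta_deg):
        """Nudge the PTZ pan by a small angle."""
        pass  # placeholder

    GAIN_DEG_PER_PX = 0.02   # deliberately small: large steps overshoot because the error is stale
    DEADBAND_PX = 10         # "close enough" window in which we allow the laser to fire
    STEP_DELAY_S = 0.3       # wait for a fresher CV reading before correcting again

    def converge_on_target():
        while True:
            error_px = get_target_offset_px()
            if abs(error_px) < DEADBAND_PX:
                return True                          # aimed: safe to fire
            pan_camera_deg(GAIN_DEG_PER_PX * error_px)
            time.sleep(STEP_DELAY_S)

Keeping the gain small trades speed for guaranteed convergence, which matches the "overly slow" behavior noted above.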

For the presentation, we left it at “x-axis” accuracy, but I also worked on improving “y-axis” accuracy. This is similar to x-axis accuracy, except the tuning comes from tilting the camera up and down instead. Tuning for this still needs to be done.

Progress

My progress is on track, with the full end-to-end rover put together and aiming very accurately! Now it comes down to tuning the rover to perform even more accurately by working on vertical accuracy. This will also involve many tests and small adjustments, but with the main infrastructure in place, it should be doable. There is also further consideration to give to the rover’s “search” behavior, since the inability to repeat movements exactly is making a precise creeping-line search infeasible.

Next Week’s Deliverables

Next week, I plan to have the rover completely accurate, just in time for the final demo! I will also work on deciding on a finalized search method. Lastly, I will work on putting together all the final documentation pieces.

David’s Status Report for 4/20/2024

Accomplished Tasks

These weeks were the weeks of Carnival and of putting things together. Over the last two weeks, I worked on combining all our separate components, both in hardware and in software. I first worked with the team to put together the hardware; this meant designing the wiring schematic to figure out which wires would connect where, and linking together the rover, the PTZ motor (and camera), and the laser. I added a breadboard to help unify the wires, and also labelled the wire lengths (to determine which should be longer than others). Fortunately, the hardware unification was a success! All the components worked after being combined, which was amazing, as the whole rover was now functional with all the pieces on it. Several wood structures were laser-cut to provide structure and stability for the rover’s components.

On the software end, I had to combine all the software code. This consisted of unifying four disjoint files, for the laser, driving, camera streaming, and PTZ movement, into one program. The driving code was also written to properly take in inputs from the CV side and point towards the targeted person, as initially planned. The code unification worked as well: the rover is now able to concurrently (using threads) stream, move, and point the laser, as well as read inputs from the CV side. Unfortunately, I have not yet resolved the issue of getting the rover to repeat the same movements reliably, so changes to the search pattern still need to be made.
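
For reference, the unified program looks roughly like the sketch below; this is a simplified illustration, and the worker names are placeholders for the four original modules rather than our actual code:

    import threading

    # Placeholder workers standing in for the four original modules
    # (camera streaming, driving, laser control, PTZ/CV input); names are illustrative.
    def stream_camera(): ...
    def drive_rover(): ...
    def control_laser(): ...
    def read_cv_inputs(): ...

    if __name__ == "__main__":
        workers = [threading.Thread(target=fn, daemon=True)
                   for fn in (stream_camera, drive_rover, control_laser, read_cv_inputs)]
        for t in workers:
            t.start()
        for t in workers:
            t.join()   # each worker loops forever in the real code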

In regards to new tools and new knowledge, I had to learn a lot for this project from all sorts of places. I was unfamiliar with Raspberry Pis, UART communication, and the various libraries needed for the code. To learn these, I read an extensive amount of documentation online, along with examples of what other people had done on forums. Reading documentation typically provided the bulk of the understanding, while forum posts helped guide me along the way. I also asked my TA Aden for help, and he provided invaluable guidance, advice, and assistance throughout the whole process.

Progress

My progress is on track, with the full end-to-end rover now put together. Now it comes down to tuning the rover to perform as accurately as we had planned. This will involve many tests and small adjustments, but with the main infrastructure in place, it should be doable. There is also some work to be done in “pretty-ifying” the build.

Next Week’s Deliverables

Next week, I plan to have the rover fully functional *and* accurate, as well as to deliver the Final Presentation. This means making sure that the rover can perform the given task of finding a person and pointing the laser at them with acceptable accuracy. This is mostly fine-tuning, but it is critical nonetheless.

David’s Status Report for 4/6/2024

Accomplished Tasks

This week was the week of the Interim Demo. For it, I managed to present the biggest success I have had this project: getting the rover to move under my command through my own code! It took many hours of work, running into trouble after trouble. The inconsistency of the documentation certainly did not help, since the JSON commands were unclear. Even worse, however, were the Raspberry Pi errors. My original Raspberry Pi had a broken TX pin that prevented the JSON commands from being sent over UART; this was only debugged with TA Aden’s help (THANK YOU!). Another RPi I tried struggled to reach the Internet despite appearing to be connected. Fortunately, the last RPi I tried (the FIFTH one) worked, and I was able to control the rover through it. As a reminder, this means that I can now control the rover from a computer connected to CMU wifi, effectively allowing the system communication to all come together.

To verify that the rover movement system works correctly, I implemented benchmark tests to get a better understanding of how much power needs to be applied for the rover to move as directed (e.g., how much power is needed to turn left). I also test whether the rover can drive straight. This involves running the rover along predetermined simple paths (like a rectangle) and checking whether it returns to its original position, facing the right way.
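
A rough sketch of the rectangle benchmark is below, assuming a hypothetical send_drive_command(left_power, right_power, seconds) wrapper around the JSON/UART interface; the power values and timings are illustrative, not our calibrated numbers:

    def send_drive_command(left_power, right_power, seconds):
        """Hypothetical wrapper: apply the given wheel powers for `seconds`, then stop.
        (The real version serializes a JSON command and sends it over UART.)"""
        pass  # placeholder

    def drive_rectangle(long_s=3.0, short_s=1.5, turn_s=0.8):
        """Trace a rectangle; ideally the rover ends where it started, facing the same way."""
        for leg_time in (long_s, short_s, long_s, short_s):
            send_drive_command(0.5, 0.5, leg_time)    # drive straight along one side
            send_drive_command(0.4, -0.4, turn_s)     # pivot roughly 90 degrees

Any drift between the final and starting poses gives a direct measure of how much the straight-line and turning power levels still need tuning.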

See the rover move!

Progress

My progress is largely on track, though the end-to-end demonstration has not been put together yet. We had ideally wanted this working a week or so ago, so all my focus will be on putting everything together, including system communication and the infrastructure on the rover.

Next Week’s Deliverables

Next week, I plan to have the communication worked out between the CV servers and the rover-controlling system. This involves investigating the threading capabilities of the RPi, along with figuring out how to translate the CV-detected person into directional commands (see the sketch below). Working these out would finalize communication across the whole system, enabling everything to be put together.
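
As a first cut, the translation could look something like this sketch (an illustration only; the command names and tolerance are placeholders, not final values):

    def detection_to_command(bbox_center_x, frame_width, tolerance_frac=0.10):
        """Map a detected person's horizontal position to a coarse drive command.
        Thresholds and command names are placeholders, not our final values."""
        offset = bbox_center_x - frame_width / 2
        if abs(offset) <= tolerance_frac * frame_width:
            return "FORWARD"                     # roughly centered: head toward the person
        return "TURN_RIGHT" if offset > 0 else "TURN_LEFT"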

Team Status Report for 3/30/2024

Significant Risks and Contingency Plans

The most significant risk to our project right now is getting the entire rover to work together. We managed to overcome our most significant concern last week by getting the PTZ camera feed to display. That said, the problem turned out to be a hardware one, so we are debating whether to purchase the same camera again or to use a different camera that also functions. Using the new camera would mean learning its library again (and it cannot zoom), but its feed does display. As a result, we simply need to decide between the two cameras, with the other camera serving as the contingency plan. We also need to make sure everything can come together into one system, which may well have unforeseen issues. Right now we work on two different RPis; keeping them separate is the contingency plan if we cannot get everything to work on one RPi.

System Changes

As of right now, there are no major system changes from the last Team Status Report. We still plan on continuing to implement the system with the original PTZ camera motor. If we choose to switch to a different camera, that would be a hardware change; if we decide to order the same camera again, then there will be no difference.

Other Updates

Our schedule has been updated, and is reflected in our new Gantt chart. Our progress has been a bit delayed, since the camera had been very troublesome, but now that it has been mostly resolved, the team can go back to focusing on our individual portions. We plan on having a preliminary end-to-end system working by the interim demo.

David’s Status Report for 3/30/2024

Accomplished Tasks

This week was a week of good progress… towards the latter half. I started the week by immediately burning out the Raspberry Pi I was working on: while attempting to hook the RPi up to the rover, I did not realize that the rover shell was metallic. As a result, it shorted the RPi’s underside and burned the SD card (and me). Fortunately, this was good insight for the future; when we fully assemble the rover, I will remember to be more cautious about the parts on it. I had to obtain a new RPi and set it all up again for use. I also obtained the new batteries, and they fit inside the rover, allowing the bottom of the rover to be closed properly. Slowly the rover is coming together. The JSON commands still need to be sorted out, as it took a while to find the correct port for the RPi to read from.

On a more important note, working together with my team, we managed to get the camera to work! The camera can now be controlled with the PTZ motor, and the camera feed is displayed. It turned out to be a hardware issue all along, and we are investigating whether we should switch to a different camera while keeping the PTZ portion.

Progress

My progress is still somewhat on track, though I absolutely need to work on the rover JSON commands, which I can fortunately focus on now that the camera is set up properly. I also need to discuss how the CV servers will send the appropriate movement controls.

Next Week’s Deliverables

Next week, for the interim demo, I plan to have the rover successfully controlled through code; in other words, to sort out the exact JSON commands needed to move the rover. I also plan on making the instruction-control interface as straightforward as possible, so that linking up the CV part is as painless as possible. While this is still similar to last week’s goals, that mostly stems from being diverted to helping with the camera and being stalled by the RPi burning out. The end-to-end version of the rover is a strong goal to work towards for the interim demo.

Team Status Report for 3/23/2024

Significant Risks and Contingency Plans

The most significant risk to our project right now continues to be getting the PTZ camera to work. There has been extensive debugging, with little to no success. Upon obtaining a new Raspberry Pi and setting it up with a new, proper Bullseye OS, we achieved a huge goal: the PTZ camera movements can now be controlled. However, the camera video feed is still not working, giving “unable to find camera” errors. This could be a hardware issue, in which case we plan to immediately order a new camera, or it could still be a software issue. Contingency plans still involve getting a new camera, or possibly switching out the camera entirely.

We also made progress on the rover and have successfully set up wireless communication with the RPi over wifi. This means that the communication link between the rover and the main “Local Base” information computer has been established and works. This was a critical component needed for controlling the rover in our final implementation. The main goal now is ensuring that the hardware functions properly.

System Changes

As of right now, there are no major system changes from the last Team Status Report. We still plan on continuing to implement the system with the original PTZ camera. If the camera video feed cannot be made to function properly soon, we may need to resort to finding new hardware.

Other Updates

There have been no schedule changes or other updates. Our progress has been a bit delayed, since the camera is proving to be very troublesome. As stated last week, we plan on having a full end-to-end system working by the end of this week, though that is mostly a stretch goal by now. The camera video data must be obtained soon, or we will fall behind schedule.

David’s Status Report for 3/23/2024

Accomplished Tasks

This week was the week of the Ethics Lecture. As such, I spent our first class discussing ethics with other teams regarding our projects and theirs. In regards to the project itself, the camera has been running into issues, making concurrent work difficult. I set up an entirely new Raspberry Pi, going through the process of connecting it to wifi and installing a (correct!) Bullseye OS. In terms of actual progress, I installed VNC Viewer on the RPi, allowing remote access and making it much less of a hassle than always needing a spare monitor and keyboard with me. I set up SSH and successfully got scp commands to work from my own computer (the non-VNC Viewer desktop), meaning that the connection to the RPi from our “Local Base” information computer in the final design is established and working.

It turns out that setting up the new RPi also made progress with the camera, allowing us to control it, but unfortunately the camera video data still has issues. That work is still to be done by my teammates. As of writing this post, I am still working out the correct JSON commands to send to the rover. Also, the new smaller batteries for the rover came in!

Progress

My progress is still somewhat on track, though notably dragged behind by not being able to work on the project concurrently with my teammates. By setting up a whole new RPi, I have managed to get my work back on target. The camera video data is still a concern.

Next Week’s Deliverables

Next week, before our meeting with Prof. Kim, I plan to have the rover successfully controlled through code; in other words, to sort out the exact JSON commands needed to move the rover. While this is similar to last week’s goals, that mostly stems from being diverted to helping with the camera and being stalled on my own side. The end-to-end version of the rover is still a stretch goal, due to the camera troubles, but it remains a very important deliverable to try to achieve.

David’s Status Report for 3/16/2024

Accomplished Tasks

This week was a week of intense progress. Early in the week, I realized that hacking the web application was not a suitable approach, because the web app is hosted on the rover’s own wifi network. As such, it would not be possible, or at least rather inconvenient, to try to mesh the rover’s wifi with the school wifi. Thus, I went with the backup plan, which was to communicate with the rover through the onboard Raspberry Pi and send the JSON commands over UART. This required reading Python documentation on how UART communication works and how to send the appropriate JSON commands. The code for this has been fleshed out nearly entirely; the main remaining confusion is the exact JSON command that should be serialized. The documentation is somewhat confusing and inconsistent, and sorting out exactly what is required is where I currently am. I am also helping my teammates work out the camera, which has hit some difficulties in setup. There appear to be issues where the RPi’s OS version causes inconsistencies in downloaded packages.
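
For the record, the general shape of what I am attempting is below, assuming pyserial and a placeholder command format; the port, baud rate, and JSON fields are exactly the things I am still confirming against the documentation:

    import json
    import serial  # pyserial

    # Port and baud rate are assumptions for this sketch; the real values come from
    # the RPi wiring and the rover's documentation.
    ser = serial.Serial("/dev/ttyS0", 115200, timeout=1)

    def send_json_command(cmd: dict):
        """Serialize a command dict and write it over UART, newline-terminated."""
        ser.write((json.dumps(cmd) + "\n").encode("utf-8"))

    # Placeholder command: the exact field names and values are still being confirmed.
    send_json_command({"T": 1, "L": 0.3, "R": 0.3})

The newline termination is also an assumption about how the rover delimits commands; that detail is part of what remains to be verified.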

As a side note, the rover works! It can be controlled with the web app, although the batteries we purchased were too large. I ordered smaller replacement batteries to resolve this.

Progress

My progress is now much more on track. Working out this control code, and having functioning hardware, was a huge success. I am concerned about the camera’s status, which is falling behind, so I will devote effort to making that work as well.

Next Week’s Deliverables

Next week, I plan to have the rover successfully controlled through code; in other words, to sort out the exact JSON commands needed to move the rover. I also want to have an end-to-end version of the rover running, which, at this point, means getting the camera to work properly. This is the highest priority, and I plan to help make sure the camera can function. This could mean working out the bugs, installing a new OS, getting a new camera, etc.

David’s Status Report for 3/9/2024

Accomplished Tasks

These weeks were the weeks of the Design Review and Spring Break. A large chunk of my first week was spent writing portions of the design review. I was primarily in charge of the Design Implementation portion, along with the Trade Studies. This took a rather long time: due to the recent change in our plans (drone -> rover), I had to rework how exactly our project would work. The details of the plans are in the report, but essentially our project now consists of a rover with a mounted camera/laser that will search and point as previously planned. The overall structure of our system now has an information hub, which helps with information control, parsing, and propagation to the three other parts (the rover, the website, and the CV servers).

During Spring Break, I was mostly travelling. However, I also devoted what time I could to looking into the rover communication methods. Since this communication is the most critical piece for rover guidance, I looked into how the pre-made web application sends JSON commands to the rover. I am currently working on interfacing with and/or hacking this web application so that it sends the commands we want instead.

Progress

The original goal was to work out all the rover controls by this Wednesday. Prof. Kim had an even larger request: to have a full end-to-end product working by our meeting on Wednesday. Our rover, which we had ordered prior to Spring Break, was set to arrive over the break, but we have not had any notification of its arrival yet, which is concerning. Since I am in the process of working out the rover control right now, I am somewhat on track, though depending on how much progress I make over the coming few days, I may end up slightly behind schedule. Meeting Prof. Kim’s request would put us ahead of schedule.

Next Week’s Deliverables

Next week, I plan to have the rover fully controllable through code. The stretch goal is to have an end-to-end version of the final rover running by our meeting with Prof. Kim on Wednesday. For me, these goals are essentially the same: thanks to the modularity of our system design, the interactions between the three components should be relatively small. Rover movement commands may need to be packaged and encoded when received from the CV servers, but this should be only a small end portion; getting the rover to move in a generally controlled manner is the main plan and goal.

David’s Status Report for 2/24/2024

Accomplished Tasks

This week was the week for the Design Presentation. I also had an official meeting with the robotics/drone people at their office in Squirrel Hill. While I had hoped to meet with Prof. Basti, he unfortunately could not make it, and I instead met with one of his students, Andrew Jong. The main takeaways from that meeting were:

  • Due to how expensive the drones are, Andrew doubts that Prof. Basti would let us simply take them for our own use. That said, if we were to assist with his fieldwork, that would be alright (although it is more likely that Andrew himself would fly the drone, with us assisting him and his team with their work).
  • Andrew proposes a very interesting set of topics to work on. Notably, there is a project on “wildfire drones”, involving drones assisting with wildfire rescue. The drone is designed to perform CV to find objects (potentially very difficult due to all the smoke), and also aims to plot what it sees on a map, so that we know 1) who/what needs rescuing, 2) how to escape, 3) how the fire moves over time, and more.
  • The focus he proposes is that their mapping currently only plots on a 2D surface, leading to inaccurate maps (since there is no accounting for topography). Being able to “search and 3D-annotate” the surroundings would be extremely useful for them.
  • In closer regard to our project, for further reach, we could also assist in developing our own perception network (the object-detection portion) and sensor modules (the hardware gathering the data for object detection). We could put these together to have a functioning “drone” that we could demonstrate in simulation.

After reporting this discussion to Prof. Kim and Tamal, we had a meeting in which we decided to turn down Andrew’s recommendation and approach our original project with a new outlook. Rather than using a drone (which, while still viable at a high price, ran a much greater risk of breaking and failing), we would use a “rover-esque” platform. The core essence of the project stays the same: Search and Point would still be done, except this time by a rover running along the ground. Thus, a new round of research began on finding a rover with the same capabilities as needed before, namely a software API and the ability to carry some amount of weight. The benefit is that a rover carrying more weight is much more manageable! The main focus now is to find such a rover and re-shape our project to work on a rover instead. (See below for more.)

Progress

The pivot to a rover body puts me behind schedule. That said, if the rover does come with a software API, that could relieve a lot of future issues and save time later. To make up for being behind on obtaining parts, with the help of TA Aden I found that the best option may be the Waveshare WAVE ROVER, which has nearly everything we are looking for (software API, GPIO pins, etc.). I would still need to analyze how the WAVE ROVER controls work (as per my original task of “controlling the rover”), but settling on this rover body would bring us back on track.

Next Week’s Deliverables

Next week, I plan to have everything ordered and requested, particularly the rover. I also plan to analyze how the ESP32 communication works with JSON commands and, more generally, how to control the WAVE ROVER. By our next meeting with Prof. Kim, we should have solidified how this project will be implemented moving forward (and have everything ordered/requested prior to spring break).