Tio’s Status Report for 11/2/2024

WORK ACCOMPLISHED:

This week, I worked with Jason on the VESC, motors, and Raspberry Pi. We spent quite a bit of time debugging the connections between the Pi and the VESC. Eventually we abandoned the GPIO pins in favor of USB connections. We will write the motor-control scripts using a combination of the PyVESC and serial Python modules. In similar fashion, we plan to forego the Qwiic connectors and use a USB cable, the UBlox_GPS module, and serial to control the GPS.
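
As a rough sketch of what the serial GPS path might look like, here is a minimal parser for the standard NMEA GGA sentence a GPS typically emits (pure standard library; the example sentence and the choice to parse raw NMEA rather than use the UBlox_GPS helper are assumptions for illustration):

```python
# Minimal NMEA GGA parsing -- a sketch of the GPS-over-USB-serial path.
# The sentence format is standard NMEA; the sample sentence below is a
# made-up fix near CMU, not real captured data.

def nmea_to_decimal(value: str, hemisphere: str) -> float:
    """Convert NMEA ddmm.mmmm (or dddmm.mmmm) to signed decimal degrees."""
    dot = value.index(".")
    degrees = float(value[: dot - 2])   # everything before the minutes
    minutes = float(value[dot - 2:])    # mm.mmmm
    decimal = degrees + minutes / 60.0
    return -decimal if hemisphere in ("S", "W") else decimal

def parse_gga(sentence: str):
    """Extract (lat, lon) from a $GxGGA sentence; None if there is no fix."""
    fields = sentence.strip().split(",")
    if not fields[0].endswith("GGA") or fields[6] == "0":
        return None  # wrong sentence type, or fix-quality field says no fix
    lat = nmea_to_decimal(fields[2], fields[3])
    lon = nmea_to_decimal(fields[4], fields[5])
    return lat, lon

fix = parse_gga("$GPGGA,123519,4026.55,N,07956.70,W,1,08,0.9,280.0,M,,M,,*47")
# fix ≈ (40.4425, -79.945)
```

On the Pi, lines like this would be read from the USB serial port (e.g. with pyserial's readline) and fed through parse_gga.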

We’ve achieved individual control over the motors, but we’re still working on executing commands on both motors simultaneously. I plan to use Python’s multithreading features to accomplish this.
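
A minimal sketch of the multithreading approach, with send_duty_cycle standing in for the real PyVESC/serial write (its name and signature are assumptions for illustration):

```python
import threading

def send_duty_cycle(motor_id: int, duty: float, log: list) -> None:
    """Placeholder for writing a duty-cycle command to one VESC's port."""
    log.append((motor_id, duty))

def set_both_motors(duty: float, log: list) -> None:
    """Command the left (0) and right (1) motors at the same time."""
    threads = [
        threading.Thread(target=send_duty_cycle, args=(motor_id, duty, log))
        for motor_id in (0, 1)
    ]
    for t in threads:
        t.start()
    for t in threads:   # wait until both commands have been written
        t.join()

commands = []
set_both_motors(0.2, commands)  # both motors receive 20% duty cycle
```

In the real script each thread would hold its own serial handle, since the two VESCs sit on separate USB ports.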

The LiDAR is not yet fully operational on the Pi. Because the Pi runs a Linux environment, we have to build the RealSense SDK from source rather than installing a prebuilt package.

PROGRESS:

I am on track with my tasks for the week, which focused on defining a Python class for the motors containing all the methods we will use to control them. We initially planned to have the return feature completed by this week, but that was an ambitious deadline, set early to leave extra cushion time towards the end. We can slightly relax our expectations, especially because the motor commands are taking much less time than anticipated.

In experimenting with the LiDAR, I’ve found that its depth camera performs very poorly outdoors. Its RGB camera is fine, however, and is already capable of identifying faces and outlining them in a red bounding box. This gave me the idea to propose relaxing our vision requirements to the recognition of a single object (e.g., a traffic cone) as a proof of concept. This is fairly easy to accomplish using OpenCV and a public dataset for recognition.
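
As a toy illustration of the single-object idea (the real version would use OpenCV and a trained model, as described above), even a simple color threshold can isolate something cone-orange and box it; the RGB thresholds below are guesses, not tuned values:

```python
import numpy as np

def cone_mask(rgb: np.ndarray) -> np.ndarray:
    """Boolean mask of pixels that look 'traffic-cone orange'.

    The thresholds are illustrative guesses; a real detector would use
    OpenCV (e.g. cv2.inRange in HSV space) plus a trained classifier.
    """
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    return (r > 180) & (g > 60) & (g < 160) & (b < 90)

def bounding_box(mask: np.ndarray):
    """(x0, y0, x1, y1) extent of the mask, or None if nothing matched."""
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None
    return int(xs.min()), int(ys.min()), int(xs.max()), int(ys.max())
```

The returned box would then be drawn on the RGB frame, the same way the face demo draws its red bounding box.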

NEXT WEEK’S DELIVERABLES:

  • Rewrite GPS data collection on the Raspberry Pi to use USB serial ports.
  • Complete the migration of LiDAR code to the Raspberry Pi environment to ensure seamless operation.
  • Begin testing the return algorithms once the board is fully assembled, using LiDAR data to initiate braking.

Tio’s Status Report for 10/26/2024

WORK ACCOMPLISHED:

This week, I focused on getting the Intel RealSense L515 LiDAR operational. Due to the product being discontinued, I faced challenges with outdated documentation. However, I was able to resolve these issues by installing RealSense SDK v2.50 and pyrealsense module v2.54, which are the latest versions compatible with the L515.

I also wrote a Python program to detect obstructions by raising a signal when a configurable percentage of the LiDAR’s view is blocked within a certain distance. These parameters will be fine-tuned during integration testing to ensure reliable object detection. In addition, I assisted Jason with soldering the battery, VESC, and power switch connections, contributing to the hardware assembly process.
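
The core of that detection logic, stripped of the pyrealsense capture code, looks roughly like this; the 1.5 m range and 20% coverage threshold are placeholders standing in for the parameters we will tune during integration testing:

```python
import numpy as np

def obstruction_flag(depth_m: np.ndarray,
                     max_range_m: float = 1.5,
                     blocked_pct: float = 20.0) -> bool:
    """Signal when too much of the view is blocked up close.

    depth_m is a 2-D array of per-pixel depths in meters, with 0 marking
    pixels that returned no reading. Both thresholds are placeholder
    values to be tuned during integration testing.
    """
    valid = depth_m > 0                      # pixels with a usable return
    near = valid & (depth_m < max_range_m)   # pixels inside the danger zone
    if valid.sum() == 0:
        return False                         # nothing measurable in view
    pct_near = 100.0 * near.sum() / valid.sum()
    return bool(pct_near >= blocked_pct)
```

During integration this would be fed each depth frame from the pyrealsense pipeline, with a raised flag triggering the braking signal.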

PROGRESS:

I am on track with my tasks for the week, which focused on object detection using the LiDAR. However, I still need to migrate the LiDAR code to the Raspberry Pi environment, which will be a priority moving forward.

I haven’t been able to complete the motor functions for the board, because there was a slight delay in the arrival of one of our parts. The board has not been fully assembled yet.

NEXT WEEK’S DELIVERABLES:

  • Set up GPS data collection on the Raspberry Pi.
  • Begin testing the return algorithms once the board is fully assembled, integrating LiDAR data for autonomous recall.
  • Complete the migration of LiDAR code to the Raspberry Pi environment to ensure seamless operation.

Tio’s Status Report for 10/19/2024

WORK ACCOMPLISHED:

The initial parts required for SkateBack have been procured. Before the break, I picked up key components we ordered from the course inventory, including the Intel RealSense LiDAR Camera L515, Raspberry Pi 4, and the SparkFun GPS-RTK Dead Reckoning Breakout (ZED-F9R).

I began the initial setup for the Raspberry Pi 4. Additionally, I tested the LiDAR sensor on my desktop computer and installed the RealSense software on a Windows device to familiarize myself with its operation. I also explored how to run the Intel-provided SDK in a Linux environment and reviewed sample Python code to understand how to interface the LiDAR with the Raspberry Pi effectively.

Unfortunately, I was unable to test the GPS module as the required antenna and connectors to interface it with the Raspberry Pi or other computers are still pending. However, this hasn’t affected progress, as I have continued studying GPS operation and preparing code frameworks to interface with it once the missing parts arrive.

Additionally, we finalized most of our system design in the design report, which now serves as a critical reference document moving forward. The decisions documented there will guide component integration and help align all team members on the project’s direction.

PROGRESS:

I am currently on track with my tasks for the week, focusing primarily on setting up the Raspberry Pi environment and experimenting with sensor data collection. The initial tests with the LiDAR have provided insights into its data streams and will inform the next phase of development.

Moreover, I received notification that additional parts from our order have been delivered. This will allow us to begin assembling the physical skateboard and integrating components next week. With more hardware in hand, the team can shift attention toward building the board and testing individual subsystems.

NEXT WEEK’S DELIVERABLES:

My deliverables for next week are as follows:

  • Test data collection from the sensors (GPS and LiDAR)
  • Define a library of motor control functions
  • Work on sending object-presence signals to the Raspberry Pi
  • Work on the backup algorithm for obstacle avoidance

Tio’s Status Report for 10/5/2024

WORK ACCOMPLISHED:

Part Changes: Accuracy Is Improving and Costs Are Going Down – The two major milestones for this week were the design presentation and filling out the order forms for our parts. I gave the design presentation on 10/2 and it was well received. We also had our parts finalized by then, so we filled out the order form after class. A few changes have been made to our design. The course staff have elected to buy an RTK-GPS breakout for the course inventory, and we will be using it rather than buying one with our own budget. The model they chose was one we initially considered but disregarded because, at $289, it would have taken up close to half of our budget. The SparkFun GPS-RTK Dead Reckoning Breakout – ZED-F9R that we will now be using boasts far superior precision (0.01 m horizontal accuracy with a correction stream from a base station, one of which is located atop Hamerschlag Hall). The LiDAR we will be using has also changed: Joshna picked out the Intel RealSense LiDAR Camera L515 from the course inventory, which we had not previously considered. With these two part changes, we’re saving about $140, which resolves our previous concerns about going over budget. One replacement that hasn’t changed our budget was the transition from an RPi 5 8GB to an RPi 4 8GB, made because of the RPi 4’s lower power consumption and our decision to power the RPi and its accessories with a rechargeable power bank over USB-C. I expect we will begin receiving parts next week, which might allow us to get a bit of assembly work in before Fall Break.

The UTM Coordinate System – Another promising development this week concerned the coordinate system I mentioned in my last status update. My idea to project longitude and latitude onto a flat x-y grid already exists in the form of Universal Transverse Mercator (UTM) coordinates. UTM works by dividing the world into a grid, where each grid cell has a unique identifier. A UTM coordinate consists of the grid cell identifier and the “easting” and “northing” within that cell. The easting (x) and northing (y) are in meters, relative to the southwest corner of the grid cell. For example, Pittsburgh is in UTM Zone 17T, and Doherty Hall (where I’m currently typing this report) has northing 4477403.01 and easting 589501.46. After talking to a friend in CIA who works on the buggy’s RTK kits, I found a Python library that CMU Roboclub uses to convert GPS readings to UTM and vice versa. This is a big find, as I was worried that implementing our own grid system and conversions would be fraught with error.
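
The appeal of UTM is that, once fixes are converted (the library mentioned above is assumed here to behave like the `utm` package on PyPI, whose from_latlon returns easting/northing in meters), navigation math reduces to flat 2-D geometry. A minimal sketch using the Doherty Hall figures above and a hypothetical second point:

```python
import math

# Within one UTM zone, distance and heading between two fixes are plain
# planar geometry in meters -- no spherical trigonometry needed.

def planar_distance(e1: float, n1: float, e2: float, n2: float) -> float:
    """Straight-line distance in meters between two points in one UTM zone."""
    return math.hypot(e2 - e1, n2 - n1)

def heading_deg(e1: float, n1: float, e2: float, n2: float) -> float:
    """Compass-style bearing (0 = north, 90 = east) from point 1 to point 2."""
    return math.degrees(math.atan2(e2 - e1, n2 - n1)) % 360.0

# Doherty Hall (from the report) to a hypothetical point 30 m east, 40 m north:
d = planar_distance(589501.46, 4477403.01, 589531.46, 4477443.01)  # ≈ 50.0 m
```

The same two functions are all the return algorithm needs to steer the board toward a target fix.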

Tracking the Rider via their Phone – This week I explored the different alternatives for determining the location of the rider, which the skateboard will attempt to find its way back to. So far I’ve thought of two approaches:

  • Using Wi-Fi or cellular positioning via a geolocation API: When the user decides to start the “return to me” process, the webapp requests the host device’s position from a geolocation API. There is a wide array of APIs available, varying in both precision and pricing, but I doubt any are as accurate as readings from a GPS chip, which brings me to our next option.
  • Leveraging the single GPS chip we already have: The basic idea is that the rider and skateboard share the same location whenever the rider is on the board. If we could detect a separation event (like the rider falling off) by measuring an acceleration spike with the IMU embedded in the GPS breakout (similar to how HDDs detect when they’ve been dropped), then we would use the GPS chip’s last reading before separation as the return destination. This has its own limitations, though: the user would have to stay where they fell and start the skateboard callback, and it would be limited by the accuracy of separation detection. Also, what if you wanted the board to simply come back to you from somewhere you placed it?

Ultimately I imagine some combination of these two approaches will be best.
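
To make the second option concrete, here is a minimal sketch of separation detection; the 3 g spike threshold and the (acceleration, fix) stream format are illustrative assumptions, not measured values:

```python
# Watch accelerometer magnitude for a spike and freeze the last GPS fix
# before it as the return target. The threshold below is a placeholder
# to be tuned against real IMU data from the breakout.

SPIKE_THRESHOLD_G = 3.0  # assumed value, not yet validated on hardware

def find_return_target(samples):
    """samples: iterable of (accel_magnitude_g, (lat, lon)) readings.

    Returns the fix recorded just before the first spike, or None if no
    separation event was seen.
    """
    last_fix = None
    for accel_g, fix in samples:
        if accel_g >= SPIKE_THRESHOLD_G:
            return last_fix   # the fix from just before the spike
        last_fix = fix
    return None
```

In the combined approach, this fix would be cross-checked against (or replaced by) the phone's geolocation reading when the rider starts the callback.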

PROGRESS:

We are back on schedule now that we’ve ordered parts, and I expect we will begin receiving them next week. We’ll be able to do some early unit assembly and testing once we receive the RPi, LiDAR, and GPS. Even without those parts, I am currently working on the software that will go on the RPi, such as location-data parsing and collation.

NEXT WEEK’S DELIVERABLES:

My main target for next week is to begin work on assembling the board. This is ultimately dependent on when our parts arrive, but any progress we make before fall break will be a huge plus. After fall break we need to hit the ground running if we hope to meet the rest of our deadlines.

I am a bit concerned about a few design elements that are still up in the air. Working on the design report due next week will help us to finalize these choices and have a single reference point during all the actual implementation that will follow after fall break.

 

Tio’s Status Report for 9/28/24

WORK ACCOMPLISHED:

Parts List: This week, I continued finalizing design choices for our project. One promising find was a SparkFun GPS breakout that comes with an integrated IMU (SparkFun GPS Dead Reckoning Breakout – NEO-M8U (Qwiic)). This breakout would help us save money on parts because we would no longer need to purchase a separate IMU or extra cables. Additionally, this chip supports u-blox’s Untethered Dead Reckoning (UDR) technology, which integrates IMU readings to improve GPS accuracy, meaning we might no longer have to perform these calculations ourselves. This breakout needs an external U.FL antenna. I’ve decided on this model both because of its price and because it comes with an adhesive backing that would be convenient for sticking to the underside of the board.

Also, after an insightful meeting with Prof. Bain and Joshna, I learned a possible solution to the issue of the Pi HAT taking up all the GPIO pins. I found a GPIO expansion/splitter that would allow us to put on the Pi HAT while still leaving other pins (such as the PWM pins we’ll need for the VESC) exposed.

In this same meeting I proposed my idea for navigation, which involves projecting longitude and latitude coordinates onto an x-y grid of a finite square area. It was met with positive feedback, which gave me some confidence about our approach. We also decided that we would draw power for the RPi and its accessories from a rechargeable power bank over USB-C.

PROGRESS:

I’m slightly behind schedule, but with good reason. We had planned to have a parts list finalized at this point, but the recent finds are worth the slight delay. Thankfully, the RPi we plan to use is available from the course inventory, so I expect we will be able to get our hands on one fairly soon. The other parts needed for GPS tracking (the antenna, GPS breakout, and Qwiic HAT) are also available with quick delivery.

NEXT WEEK’S DELIVERABLES:

Next week, after acquiring the RPi and GPS, I plan to write the basics of the tracking code, such as establishing its location on startup and determining the distance between two points. I’ll also work with Sharon so we’re able to share this data with the webapp.
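
As a sketch of the distance computation, the haversine formula gives the great-circle distance between two raw GPS fixes (pure standard library; the 6371 km mean Earth radius is the usual approximation):

```python
import math

def haversine_m(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance in meters between two (lat, lon) fixes."""
    R = 6371000.0  # mean Earth radius in meters (standard approximation)
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))
```

At skateboard scales (tens of meters) this agrees closely with the flat-grid approach I proposed, so either could back the tracking code.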

Tio’s Status Report for 9/21/24

WORK ACCOMPLISHED:

Parts List: This week I focused on creating an extensive document that details our choices for hardware components, especially the Raspberry Pi, GPS and IMU. I’ve decided on a model for the RPi and I’ve narrowed it down to two models for the GPS and IMU each.

My decision-making criteria were price, accuracy, and ease of communication. We’re going to leverage the Qwiic ecosystem that SparkFun provides with their components, which removes the need for soldering. They’re also daisy-chainable, but with the Qwiic HAT for the Pi, we might not need to do that.

I spent a lot of time envisioning how the different parts of the board will communicate. One concern I had is that the Qwiic HAT will take up all the pins on the RPi’s GPIO header. Two of those pins (UART) would be needed to communicate with the VESC (speed controller). After some further research, I discovered that Pi 5 has a UART connector separate from the pins on the GPIO header. I’m not yet sure if that’ll be sufficient for communicating with the VESC.

Additionally, the GPS and IMU components I’ve been looking at don’t have well documented Python Packages. I had been hoping to use Python but Arduino libraries for these components have better documentation.

PROGRESS:

I’m currently on schedule with my tasks. We planned to have a finalized parts list by Monday (09/23), and we’re on track to complete it after a few discussions with course staff.

NEXT WEEK’S DELIVERABLES:

Next week, I plan to focus on outlining the software stack for the RPi, sensors, and motors. I’d like to make an additional block diagram focusing on the software components on the Pi that will connect to the webapp, poll the sensors, and signal the VESC (if needed).

I would like to get my hands on a Pi 5 as soon as possible so we can begin preliminary tests like connecting to the website, creating logs, etc. We can borrow one from the course inventory, so I will put in a request ASAP.