Tio’s Status Report for 12/07/2024

WORK ACCOMPLISHED:

This week, we made significant progress toward finalizing our project. A major milestone was the successful soldering of the U.FL connector onto the Feather M0, enabling improved GPS-RTK functionality. Additionally, we have been debugging and integrating the RTCM stream into our navigation system, a critical step for enhancing positional accuracy.
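At its core, the RTCM integration amounts to piping correction bytes from a correction source into the receiver. A rough, hardware-free sketch of that plumbing, where the stream objects are stand-ins for our actual correction source and the receiver's serial port:

```python
import io


def forward_rtcm(source, gps_port, chunk_size=1024):
    """Forward raw RTCM correction bytes to the GPS receiver.

    source:   file-like object yielding RTCM bytes (a stand-in here for
              whatever correction stream is used)
    gps_port: file-like object for the receiver's serial port
    Returns the total number of bytes forwarded.
    """
    total = 0
    while True:
        chunk = source.read(chunk_size)
        if not chunk:          # stream closed / no more corrections
            return total
        gps_port.write(chunk)  # the ZED-F9R consumes RTCM directly
        total += len(chunk)


# Exercise the plumbing with in-memory streams instead of hardware:
corrections = io.BytesIO(b"\xd3\x00\x13" * 4)  # RTCM 3 frames start with 0xD3
out = io.BytesIO()
sent = forward_rtcm(corrections, out)
```

Keeping the function agnostic to where the bytes come from makes the forwarding loop testable without the board attached.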

PROGRESS:

We are entering the final stages of the project and are focusing on refining our existing features. While the LiDAR remains largely retired from the autonomous system, we plan to demonstrate its functionality through an intermediate setup in which a laptop bridges the sensor and the Raspberry Pi. All other major features are complete, and we are honing the navigation and return algorithms to ensure reliability in the final demo.

NEXT WEEK’S DELIVERABLES:

  • Final Demo: Prepare and refine the features we will showcase to deliver a concise and impactful presentation.
  • Final Poster: Design an engaging and informative poster summarizing our project.
  • Final Report: Finalize the comprehensive report documenting our methodology, results, and lessons learned.

We are on track to deliver a successful project and demonstrate the culmination of our efforts.

Tio’s Status Report for 11/30/2024

WORK ACCOMPLISHED:

The major milestone we crossed since our last status report was our interim demo. We showcased all our work so far, including manual control through the mobile application and a script-controlled return with turns to avoid a mock obstacle. We also have all the parts required for GPS-RTK and will do some quick assembly before our final demo; our return algorithm is already implemented.

PROGRESS:

We are entering the final stages of our project, and our progress reflects that. We’ve largely retired the LiDAR, but we hope to demo it attached to a laptop acting as an intermediary between the device and the Raspberry Pi.

NEXT WEEK’S DELIVERABLES:

  • Final Demo: Refine and decide which features we will show in our final demo so that it stays within the time limit and remains effective
  • Final Demo Slides
  • Final Poster
  • Final Report

 

Team Status Report for 11/16/2024

Significant Risks + Management
One significant risk this week is that the readings we receive from our GPS chip without RTK support are far too inaccurate for us to achieve successful autonomous returns. Accordingly, we sought advice from Roboclub given their experience using GPS-RTK for robobuggy. They gave us a short list of the hardware we need, and we have already placed the orders.

Design Changes

We will no longer be buying a deck; instead, we will 3D print one using the printers available in Roboclub.

Schedule Changes

The core task schedule remains the same. We will likely have to dip into the extra testing time we reserved at the end to get through small items such as the LiDAR connection and any Bluetooth mishaps.

Progress

As of this week, our project has reached its MVP of being a fully functional electric skateboard that is controlled by a web application, with the ability to accelerate, decelerate, and reverse at the press of a button. Sharon successfully connected the phone application’s backend Bluetooth signals to the software on the skateboard’s Raspberry Pi.

As we approach the final physical layout for the electronics on the skateboard, Jason has begun printing the enclosure we will use to secure them.

Tio is working on the location and orientation aspects of the skateboard’s autonomous return features, while Jason has put in many hours on the computer vision aspect.

Sharon worked on integrating the backend Bluetooth signals from the mobile app with the Raspberry Pi on the skateboard. This involved addressing challenges with outdated libraries by utilizing a specific fork of Bleno and implementing a socket-based communication flow, which improved responsiveness and reduced latency for motor control commands. Sharon also worked on refining the GPS accuracy for the ‘Return to Me’ feature by testing various filtering techniques, including the Kalman filter, to smooth out location inconsistencies. Additionally, Sharon refined the app interface, introducing an emergency stop button requiring a double-tap to activate for added safety and removing the reverse functionality to simplify user interactions.
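As a rough illustration of the filtering approach described above, here is a minimal 1-D constant-position Kalman filter applied independently to each coordinate. The noise variances are hypothetical placeholders, not our tuned values:

```python
class Kalman1D:
    """Minimal 1-D constant-position Kalman filter for smoothing noisy GPS fixes.

    q: process noise variance (how much the true position may drift per step)
    r: measurement noise variance (how noisy each raw GPS fix is)
    Both values here are illustrative, not tuned.
    """

    def __init__(self, q=1e-3, r=4.0):
        self.q, self.r = q, r
        self.x = None   # filtered estimate
        self.p = 1.0    # estimate variance

    def update(self, z):
        if self.x is None:               # first fix initializes the state
            self.x = z
            return self.x
        self.p += self.q                 # predict: variance grows by process noise
        k = self.p / (self.p + self.r)   # Kalman gain
        self.x += k * (z - self.x)       # correct toward the measurement
        self.p *= (1.0 - k)
        return self.x


# Smooth latitude and longitude independently with two filters.
lat_f, lon_f = Kalman1D(), Kalman1D()
raw = [(40.4433, -79.9436), (40.4435, -79.9433), (40.4431, -79.9438)]
smoothed = [(lat_f.update(la), lon_f.update(lo)) for la, lo in raw]
```

Each filtered fix lands between the previous estimate and the new raw reading, which damps the jumpiness of consecutive GPS fixes without adding noticeable lag.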

Verification Testing Plans

Speed Test

  • Measurement: 15 mph ± 1 mph
    Test Input: Accelerate and decelerate on varied terrain (flat, inclined), rider weight, and motor load.
    Test Output: Speed should remain within 14-16 mph.
    Risks: Failure to maintain consistent speed or to reach top speed.

Battery Efficiency & Range

  • Measurement: 5 miles ± 0.25 miles per charge.
    Test Input: Continuous ride over varied terrain and rider loads (150-240 lbs).
    Test Output: Travel 5 miles on a single charge.
    Risks: Battery drains too quickly, insufficient power for “Return to Me” function.

Return to Me Accuracy

  • Measurement: 80% success rate within 1-meter margin.
    Test Input: Recall skateboard over varying distances (5m, 10m, 50m, etc.) with/without obstacles.
    Test Output: Skateboard returns to the user smoothly and is retrieved within the 1-meter margin.
    Risks: Pathfinding issues due to GPS/IMU inaccuracies.

Obstacle Detection

  • Measurement: Detect objects within 100ms, 90% accuracy.
    Test Input: Set obstacle course with varied object sizes (rocks, trees, etc.).
    Test Output: Skateboard successfully avoids obstacles within design requirements.
    Risks: Slow detection or no detection at all, especially for fast-moving or small obstacles.

Latency Test

  • Measurement: Command response ≤ 100ms.
    Test Input: Send commands from web app (accelerate, decelerate, etc.).
    Test Output: Response time should be ≤ 100ms.
    Risks: Bluetooth disconnects, delayed execution of commands.

End-to-End Integration

  • Measurement: No interruptions in 2+ mile trip.
    Test Input: Combine all features, ride continuously for 2 miles.
    Test Output: System functions smoothly for the entire trip.
    Risks: Loss of connectivity or inconsistency between components.
  • Bluetooth Integration Testing:
    • Test Objective: Ensure that the Bluetooth communication between the mobile app and Raspberry Pi is responsive and reliable.
    • Measurement: Command latency should be ≤100ms.
    • Methodology: Use timestamps in app commands and Raspberry Pi responses to calculate round-trip latency. Test commands (e.g., accelerate, decelerate) across different distances (1m, 5m, 10m) to check for consistency.
    • Anticipated Results: Latency within the design specification, with no more than one disconnect per hour.
    • Analysis Plan: Compare measured latency and disconnect frequency against benchmarks. If the results exceed acceptable limits, investigate interference or library issues for optimization.

 

  • GPS Accuracy Testing:
    • Test Objective: Improve location tracking accuracy to within a 4-meter margin.
    • Measurement: Distance error in meters between actual and reported locations.
    • Methodology: Perform controlled outdoor tests using known fixed points as references. Apply filtering techniques like Kalman filters and compare pre- and post-filtered results.
    • Anticipated Results: Average error ≤4m, with reduced noise in filtered data.
    • Analysis Plan: Evaluate filtered GPS data for consistency and repeatability across multiple tests. Document limitations and refine filters or hardware configurations as necessary.
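The round-trip latency methodology in the Bluetooth test plan above can be sketched as a small timing harness. Here `send_command` is a stand-in for whatever call transmits a command and blocks until the Pi’s acknowledgement arrives:

```python
import time


def measure_latency(send_command, n_trials=20):
    """Round-trip latency of a command channel, in milliseconds.

    send_command: any callable that transmits one command and blocks
    until acknowledged (hypothetical interface, not our actual BLE code).
    """
    samples = []
    for _ in range(n_trials):
        t0 = time.monotonic()
        send_command("accelerate")
        samples.append((time.monotonic() - t0) * 1000.0)
    samples.sort()
    return {
        "mean_ms": sum(samples) / len(samples),
        "p95_ms": samples[int(0.95 * (len(samples) - 1))],
        "max_ms": samples[-1],
    }


# Exercise the harness against a stub that just sleeps ~10 ms,
# standing in for the real BLE round trip:
stats = measure_latency(lambda cmd: time.sleep(0.010), n_trials=5)
```

Reporting a 95th percentile alongside the mean catches the occasional slow command that a mean alone would hide, which matters against a hard ≤100 ms spec.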

Tio’s Status Report for 11/16/2024

WORK ACCOMPLISHED:

A lot of progress was made this week, and we can confidently say that we now meet the MVP we defined in our abstract: “A fully functional electric skateboard that is controlled by a web application. You will be able to accelerate, decelerate, and reverse all with just a phone.” Sharon was able to integrate the remote control app with the defined functions for controlling the skateboard, and we’ve been able to verify that the controls are smooth, responsive and intuitive.

I began writing the return algorithm and worked with Jason to devise a control flowchart to guide our implementation. While the general approach remains unchanged, I am blocked by the inaccuracy of the ZED-F9R when it doesn’t use RTK. To try to rectify this, Jason and I met with members of Roboclub for help. They gave us a parts list and some sample code we’ll need. I’ve placed the orders, and in the meantime I will continue as planned and finish the code for the autonomous return.
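Once positions are expressed in UTM meters, the core geometry of the return algorithm reduces to a bearing toward the user and a signed steering error. A sketch under that assumption (function names are illustrative, not our actual code):

```python
import math


def heading_and_distance(current, target):
    """Bearing (degrees, 0 = north, clockwise) and straight-line distance
    (meters) from the board's position to the user's position.

    current / target are (easting, northing) tuples in UTM meters.
    """
    de = target[0] - current[0]   # east offset
    dn = target[1] - current[1]   # north offset
    bearing = math.degrees(math.atan2(de, dn)) % 360.0
    return bearing, math.hypot(de, dn)


def steering_error(board_heading, bearing):
    """Signed turn needed, in [-180, 180); positive means turn right."""
    return (bearing - board_heading + 180.0) % 360.0 - 180.0


# Example: user is 100 m due east; board faces north -> turn right 90 degrees.
brg, dist = heading_and_distance((500100.0, 4477000.0), (500200.0, 4477000.0))
err = steering_error(0.0, brg)
```

Wrapping the error into [-180, 180) keeps the board from turning the long way around when the bearing and heading straddle north.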

PROGRESS:

Now that we have reached MVP for our project, I am ahead of schedule on all my non-autonomous tasks, such as writing and testing the motor command class. For the autonomous-operation tasks, I am on track, as the next few weeks are dedicated to testing and revisions.

NEXT WEEK’S DELIVERABLES:

  • Completing the return algorithm: The pseudocode and flowchart are laid out; all that remains is to write the code. This can be accomplished in relatively few work hours.
  • Integrating RTK-supporting hardware: After consulting with Roboclub, we received a list of additional hardware and software to integrate in order to get highly accurate readings via RTK with the GPS we already have.
  • Validation testing: I aim to carry out some additional unit tests to verify the performance of my contributions to the project, namely:
    • Speed test: Accelerate and decelerate on varying terrain slopes, rider weight, and motor load. Speed should remain within 14-16 mph.
    • Return to Me Accuracy: Recall skateboard over varying distances (5m, 10m, 50m, etc.) with/without obstacles. Skateboard should return to the user and be retrieved within a 1-meter radius of the user’s position.

 

Tio’s Status Report for 11/09/2024

WORK ACCOMPLISHED

This week marked significant progress in multiple areas of the project. I successfully achieved individual motor control by leveraging Python’s threading module, assigning each motor to its own thread for independent operation. To test this setup, I developed a simple interface that allows motor control using keyboard inputs to simulate signals from the skateboard’s remote control app. Additionally, I implemented an emergency stop routine to ensure the motors and VESC can be safely shut down in case of unexpected issues. I also improved the motor control logic by rewriting the command structure to support reverse motion, enhancing the skateboard’s maneuverability.

With individual motor control established, we conducted our first turning tests in the space between Scaife and TechSpark. Initial results were underwhelming, but after switching to a shorter deck and tightening the trucks, we achieved consistent performance. These adjustments provided the stability needed to proceed confidently with further testing.

Beyond motor control, I integrated functionality for reading GPS data from the ZED-F9R breakout module. I also incorporated a library to convert latitude and longitude coordinates into UTM (Universal Transverse Mercator) format, enabling path-planning calculations in meters on an x-y grid. This advancement will be instrumental in refining our return algorithms.
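The actual conversion uses a UTM library, but the idea of turning lat/lon into meters on a local x-y grid can be illustrated with a simple equirectangular approximation. This is adequate over a few hundred meters and is a stand-in for, not a substitute for, true UTM:

```python
import math

EARTH_RADIUS_M = 6_371_000.0


def latlon_to_local_xy(lat, lon, origin_lat, origin_lon):
    """Project a lat/lon fix to (x, y) meters on a local grid centered at
    the origin. Equirectangular approximation: fine at skateboard-return
    scales, unlike the library-backed UTM conversion used in the project.
    """
    x = math.radians(lon - origin_lon) * EARTH_RADIUS_M * math.cos(math.radians(origin_lat))
    y = math.radians(lat - origin_lat) * EARTH_RADIUS_M
    return x, y


# A fix 0.001 degrees of latitude north of the origin lands roughly 111 m up the y-axis:
x, y = latlon_to_local_xy(40.4443, -79.9436, 40.4433, -79.9436)
```

Working in meters on an x-y grid is what makes the path-planning arithmetic (headings, distances, waypoints) straightforward.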

PROGRESS

While I am on track with my software-related tasks, the overall progress on path-planning and return functionality has fallen slightly behind schedule. This delay is due to several factors, including challenges with the LiDAR integration, underestimating certain technical hurdles, and delays in part deliveries. Despite these setbacks, our motor control system and GPS integration provide a strong foundation for the next phase.

NEXT WEEK’S DELIVERABLES

Next week, our primary objectives are twofold:

  1. Integrating the remote control app: This will involve syncing the app’s signals with the motor control system. To facilitate this, I’ve prepared a brief README outlining the unique aspects of our motor control setup.
  2. Initiating return algorithm testing: Using the GPS and motor control framework, we will begin testing the skateboard’s ability to autonomously return to the user.

These tasks will bring us closer to achieving the project’s core functionality and ensure we remain aligned with our overall timeline.

Team Status Report for 11/2/2024

Significant Risks + Management
One significant risk this week involved challenges with the Raspberry Pi to VESC and GPS connections. Initial attempts to use GPIO for motor control proved unreliable, so we opted to switch to USB communication, which is now working effectively. This change helped address stability issues, though it required adjusting our setup and code. Additionally, there is a risk associated with synchronizing motor commands to achieve simultaneous control, which we plan to mitigate using Python’s multithreading capabilities.

The Intel RealSense LiDAR Camera L515 also poses a challenge. The product’s discontinuation means we’re using outdated SDK versions, which introduced complications with the Raspberry Pi’s Linux environment. Building the SDK from source has proven necessary, which could further delay integration. The team is prioritizing the completion of the LiDAR setup on the Raspberry Pi to avoid future bottlenecks.

Design Changes

No major design changes were implemented this week. However, based on the LiDAR’s outdoor performance limitations, we are considering relaxing our object detection requirements. Instead of general obstacle avoidance, we may focus on recognizing a specific object, such as a traffic cone, to simplify our proof of concept. This shift would allow us to use the LiDAR’s RGB camera with OpenCV for object detection and refine our recognition parameters later if required.

Schedule Changes

The project schedule has been updated to account for the switch to USB communication for both the VESC and GPS modules. This modification may shift certain tasks, like debugging multithreaded motor control, into the upcoming week. Additionally, the timeline reflects ongoing efforts to migrate LiDAR code to the Raspberry Pi. Despite these adjustments, core tasks such as motor control development and return algorithm testing remain on track with project milestones.

Progress
This week, the team achieved individual control over the motors through USB connections, addressing GPIO-related stability issues. The plan now is to synchronize motor commands using multithreading to enable simultaneous operation. The team also began building a Python class for motor control, which will centralize all control methods.

For the LiDAR, we encountered difficulties due to the need to build the SDK from source on the Raspberry Pi’s Linux environment. However, initial tests with the LiDAR’s RGB camera showed reliable object detection indoors, leading us to consider narrowing our vision requirements.

We soldered connections for the battery, VESC, and power switch, completing crucial steps toward the skateboard’s assembly. While GPS data collection has been postponed until the USB setup is finalized, we are prepared to implement the return algorithms as soon as the board assembly is complete.

We completed the backend server setup on the mobile app, enabling it to communicate effectively with the Raspberry Pi. This involved creating key API endpoints for controlling and monitoring the skateboard, as well as connecting these endpoints to the app’s frontend. This setup establishes a solid foundation for real-time interaction and prepares the app for seamless control over skateboard components.

We also added new functionality allowing users to control skateboard actions via the phone’s volume buttons and ringer controls. By setting up listeners for these hardware buttons, we enabled alternative control options within the app, making adjustments more intuitive and accessible without relying solely on touchscreen inputs. This feature provides added convenience, especially for quick or hands-free adjustments.

Lastly, to enhance the app’s location-based capabilities, we implemented GPS functionality using a React Native library, configuring it for accurate tracking and smooth operation. We are testing extensively, adjusting various settings within the library to improve location precision. Additionally, we set up frontend permissions to enable GPS access, providing users with a reliable, permission-secured experience during location tracking and navigation.

Tio’s Status Report for 11/2/2024

WORK ACCOMPLISHED:

This week, I worked with Jason on the VESC, motors, and Raspberry Pi. We spent quite a bit of time debugging the connections between the Pi and the VESC. Eventually we abandoned the GPIO pins in favor of USB connections. We will write the scripting for the motors using a combination of the PyVESC and serial Python modules. In similar fashion, we plan to forego the Qwiic connectors and use a USB cable, the UBlox_GPS module, and serial to control the GPS.

We’ve achieved individual control over the motors, but we’re still trying to execute instructions simultaneously. I plan to use Python’s multithreading features to accomplish this.

The LiDAR is not yet fully operational on the Pi. Due to its Linux environment, we have to build the SDK from its source files rather than just installing it.

PROGRESS:

I am on track with my tasks for the week, which focused on defining a Python class for the motors containing all the methods we will use to control them. We initially planned to have the return feature completed by this week, but that was an ambitious deadline set to leave extra cushion time toward the end. We can slightly relax our expectations, especially because the motor commands are taking much less time than anticipated.

In experimenting with the LiDAR, I’ve found that its depth camera performs very poorly outdoors. Its RGB camera is fine, however, and is already capable of identifying faces and outlining them in a red bounding box. This gave me the idea to propose relaxing our vision requirements to the recognition of a single object (e.g., a traffic cone) as a proof of concept. This is fairly easy to accomplish using OpenCV and a public dataset for recognition.
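As a toy version of that relaxed requirement, a plain color-threshold check over an RGB frame already captures the idea. The thresholds below are illustrative guesses; the real version would use OpenCV with calibrated color ranges and a trained detector:

```python
import numpy as np


def orange_fraction(rgb):
    """Fraction of pixels that look traffic-cone orange in an HxWx3 uint8
    RGB frame. A crude stand-in for the OpenCV color-segmentation step;
    the channel thresholds are illustrative, not tuned values.
    """
    r = rgb[..., 0].astype(np.int16)
    g = rgb[..., 1].astype(np.int16)
    b = rgb[..., 2].astype(np.int16)
    mask = (r > 180) & (g > 40) & (g < 140) & (b < 80)
    return float(mask.mean())


def cone_detected(rgb, min_fraction=0.02):
    """Flag a cone when enough of the frame is orange (hypothetical cutoff)."""
    return orange_fraction(rgb) >= min_fraction
```

Narrowing the target to one distinctively colored object is exactly what makes this kind of cheap segmentation viable as a proof of concept.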

NEXT WEEK’S DELIVERABLES

  • Rewrite GPS data collection on the Raspberry Pi to use USB serial ports.
  • Complete the migration of LiDAR code to the Raspberry Pi environment to ensure seamless operation
  • Begin testing the return algorithms once the board is fully assembled, using LiDAR data to initiate braking.

Tio’s Status Report for 10/26/2024

WORK ACCOMPLISHED:

This week, I focused on getting the Intel RealSense L515 LiDAR operational. Due to the product being discontinued, I faced challenges with outdated documentation. However, I was able to resolve these issues by installing RealSense SDK v2.50 and pyrealsense module v2.54, which are the latest versions compatible with the L515.

I also wrote a Python program to detect obstructions by raising a signal when a configurable percentage of the LiDAR’s view is blocked within a certain distance. These parameters will be fine-tuned during integration testing to ensure reliable object detection. In addition, I assisted Jason with soldering the battery, VESC, and power switch connections, contributing to the hardware assembly process.
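The obstruction logic described above boils down to counting near pixels in the depth image. A hardware-free sketch over a NumPy depth array, with the configurable parameters shown as placeholders to be tuned during integration testing:

```python
import numpy as np


def fraction_blocked(depth_m, max_dist_m=1.5):
    """Fraction of the LiDAR's view occupied by something closer than
    max_dist_m. depth_m is an HxW array of depths in meters; zeros are
    invalid returns and are ignored, as in the RealSense depth stream.
    """
    valid = depth_m > 0
    if not valid.any():
        return 0.0
    return float(((depth_m < max_dist_m) & valid).sum() / valid.sum())


def obstruction_signal(depth_m, max_dist_m=1.5, blocked_pct=0.30):
    """Raise the stop signal when the blocked fraction exceeds the
    configurable threshold. Both parameter values here are placeholders.
    """
    return fraction_blocked(depth_m, max_dist_m) >= blocked_pct
```

Ignoring zero-depth pixels matters in practice: outdoors the L515 returns many invalid pixels, and counting them as "near" would trigger constant false stops.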

PROGRESS:

I am on track with my tasks for the week, which focused on object detection using the LiDAR. However, I still need to migrate the LiDAR code to the Raspberry Pi environment, which will be a priority moving forward.

I haven’t been able to complete the motor functions for the board, because there was a slight delay in the arrival of one of our parts. The board has not been fully assembled yet.

NEXT WEEK’S DELIVERABLES

  • Set up GPS data collection on the Raspberry Pi.
  • Begin testing the return algorithms once the board is fully assembled, integrating LiDAR data for autonomous recall.
  • Complete the migration of LiDAR code to the Raspberry Pi environment to ensure seamless operation.

Team Status Report for 10/26/2024

Significant Risks + Management

One significant risk this week involved challenges with the Intel RealSense LiDAR Camera L515. Since the product is discontinued, the team faced issues with outdated documentation and incompatible software versions. This risk was managed by identifying compatible versions of the RealSense SDK (v2.50) and the pyrealsense module (v2.54), ensuring the LiDAR can still be integrated into the project.

There is also a risk associated with migrating the LiDAR code to the Raspberry Pi environment. This task is currently in progress, but further delays could impact the testing schedule. The team is prioritizing this migration to avoid potential bottlenecks.

Design Changes

No significant design changes were made this week. However, the team has discussed the continued relevance of backend functionality. If backend development is pursued, it may require additional modifications to the web application and data management strategies.

Schedule Changes

The project timeline has been adjusted to reflect the delay in LiDAR migration to the Raspberry Pi. Despite these setbacks, critical tasks such as sensor integration, motor control development, and return algorithm testing remain aligned with the overall project schedule.

Progress

This week, the team successfully configured the LiDAR sensor by resolving software compatibility issues. The necessary versions of the SDK and Python libraries were identified and installed, allowing the team to write a program that raises a signal if a variable percentage of the LiDAR’s view is blocked within a specified distance. These parameters will be further refined during integration testing.

The team also soldered the battery, VESC, and power switch connections, ensuring the propulsion system is ready for assembly. While LiDAR functionality has been tested on desktop, migration to the Raspberry Pi environment is ongoing. In parallel, GPS integration frameworks are prepared to ensure smooth setup once missing components arrive.

The team also completed the Bluetooth setup on the Raspberry Pi, enabling automatic BLE service activation on boot. This functionality was tested successfully, allowing the app to find and pair with the Pi reliably.

Tio’s Status Report for 10/19/2024

WORK ACCOMPLISHED:

The initial parts required for SkateBack have been procured. Before the break, I picked up key components we ordered from the course inventory, including the Intel RealSense LiDAR Camera L515, Raspberry Pi 4, and the SparkFun GPS-RTK Dead Reckoning Breakout (ZED-F9R).

I began the initial setup for the Raspberry Pi 4. Additionally, I tested the LiDAR sensor on my desktop computer and installed the RealSense software on a Windows device to familiarize myself with its operation. I also explored how to run the Intel-provided SDK in a Linux environment and reviewed sample Python code to understand how to interface the LiDAR with the Raspberry Pi effectively.

Unfortunately, I was unable to test the GPS module as the required antenna and connectors to interface it with the Raspberry Pi or other computers are still pending. However, this hasn’t affected progress, as I have continued studying GPS operation and preparing code frameworks to interface with it once the missing parts arrive.

Additionally, we finalized most of our system design in the design report, which now serves as a critical reference document moving forward. The decisions documented there will guide component integration and help align all team members on the project’s direction.

PROGRESS:

I am currently on track with my tasks for the week, focusing primarily on setting up the Raspberry Pi environment and experimenting with sensor data collection. The initial tests with the LiDAR have provided insights into its data streams and will inform the next phase of development.

Moreover, I received notification that additional parts from our order have been delivered. This will allow us to begin assembling the physical skateboard and integrating components next week. With more hardware in hand, the team can shift attention toward building the board and testing individual subsystems.

NEXT WEEK’S DELIVERABLES:

My deliverables for next week are as follows:

  • Test data collection from the sensors (GPS and LiDAR)
  • Define a library of motor control functions
  • Work on sending object presence to the Raspberry Pi
  • Work on the backup algorithm for obstacle avoidance