Tio’s Status Report for 11/16/2024

WORK ACCOMPLISHED:

A lot of progress was made this week, and we can confidently say that we now meet the MVP we defined in our abstract: “A fully functional electric skateboard that is controlled by a web application. You will be able to accelerate, decelerate, and reverse all with just a phone.” Sharon was able to integrate the remote control app with the defined functions for controlling the skateboard, and we’ve been able to verify that the controls are smooth, responsive and intuitive.

I began writing the return algorithm and worked with Jason to devise a control flowchart to guide our implementation. While the general approach remains unchanged, I am blocked by the inaccuracy of the ZED-F9R when it doesn’t use RTK. To rectify this, Jason and I met with members of Roboclub, who gave us a parts list and some sample code we’ll need. I’ve placed the orders; in the meantime, I will continue as planned and finish the code for the autonomous return.
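
The control flow we charted can be condensed into a short sketch. Everything below is illustrative: the function names, the turn gain, the duty cycles, and the arrival radius are placeholders, not our final implementation.

```python
import math

ARRIVAL_RADIUS_M = 1.0  # stop within 1 m of the user (placeholder value)

def heading_error(board_xy, board_heading_rad, user_xy):
    """Signed angle (rad) from the board's heading to the user, wrapped to (-pi, pi]."""
    dx = user_xy[0] - board_xy[0]
    dy = user_xy[1] - board_xy[1]
    desired = math.atan2(dy, dx)
    err = desired - board_heading_rad
    return math.atan2(math.sin(err), math.cos(err))  # angle wrap

def return_step(board_xy, board_heading_rad, user_xy, turn_gain=0.5):
    """One control-loop iteration: returns (duty_cycle, steer), or None once arrived."""
    dist = math.dist(board_xy, user_xy)
    if dist <= ARRIVAL_RADIUS_M:
        return None  # arrived: the caller brakes and exits the loop
    err = heading_error(board_xy, board_heading_rad, user_xy)
    steer = max(-1.0, min(1.0, turn_gain * err))     # clamp the steering command
    duty = 0.05 if abs(err) < math.pi / 6 else 0.02  # slow down for sharp turns
    return duty, steer
```

Each GPS/heading update would call `return_step` once and feed the result to the motor layer; `None` triggers the brake routine.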

PROGRESS:

Now that we have reached MVP for our project, I am ahead of schedule on all my non-autonomous tasks, such as writing and testing the motor command class. For the autonomous-operation tasks, I am on track, as the next few weeks are dedicated to testing and revisions.

NEXT WEEK’S DELIVERABLES:

  • Completing the return algorithm: The pseudocode and flowchart are laid out; all that remains is to write the code. This can be accomplished in relatively few work hours.
  • Integrating RTK-supporting hardware: After consulting with Roboclub, we received a list of additional hardware and software to integrate to get highly accurate readings from the GPS we already have via RTK.
  • Validation testing: I aim to carry out some additional unit tests to verify the performance of my contributions to the project, namely:
    • Speed test: Accelerate and decelerate across varying terrain slopes, rider weights, and motor loads. Speed should remain within 14-16 mph.
    • Return to Me Accuracy: Recall the skateboard over varying distances (5 m, 10 m, 50 m, etc.) with and without obstacles. The skateboard should return to the user and be retrievable within a 1-meter radius of the user’s position.
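
The pass/fail thresholds above can be encoded in a small harness. The bounds come straight from the test plan; the function names are just for illustration.

```python
import math

SPEED_RANGE_MPH = (14.0, 16.0)   # cruise-speed requirement from the test plan
RETURN_RADIUS_M = 1.0            # 'Return to Me' accuracy requirement

def speed_test_passes(samples_mph):
    """True if every sampled cruise speed stays within the 14-16 mph band."""
    lo, hi = SPEED_RANGE_MPH
    return all(lo <= s <= hi for s in samples_mph)

def return_test_passes(final_board_xy, user_xy):
    """True if the board stops within 1 m of the user's position (x-y in meters)."""
    return math.dist(final_board_xy, user_xy) <= RETURN_RADIUS_M
```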


Sharon’s Status Report for 11/16/24

WORK ACCOMPLISHED:

Bluetooth Integration and Motor Control Setup:
This week, I focused on connecting the Raspberry Pi to the mobile app via Bluetooth to control the skateboard’s motors. One challenge was that many Bluetooth packages for Node.js were outdated and lacked community support. After extensive troubleshooting, I found a specific fork of the Bleno library that supported my requirements, allowing me to set up a functional BLE server on the Pi.

Initially, I attempted to call my teammates’ Python motor control scripts directly from the JavaScript BLE server. However, this approach was not seamless and introduced unnecessary delays. To optimize the workflow, I restructured the system by implementing sockets for motor control. The BLE server now forwards commands from the mobile app to the socket server, which listens on the same port and directly communicates with the motors. This solution reduced network overhead and significantly improved latency, creating a smoother and more responsive control experience.
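
The Pi-side half of that pipeline can be sketched as a small socket server that receives newline-delimited command strings and dispatches them to the motor layer. The port number and command set below are placeholders, not our actual protocol.

```python
import socket

HOST, PORT = "127.0.0.1", 5050            # arbitrary; must match the BLE server's config
COMMANDS = {"accelerate", "decelerate", "stop"}  # example command set

def handle_command(cmd):
    """Dispatch one command string to the motor layer (stubbed out here)."""
    if cmd not in COMMANDS:
        return "err unknown"
    # a real version would call into the motor-control class, e.g. motors.accelerate()
    return "ok " + cmd

def serve_once():
    """Accept one connection and process newline-delimited commands until it closes."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind((HOST, PORT))
        srv.listen(1)
        conn, _ = srv.accept()
        with conn:
            for line in conn.makefile("r"):
                conn.sendall((handle_command(line.strip()) + "\n").encode())
```

Keeping the socket local to the Pi is what avoids the process-spawn overhead of calling the Python scripts from Node directly.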

Frontend Refinements:
I refined the app’s interface to enhance usability and safety. The emergency stop button now requires users to press it twice to prevent accidental activation, providing a crucial safety measure during operation. Additionally, I removed the reverse function from the control UI to simplify interactions, based on user feedback and testing insights.
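
The double-press confirmation can be modeled with a small timing window. The window length below is an assumption for illustration, not the value tuned in the app.

```python
import time

DOUBLE_PRESS_WINDOW_S = 1.5  # assumed confirmation window, not a measured value

class EmergencyStopButton:
    """Fires only when pressed twice within the confirmation window."""
    def __init__(self, window_s=DOUBLE_PRESS_WINDOW_S, clock=time.monotonic):
        self.window_s = window_s
        self.clock = clock          # injectable clock makes this testable
        self._last_press = None

    def press(self):
        """Returns True when this press should trigger the actual e-stop."""
        now = self.clock()
        armed = self._last_press is not None and now - self._last_press <= self.window_s
        self._last_press = None if armed else now   # reset after firing, else arm
        return armed
```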

GPS Accuracy Testing and Improvements:
Continuing work on the ‘Return to Me’ feature, I tested the phone’s GPS capabilities alongside the purchased GPS module. Outdoor testing showed an accuracy of up to 4 meters under favorable conditions. To improve results, I applied aggressive filtering techniques and implemented a Kalman filter to smooth out inconsistent data points. While this approach showed promise, testing was limited due to inclement weather, leaving room for further optimization. These efforts move us closer to reliable location tracking for the return feature.
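
A minimal version of that Kalman filter, run independently per axis with the position modeled as static, looks like this. The noise parameters are rough guesses (the measurement variance is sized to the ~4 m accuracy observed), not our tuned values.

```python
class Kalman1D:
    """Constant-position Kalman filter for one GPS axis, in meters."""
    def __init__(self, process_var=0.5, meas_var=16.0):
        # meas_var ~ (4 m)^2 matches the accuracy we saw outdoors; both are tunable
        self.q = process_var
        self.r = meas_var
        self.x = None   # state estimate
        self.p = 1e6    # estimate variance (large = uninformative prior)

    def update(self, z):
        if self.x is None:
            self.x = z            # first fix initializes the state
            self.p = self.r
            return self.x
        self.p += self.q          # predict: position assumed static, uncertainty grows
        k = self.p / (self.p + self.r)   # Kalman gain
        self.x += k * (z - self.x)       # correct toward the measurement
        self.p *= (1 - k)
        return self.x
```

Feeding raw fixes through `update` damps the jumps between readings while still tracking slow drift.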

Dependency Updates and Maintenance:
Maintaining the app involved updating several dependencies to ensure compatibility and stability. This task was critical to resolving issues introduced by outdated packages and improving the overall development environment.

PROGRESS:

The integration of the BLE server with socket-based motor controls marks a significant improvement in responsiveness and system efficiency. The updated frontend design prioritizes user safety while maintaining functionality, and the progress in GPS accuracy ensures we are on track for reliable navigation. These accomplishments strengthen the foundation for real-time control and navigation, aligning with our project timeline and goals.

NEXT WEEK’S DELIVERABLES:

  • Finalize motor control testing to ensure seamless and reliable operation under various conditions and define acceleration/deceleration curve with teammates.
  • Continue testing the Kalman filter and experiment with additional methods to enhance GPS precision, prioritizing outdoor trials when weather permits.
  • Begin setting up WebSocket connections to allow continuous data streaming from the Pi to the app, enabling real-time monitoring and control.
  • Integrate teammates’ GPS functionality and the ‘Return to Me’ feature into the app.

Jason’s Status Report for 11/9/2024

WORK ACCOMPLISHED:

This week, we experimented significantly with turning and began to connect more extraneous pieces together. In the turning department, we initially struggled with our old deck and an acceleration step of 0.01 duty cycle. I read online that, to turn properly, we should consider lowering our duty cycle to give the wheels a better chance to grip the ground, which ended up helping significantly. We also shortened the wheelbase by swapping in a deck I had at home, which improved our steering by a landslide. We are now able to make stable right and left turns. We have also decided the board will return to the user back-end first (the back wheels when riding will lead when returning), as this makes steering easier. We have also made strides with GPS routing.

As we swapped LiDARs this week, I have spent the last two days trying in vain to connect the LiDAR to the Pi. I have followed a few tutorials and will ask for help if I cannot accomplish this in the next day or so. The documentation is shaky at best, as you have to make a very custom package build on the Pi to get librealsense (Intel RealSense on a Pi) running properly.

PROGRESS:

We are still a little behind schedule. The blocked connection between the Pi and the LiDAR has presented another unforeseen issue in our progress forward. While we have delivered on everything promised last week, I would like us to be able to detect objects in the next few days so we can figure out our path planning and start testing that aspect.

NEXT WEEK’S DELIVERABLES

We will have some path planning/ return features implemented. We will also be directly controlling/testing from the phone at that point.

Tio’s Status Report for 11/09/2024

WORK ACCOMPLISHED

This week marked significant progress in multiple areas of the project. I successfully achieved individual motor control by leveraging Python’s threading module, assigning each motor to its own thread for independent operation. To test this setup, I developed a simple interface that allows motor control using keyboard inputs to simulate signals from the skateboard’s remote control app. Additionally, I implemented an emergency stop routine to ensure the motors and VESC can be safely shut down in case of unexpected issues. I also improved the motor control logic by rewriting the command structure to support reverse motion, enhancing the skateboard’s maneuverability.
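
The per-motor threading design boils down to one worker thread per motor, each draining its own command queue. This is a simplified sketch: `apply_command` stands in for whatever actually writes a duty cycle to the VESC, and the `None` sentinel plays the role of the emergency-stop routine.

```python
import queue
import threading

class MotorWorker:
    """Runs one motor's commands on its own thread, as in the per-motor design."""
    def __init__(self, name, apply_command):
        self.name = name
        self.apply = apply_command        # e.g. writes a duty cycle to the VESC
        self.q = queue.Queue()
        self.thread = threading.Thread(target=self._run, daemon=True)
        self.thread.start()

    def _run(self):
        while True:
            cmd = self.q.get()
            if cmd is None:               # sentinel: emergency stop / shutdown
                self.apply(0.0)           # bring the motor to zero before exiting
                break
            self.apply(cmd)

    def send(self, duty):
        self.q.put(duty)

    def stop(self):
        self.q.put(None)
        self.thread.join(timeout=2)
```

Two `MotorWorker` instances (one per rear motor) let commands execute independently, which is also what enables differential steering later.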

With individual motor control established, we conducted our first turning tests in the space between Scaife and TechSpark. Initial results were underwhelming, but after switching to a shorter deck and tightening the trucks, we achieved consistent performance. These adjustments provided the stability needed to proceed confidently with further testing.

Beyond motor control, I integrated functionality for reading GPS data from the ZED-F9R breakout module. I also incorporated a library to convert latitude and longitude coordinates into UTM (Universal Transverse Mercator) format, enabling path-planning calculations in meters on an x-y grid. This advancement will be instrumental in refining our return algorithms.
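
Our code uses a proper UTM library for this conversion; purely to illustrate why meters on an x-y grid are convenient, here is a local flat-earth (equirectangular) approximation and the range/bearing computation the return algorithm needs. This is not the UTM projection itself and is only valid over short distances.

```python
import math

EARTH_R = 6_371_000.0  # mean Earth radius in meters

def to_local_xy(lat, lon, ref_lat, ref_lon):
    """Project lat/lon to meters on a flat x-y grid around a reference point.
    Small-area approximation; the real code uses a UTM conversion instead."""
    x = math.radians(lon - ref_lon) * EARTH_R * math.cos(math.radians(ref_lat))
    y = math.radians(lat - ref_lat) * EARTH_R
    return x, y

def range_and_bearing(board, user):
    """Distance (m) and bearing (rad, CCW from +x/east) from board to user."""
    dx, dy = user[0] - board[0], user[1] - board[1]
    return math.hypot(dx, dy), math.atan2(dy, dx)
```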

PROGRESS

While I am on track with my software-related tasks, the overall progress on path-planning and return functionality has fallen slightly behind schedule. This delay is due to several factors, including challenges with the LiDAR integration, underestimating certain technical hurdles, and delays in part deliveries. Despite these setbacks, our motor control system and GPS integration provide a strong foundation for the next phase.

NEXT WEEK’S DELIVERABLES

Next week, our primary objectives are twofold:

  1. Integrating the remote control app: This will involve syncing the app’s signals with the motor control system. To facilitate this, I’ve prepared a brief README outlining the unique aspects of our motor control setup.
  2. Initiating return algorithm testing: Using the GPS and motor control framework, we will begin testing the skateboard’s ability to autonomously return to the user.

These tasks will bring us closer to achieving the project’s core functionality and ensure we remain aligned with our overall timeline.

Team Status Report for 11/9/2024

Significant Risks + Management
One significant risk this week was a spark that occurred while testing in TechSpark. Exposed metal on the positive and negative sides of our battery connection to the VESC accidentally touched, creating a spark and slightly burning the surrounding wire. No one was injured, nor were any parts damaged. To solve this problem, we reheated the heat-shrink tubing previously set in place to ensure there is no longer any exposed metal.

Another area of risk is the connection from the Pi to the LiDAR. For some reason, building (running make -j1 or make -j4) takes about 45 minutes to complete, when the same command in the tutorials I am watching takes under a minute. I attribute this primarily to all the other packages we had to install to get Bluetooth, pyvesc, etc. to work, but it is making integration and trial and error very difficult.


A third area of risk is GPS accuracy, as we were not able to get sufficient accuracy readings at our last meeting due to poor weather conditions. We did not want to risk damaging parts in the rain.

Design Changes

We have changed the LiDAR from the L515 to the D455, as it is meant to be used outdoors and is thus better suited to our requirements. We have also swapped our deck for one about 4 inches shorter to allow for better turning when autonomous.

Schedule Changes

The core task schedule remains the same. We will likely have to dip into the extra testing time we reserved at the end to get through small items such as the LiDAR connection and any Bluetooth mishaps.

Progress
This week, we achieved our first successful turn after previous failures and inconsistencies. By shortening the wheelbase and decreasing the acceleration step, we are able to perform consistent and relatively tight turns. The only asterisk here is that we are noticing significant wear on the edges of the powered tires, likely from scrubbing during turns. These turns also leave visible tire marks on the ground.

We also swapped our LiDAR and ran tests on the laptop. We were able to get a much clearer image and much more accurate LiDAR readings in key environments where the other LiDAR was failing (outside). This allows us to be much more accurate to our original design requirements as we begin integration/ implementation.

We also completed our first test with a rider on board. Both Jason and Tio took a turn riding with some acceleration applied to the back wheels. The test went well: the board handles well, and the acceleration is smooth and controllable. The board turns very well, and we have not experienced any wheel bite or speed wobbles. The next step will be to test on inclines and at higher speeds.

We also added heat-shrink tubing to a variety of loose connections on the board to ensure no further accidents occur while testing or while anyone is riding.

We also made substantial progress on enhancing the GPS accuracy for our ‘Return to Me’ feature. We conducted extensive testing using the React Native geolocation library, comparing the app’s GPS readings to those from the hardware GPS on the skateboard, which we set as our ground truth. Indoor tests revealed some limitations in accuracy, whereas outdoor tests provided a precision of around 5 meters, aligning with our initial requirements for outdoor navigation. We are now considering snapshotting the user’s location to improve the consistency of the ‘Return to Me’ feature. Further testing and adjustments are planned to maximize GPS reliability and meet our accuracy goals under various environmental conditions.

Additionally, we set up a Bluetooth backend server on the Raspberry Pi to enable real-time communication between the mobile app and the board. This setup encountered compatibility challenges due to outdated Bluetooth libraries, which required troubleshooting across multiple Node.js versions to establish a stable configuration. After resolving these issues, we successfully connected the app to the server, allowing button presses on the mobile app to trigger responses from the Pi. This initial setup enables basic control and will facilitate a seamless transition to app-based control of the Raspberry Pi for teammates as they complete their individual testing, ultimately streamlining interaction with the skateboard’s hardware.

Sharon’s Status Report for 11/9/24

WORK ACCOMPLISHED:

GPS Location Testing for ‘Return to Me’ Feature:
This week, I focused on refining the GPS location tracking for the ‘Return to Me’ feature, specifically testing the React Native geolocation library to assess its accuracy when integrated into the app. To establish a benchmark, I set the skateboard’s GPS as the ground truth. Through testing, I found that indoor accuracy was less reliable, especially when compared to the hardware data. Using the Apple longitude and latitude APIs revealed limitations in precision, even though they provided more digits in the coordinates. Outdoor testing yielded better results, with an accuracy of approximately 5 meters, and we are considering snapshotting the user’s location for the return feature. I will continue exploring methods to enhance GPS precision and conduct additional tests to ensure reliable navigation.
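
The snapshotting idea amounts to averaging several recent fixes instead of trusting a single reading. The minimum fix count and the inverse-accuracy weighting below are assumptions for illustration, not final tuning.

```python
def snapshot_location(fixes, min_fixes=5):
    """Average recent (lat, lon, accuracy_m) fixes, weighting better accuracy higher.
    min_fixes and the 1/accuracy weighting are assumptions, not tuned values."""
    if len(fixes) < min_fixes:
        return None                     # not enough data for a stable snapshot
    weights = [1.0 / max(acc, 1.0) for _, _, acc in fixes]
    total = sum(weights)
    lat = sum(w * f[0] for w, f in zip(weights, fixes)) / total
    lon = sum(w * f[1] for w, f in zip(weights, fixes)) / total
    return lat, lon
```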

Bluetooth Backend Server Setup on Raspberry Pi:
In parallel, I worked on establishing a Bluetooth backend server on the Raspberry Pi to enable control from the mobile app. This process encountered several challenges, as many Bluetooth libraries were outdated, and only specific older Node.js versions supported Bluetooth socket functionality effectively. After troubleshooting compatibility issues, I successfully set up the server on the Pi. I also initiated connection testing between the app and Pi, allowing initial button presses on the app to trigger responses from the Pi. This setup aims to facilitate seamless control of the Raspberry Pi via the mobile app, enabling my teammates to shift from terminal-based commands to app-based control as they complete testing on individual components.

PROGRESS:

With enhanced GPS testing and a functional Bluetooth server setup, the app is progressively prepared for full control and navigation capabilities. Testing the React Native geolocation library provided valuable insights into the strengths and limitations of GPS tracking across different environments. The Bluetooth backend setup on the Raspberry Pi also establishes a foundational control link between the mobile app and hardware, simplifying interactions for further integration with the skateboard. These developments create a stable base for expanded real-time functionality and keep us aligned with the project timeline.

NEXT WEEK’S DELIVERABLES:

  • Continue refining the GPS accuracy for the ‘Return to Me’ feature by implementing additional testing strategies to optimize location tracking, including snapshotting techniques.
  • Collaborate with teammates to finalize control mechanisms over Bluetooth, refining button responses and commands on the app to ensure seamless interaction with the skateboard’s components.
  • Begin exploring WebSocket integration for continuous data streaming from the Pi, allowing real-time feedback and control.
  • Test the Bluetooth setup further by connecting more app functionality with Pi feedback, working closely with teammates as they conclude individual testing, aiming for complete app-based control by the end of the week.

Team Status Report for 11/2/2024

Significant Risks + Management
One significant risk this week involved challenges with the Raspberry Pi to VESC and GPS connections. Initial attempts to use GPIO for motor control proved unreliable, so we opted to switch to USB communication, which is now working effectively. This change helped address stability issues, though it required adjusting our setup and code. Additionally, there is a risk associated with synchronizing motor commands to achieve simultaneous control, which we plan to mitigate using Python’s multithreading capabilities.

The Intel RealSense LiDAR Camera L515 also poses a challenge. The product’s discontinuation means we’re using outdated SDK versions, which introduced complications with the Raspberry Pi’s Linux environment. Building the SDK from source has proven necessary, which could further delay integration. The team is prioritizing the completion of the LiDAR setup on the Raspberry Pi to avoid future bottlenecks.

Design Changes

No major design changes were implemented this week. However, based on the LiDAR’s outdoor performance limitations, we are considering relaxing our object detection requirements. Instead of general obstacle avoidance, we may focus on recognizing a specific object, such as a traffic cone, to simplify our proof of concept. This shift would allow us to use the LiDAR’s RGB camera with OpenCV for object detection and refine our recognition parameters later if required.

Schedule Changes

The project schedule has been updated to account for the switch to USB communication for both the VESC and GPS modules. This modification may shift certain tasks, like debugging multithreaded motor control, into the upcoming week. Additionally, the timeline reflects ongoing efforts to migrate LiDAR code to the Raspberry Pi. Despite these adjustments, core tasks such as motor control development and return algorithm testing remain on track with project milestones.

Progress
This week, the team achieved individual control over the motors through USB connections, addressing GPIO-related stability issues. The plan now is to synchronize motor commands using multithreading to enable simultaneous operation. The team also began building a Python class for motor control, which will centralize all control methods.

For the LiDAR, we encountered difficulties due to the need to build the SDK from source on the Raspberry Pi’s Linux environment. However, initial tests with the LiDAR’s RGB camera showed reliable object detection indoors, leading us to consider narrowing our vision requirements.

We soldered connections for the battery, VESC, and power switch, completing crucial steps toward the skateboard’s assembly. While GPS data collection has been postponed until the USB setup is finalized, we are prepared to implement the return algorithms as soon as the board assembly is complete.

We completed the backend server setup on the mobile app, enabling it to communicate effectively with the Raspberry Pi. This involved creating key API endpoints for controlling and monitoring the skateboard, as well as connecting these endpoints to the app’s frontend. This setup establishes a solid foundation for real-time interaction and prepares the app for seamless control over skateboard components.

We also added new functionality allowing users to control skateboard actions via the phone’s volume buttons and ringer controls. By setting up listeners for these hardware buttons, we enabled alternative control options within the app, making adjustments more intuitive and accessible without relying solely on touchscreen inputs. This feature provides added convenience, especially for quick or hands-free adjustments.

Lastly, to enhance the app’s location-based capabilities, we implemented GPS functionality using a React Native library, configuring it for accurate tracking and smooth operation. We are testing extensively, adjusting various settings within the library to improve location precision. Additionally, we set up frontend permissions to enable GPS access, providing users with a reliable, permission-secured experience during location tracking and navigation.

Tio’s Status Report for 11/2/2024

WORK ACCOMPLISHED:

This week, I worked with Jason on the VESC, motors, and Raspberry Pi. We spent quite a bit of time debugging the connections between the Pi and VESC; eventually we abandoned the GPIO pins in favor of USB connections. We will write the motor scripts using a combination of the PyVESC and serial Python modules. In similar fashion, we plan to forego the Qwiic connectors and use a USB cable, UBlox_GPS, and serial to manage the GPS.

We’ve achieved individual control over the motors, but we’re still trying to execute instructions simultaneously. I plan to use Python’s multithreading features to accomplish this.

The LiDAR is not yet fully operational on the Pi. Due to its Linux environment, we have to build the SDK from its source files rather than just installing it.

PROGRESS:

I am on track with my tasks for the week, which focused on defining a Python class for the motors containing all the methods we will use to control them. We initially planned to have the return feature completed by this week, but that was an ambitious deadline set to leave extra cushion time towards the end. We can slightly relax our expectations, especially because the motor commands are taking much less time than anticipated.

In experimenting with the LiDAR, I’ve found that its depth camera performs very poorly outdoors. Its RGB camera is fine, however, and is already capable of identifying faces and outlining them in a red bounding box. This gave me an idea: I proposed relaxing our vision requirements to the recognition of a single object (e.g., a traffic cone) as a proof of concept. This is fairly easy to accomplish using OpenCV and a public dataset for recognition.
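
The actual plan is OpenCV with a public dataset, but the core idea, color-based detection of a single distinctive object, can be sketched in pure Python with the standard library’s `colorsys`. The HSV thresholds for “traffic-cone orange” below are guesses for illustration.

```python
import colorsys

def is_cone_orange(r, g, b):
    """Rough 'traffic-cone orange' test in HSV space; thresholds are guesses."""
    h, s, v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
    return 0.0 <= h <= 0.10 and s > 0.5 and v > 0.4   # hue near red-orange

def bounding_box(image):
    """image: rows of (r, g, b) pixels. Returns (x0, y0, x1, y1) of orange pixels,
    or None if no pixel matches. Mirrors the red box drawn around detected faces."""
    hits = [(x, y) for y, row in enumerate(image)
                   for x, px in enumerate(row) if is_cone_orange(*px)]
    if not hits:
        return None
    xs, ys = zip(*hits)
    return min(xs), min(ys), max(xs), max(ys)
```

With OpenCV, the same thresholding would be a single `cv2.inRange` call over the RGB frame, followed by a contour search.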

NEXT WEEK’S DELIVERABLES

  • Rewrite GPS data collection on the Raspberry Pi to use USB serial ports.
  • Complete the migration of LiDAR code to the Raspberry Pi environment to ensure seamless operation
  • Begin testing the return algorithms once the board is fully assembled, using LiDAR data to initiate brakes.

Jason’s Status Report for 11/2/24

WORK ACCOMPLISHED:

This week, we focused on getting motor movement from a Python script. While we struggled with this initially, we got a response once we switched from connecting the RPi’s GPIO pins to the VESC’s COMM ports to using the USB ports on both. Prior to this, we had attempted sending signals via pyvesc, sending independent UART signals, and sending PWM signals. We are now able to control each motor independently, and I have tinkered with acceleration and deceleration curves, tuning them to a point where we can begin testing; I will refine them further once the motors are mounted. Currently, I have a test bench in which I control the motors with keyboard input. The only thing left to solve is simultaneous control, which I believe we can achieve with multithreaded listening for commands for each motor.
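
The acceleration/deceleration tuning reduces to stepping the duty cycle toward a target by a fixed increment per control tick. The 0.01 step below matches the step size we have been experimenting with, but the helper names are illustrative only.

```python
def ramp(current, target, step=0.01):
    """Move the duty cycle one step toward the target (step per tick is tunable)."""
    if abs(target - current) <= step:
        return target                    # snap to target when within one step
    return current + step if target > current else current - step

def ramp_profile(start, target, step=0.01):
    """Full sequence of duty cycles from start to target, one entry per tick."""
    out, cur = [start], start
    while cur != target:
        cur = ramp(cur, target, step)
        out.append(cur)
    return out
```

Shrinking `step` gives a gentler curve (and, per this week’s turning tests, better wheel grip) at the cost of slower response.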

PROGRESS:

We are a little behind schedule. The documentation for pyvesc is very slim, and it appears projects like this are quite rare; as a result, it is taking longer than expected to navigate the package and figure out some of its intricacies. Getting a response from the motors is a huge step forward and unlocks so much more for us. We will likely be able to make very large strides in the next few weeks.

NEXT WEEK’S DELIVERABLES

We will have app-to-motor response, rudimentary turning, and a rough assembly of the board complete.

Sharon’s Status Report for 11/2/24

WORK ACCOMPLISHED:

Backend Setup and API Integration: This week, my primary focus was establishing a complete backend server setup that lets the mobile app connect and communicate with the Raspberry Pi seamlessly. I developed API endpoints for essential commands and data exchanges and connected these endpoints to the frontend to provide a responsive, cohesive interface; in essence, I defined more of the web app’s flow and how commands to the skateboard will work. Now, when you hit a button, the app sends an HTTP request via Axios to the corresponding API endpoint served by the Raspberry Pi. I also set up a dedicated server on the Pi so it can handle requests from the app efficiently. This backend foundation is a key milestone, as it allows for real-time control functionality and prepares the app for more complex interactions with the skateboard’s hardware. Once my teammates have finished their individual testing of the parts, we can connect the two systems so that an action on the frontend will pass through the backend and take effect in real life. We also determined that the worst-case Bluetooth latency will be around 100 ms, which is acceptable.
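
On the Pi side, endpoint handling reduces to a dispatch table mapping API paths to handler functions. The route names, speed cap, and response shapes below are hypothetical stand-ins for our real server code.

```python
def make_router():
    """Build a path -> handler dispatcher with shared skateboard state."""
    state = {"speed": 0.0, "stopped": False}

    def accelerate():
        state["speed"] = min(state["speed"] + 1.0, 16.0)   # cap near our 16 mph limit
        return {"ok": True, "speed": state["speed"]}

    def stop():
        state["speed"], state["stopped"] = 0.0, True
        return {"ok": True, "speed": 0.0}

    routes = {"/api/accelerate": accelerate, "/api/stop": stop}

    def handle(path):
        handler = routes.get(path)
        return handler() if handler else {"ok": False, "error": "unknown endpoint"}

    return handle
```

In the real server each handler would forward to the motor-control layer instead of mutating a local dict; the dispatch pattern stays the same.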

GPS Integration and Testing: I am currently testing GPS functionality using a React Native library, focusing on enhancing location accuracy for a smoother navigation experience. Configuring and testing various settings within the library allowed me to assess its performance under different conditions, aiming for precise location tracking. I set up permissions on the frontend to streamline user access to GPS features, creating a user-friendly experience. Testing different configurations gave insights into optimal settings, which will improve the app’s navigation responsiveness. I am also looking into algorithms and signal-smoothing techniques to improve accuracy, and into whether I can get a longer, more consistent stream of data in order to obtain a more accurate location on the phone.

Volume and Ringer Control Functionality: To enhance control options within the app, I also added functionality for the phone’s volume buttons and ringer, allowing users to control specific skateboard actions using these hardware features. I set up listener functions within the app to detect volume button presses, mapping these to skateboard controls, and adjusted the ringer settings to allow control sound feedback as needed. This additional control layer provides users with an intuitive way to manage interactions without relying solely on touchscreen inputs, especially useful in active scenarios where quick adjustments may be necessary.


PROGRESS:

With the backend server, GPS functionality, and volume control setup complete, the app is well-prepared for expanded interactions with the Raspberry Pi. The API endpoints and server setup on the Pi allow for smooth data and control flow, and the GPS implementation provides accurate location tracking. Testing the React Native GPS library was insightful, allowing me to fine-tune settings for maximum precision, while the volume and ringer control adds a valuable layer of user interaction. These accomplishments set up a stable platform for further real-time control development, keeping us on track with the project timeline and hopefully allowing us more time to test once we finish assembling the skateboard. I have set up this foundation so that integrating with the actual hardware will be a smooth transition when my teammates’ work is complete.


NEXT WEEK’S DELIVERABLES:

Next week, I aim to complete the following:

  • Finalize the testing of API endpoints for button controls, working with teammates to fine-tune parameters like acceleration and deceleration when they complete their individual testing.
  • Once my teammates can control their specific parts, set up WebSockets and the backend on the Raspberry Pi to stream data to the app. I am waiting to see if they can get data from the Hall sensors.
  • Decide how remote control will map onto the volume buttons: will it use discrete levels, or work more like a traditional remote controller where you can long-press the buttons?
  • Continue working on the phone’s GPS coordinates, then integrate with the LiDAR so we can start path planning.