Jack Wang’s Status Report for 4/6/24

Personal Accomplishments:

  1. Interim Demo Setup (2 hrs): I spent some time this week setting up the radar portion of the interim demo. In addition to plotting the data points, I printed out the possible warning types (collision/blind spot) based on the location of the target. I then presented the setup on Monday.
  2. Radar Tuning and UI Integration (10 hrs): I spent the majority of my time this past week processing the radar signals to make them useful for our needs and integrating the radar with the UI that Jason has been developing. I discussed with Jason what data should be sent to the UI and how to package it, and we agreed that I would do the majority of the data processing. Specifically, for the rear radar, I will divide the region into three sectors as discussed before. For the blind-spot sectors, I will send a positive flag whenever the radar detects an object within the threshold, since the UI only cares whether an object is approaching the “blind spot” and does not need detailed information. For the rear-collision sector, I will send over the object with the closest absolute distance to the bike, and the UI will use its distance and velocity to alert the rider of a possible collision. I discovered that we could use a named pipe to send the information between my radar-processing script in Python and Jason’s UI implementation in JavaScript. I packed the data into JSON format and did some basic testing with Jason to verify the communication (see the sketch below). As of Friday, we were able to transfer data from the radar detection script to the UI.
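Here is a minimal Python sketch of the kind of processing described above. The sector boundaries, distance thresholds, and JSON field names are illustrative placeholders, not the exact values or schema used in the actual script.

```python
import json
import math

REAR_X_LIMIT_M = 3.0      # placeholder: |x| within this counts as "directly behind"
BLIND_SPOT_RANGE_M = 5.0  # placeholder blind-spot distance threshold


def pack_rear_radar_message(targets):
    """targets: list of dicts with 'x', 'y' (metres) and 'velocity' (m/s)."""
    blind_spot_left = blind_spot_right = False
    closest_rear = None  # (distance, target) of the nearest object behind us

    for t in targets:
        distance = math.hypot(t["x"], t["y"])
        if abs(t["x"]) <= REAR_X_LIMIT_M:
            # Rear-collision sector: keep only the closest object.
            if closest_rear is None or distance < closest_rear[0]:
                closest_rear = (distance, t)
        elif distance <= BLIND_SPOT_RANGE_M:
            # Blind-spot sectors: the UI only needs a yes/no flag.
            if t["x"] < 0:
                blind_spot_left = True
            else:
                blind_spot_right = True

    message = {
        "blind_spot_left": blind_spot_left,
        "blind_spot_right": blind_spot_right,
        "rear_target": None if closest_rear is None else {
            "distance": closest_rear[0],
            "velocity": closest_rear[1]["velocity"],
        },
    }
    return json.dumps(message)
```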

Progress:

I finished the radar tuning and some basic UI integration, so I am on track for now. I will be taking over Jason’s task of tuning the forward radar, given how similar its functionality is to the rear radar’s; this keeps us on track for system integration. The latest schedule in our group report reflects this change.

Verification:

  1. Basic Radar Detection: I ran some tests to benchmark the basic functionality of the radar. This was done by setting up the radar in a controlled environment and having people run or walk toward it from different angles to simulate incoming traffic. Using the generated plot, I verified that the radar could accurately detect the approaching people with reasonable distance output. The results are analyzed by recording the x-y distances the radar reports and comparing them against the actual locations of the people, measured with meter sticks (a small sketch of this comparison appears after this list). This data is used to benchmark the metrics in the Radar Accuracy Baseline section of the design report. This verification task indicated the basic functional success of my radar implementation.
  2. Integration Verification: I verified that the radar’s output data was correctly communicated to the UI. This was done by printing out the raw data in the radar-processing script alongside the information the UI received. This confirms that the system updates data correctly, which indicates that the communication pipeline is functional.
  3. Radar Detection in a Real-World Environment: This is a verification task I will do soon, now that I have finished tuning the rear radar. The goal is to make sure that the radar provides the desired detection results in a real traffic environment. I will mount the radar on the bike and drive a car toward the bike in a parking lot, then analyze the results to see if the radar output makes sense. This includes comparing the distance and velocity output of the radar with ground-truth data measured with a tape measure and the car’s speedometer.
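For reference, here is a small Python sketch of the comparison referred to in item 1. The sample values are made up; the real benchmark uses the positions measured with meter sticks described above.

```python
import math

# Hypothetical (reported, measured) x-y position pairs in metres.
samples = [
    ((0.1, 4.8), (0.0, 5.0)),
    ((-1.2, 9.7), (-1.0, 10.0)),
]

errors = [math.dist(reported, measured) for reported, measured in samples]
print(f"mean position error: {sum(errors) / len(errors):.2f} m")
print(f"worst-case error:    {max(errors):.2f} m")
```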

Next Week:

  1. Integration
  2. FCW radar tuning

Jason Lu’s Status Report for 4/6

This Week’s Work

  • Implemented physical control of turn signal lights
  • Implemented auto-cancellation using magnetometer
  • Started integration with Jack’s radar code

For details, please see the “State of the UI” section below.

Schedule

Out of the two deliverables from last week, I partially completed one (in orange) and made no progress on the other (in red).

  1. Complete implementation of turn signal control
  2. Complete radar implementation

The only real work remaining for the turn signals on my end is to mitigate the issue where we need to physically turn the compass 90 degrees for it to report a change of 60 degrees – I have a potential workaround (see below).

However, I am behind on the radar implementation task. I will check with Jack on whether this is something he can take on, since theoretically we can reuse what he built for the RCW for the FCW. I’m also realizing that “radar implementation” is somewhat ill-defined – I think a good definition at this point is to have code that filters the points from the radar, identifies which of them correspond to a vehicle directly in front (for FCW), and sends the distance and relative velocity of that car to the UI (sketched below).
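To make that definition concrete, here is a minimal Python sketch of the idea, assuming the forward radar (like the rear one) reports targets as local x-y points with a relative velocity. The corridor half-width and the function name are made-up placeholders, not tuned values or real identifiers from our code.

```python
# Hypothetical sketch of the "radar implementation" definition above: keep only
# points in a narrow corridor directly ahead, then report the closest one.
CORRIDOR_HALF_WIDTH_M = 1.5  # placeholder, not a tuned value


def pick_fcw_target(points):
    """points: iterable of (x, y, velocity) tuples from the forward radar."""
    ahead = [p for p in points
             if abs(p[0]) <= CORRIDOR_HALF_WIDTH_M and p[1] > 0]
    if not ahead:
        return None  # no vehicle directly in front
    x, y, velocity = min(ahead, key=lambda p: p[1])  # closest by forward distance
    return {"distance": y, "velocity": velocity}
```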

Upcoming Deliverables

The deliverables for this week are similar to last week’s, except changing “radar implementation” to “radar and UI integration”.

  1. Complete implementation of turn signal control
  2. Complete radar and UI integration

State of the UI

Here’s a more detailed rundown of the state of the UI as of this writing.

Turn Signal Lights

I implemented code to drive the actual turn signal lights through GPIO pins, which in turn control transistor gates. There is a minor issue where the on-screen turn signal indicator isn’t synchronized with the physical turn signal light, but for now it should be OK.

Turn Signal Auto-Cancellation

For our turn signal auto-cancellation system, we are using the Adafruit MMC5603 Triple-Axis Magnetometer as a compass. As a quick refresher, the idea of using the compass is that when we activate a turn signal, we record the current bicycle heading. Then, once the bicycle turns beyond a certain angle (currently targeting ±60 degrees from the starting heading), we automatically turn off the turn signals. A small sketch of this logic follows.
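Here is a minimal sketch of that cancellation check, written in Python purely for illustration (the actual UI code is in JavaScript). The threshold constant is the target value mentioned above; the wraparound handling is the main detail worth noting.

```python
CANCEL_THRESHOLD_DEG = 60.0  # target heading change before auto-cancelling


def heading_change(start_heading, current_heading):
    """Smallest signed difference between two compass headings, in degrees.

    The modulo keeps the result in [-180, 180) so that, e.g., turning from
    350 degrees to 10 degrees counts as +20 degrees, not -340.
    """
    return (current_heading - start_heading + 180.0) % 360.0 - 180.0


def should_cancel(start_heading, current_heading):
    return abs(heading_change(start_heading, current_heading)) >= CANCEL_THRESHOLD_DEG
```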

Adafruit provides libraries for Arduino and Python, but since the UI code is written in JavaScript, there wasn’t a pre-written library we could use. Therefore, I ended up writing my own code to interface with the chip.

As mentioned last week, I’m using the i2c-bus library to communicate with the MMC5603 chip over I2C. With help from the MMC5603 datasheet and Adafruit’s C++ library for it, we now have support for reading the raw magnetic field measurements from the chip.

To convert those raw readings into a heading, I implemented the algorithm described in this Digilent article.
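At its core, the conversion is an atan2 of the horizontal field components. Here is a rough Python sketch of that math (the actual implementation is in JavaScript; this version assumes the sensor is level, i.e., no tilt compensation):

```python
import math


def heading_degrees(mag_x, mag_y):
    """Convert horizontal magnetic field components into a 0-360 degree heading.

    Assumes the sensor is held level and ignores magnetic declination, which
    is fine for us since we only care about relative turns, not true north.
    """
    return math.degrees(math.atan2(mag_y, mag_x)) % 360.0
```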

There is an issue where the magnetometer reports only about 60 degrees of angle change when it is physically rotated 90 degrees. We can work around this in code for now by setting a lower auto-cancel threshold (perhaps 40 degrees, so the signal turns off when the bicycle physically rotates 60 degrees), but we may need proper calibration. From what I’ve briefly read online, though, that won’t be a simple process.

I also implemented the ability to override the bike angle to some fake angle in the debug menu. Here’s a video I recorded (a CMU account is required) showing the angle-override feature and the turn signal auto-cancellation.

Integration with radar

The UI side is ready to receive real radar data! The general overview is that Jack’s code will do all the data processing, identify one vehicle each in front, behind, left, and right, and send their distances and velocities in a JSON string to my program. We will be using named pipes, created using mkfifo.

On the UI side, I based my code on this StackOverflow answer. There was a slight hiccup when we initially tested passing data from Python to JavaScript: the Python side seemed to buffer multiple JSON strings before actually sending them over the pipe, which crashed the JavaScript side because it received a single string with multiple objects at the top level, which isn’t valid JSON. We fixed this by inserting a flush call after each string is sent on the Python side; a minimal sketch of that writer side is below.
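For reference, here is a minimal Python sketch of the writer side of the pipe. The pipe path, message contents, and newline framing are illustrative placeholders; the point is the flush after every message.

```python
import json
import time

PIPE_PATH = "/tmp/radar_to_ui"  # hypothetical path, created beforehand with mkfifo

# Opening a FIFO for writing blocks until the reader (the UI) opens it.
with open(PIPE_PATH, "w") as pipe:
    while True:
        message = {"distance": 12.3, "velocity": -4.5}  # placeholder data
        pipe.write(json.dumps(message) + "\n")
        # Without this flush, several JSON strings can be buffered together and
        # arrive on the JavaScript side as one string, which is not valid JSON.
        pipe.flush()
        time.sleep(0.1)
```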

 

 

Team Status Report for 3/30

Risk and Plans

We are making good progress on the software, the UI, and the hardware connections between the devices. However, the biggest risk right now is the waterproof enclosure. This task was more complicated than we anticipated, and we have been facing issues finding the right materials and a place to 3D print/laser cut them. Although the CAD is done, the actual printing has not started, and the timeline is still unpredictable, which could easily drag us into more delays. To mitigate this, we are trying to move people around to help with this task. Originally, Johnny was carrying the entire load on this part of the project, but since Jason and Jack are making good progress, they may be able to help Johnny ramp up the speed if needed.

In addition, the risks that were discussed in last week’s report remain valid.

Changes in Design

No design changes for this week.

Schedule Updates

Here is an image of our updated Gantt chart for this week, although the enclosure status may be inaccurate:

Gantt chart as of 3/30

Here are the schedule updates since last week:

Completed tasks 🎉:

  • Initial UI complete

Delayed tasks ⏰:

  • Radar implementation – We continue to slip on radar implementation. Note that we’ve changed the task from radar tuning to radar implementation, as at this point we’re just aiming to get the radars working (e.g., segmenting points into cars, determining lanes) instead of worrying about tuning. As of today, the RCW/BSM implementation is delayed until 3/31 (Sunday). The FCW implementation will most likely also be delayed, but since we don’t have a plan for it yet, it’s left on the schedule as is, so its timeline is not accurate. One option is to run the FCW implementation in parallel with electrical/software integration.
  • Turn signal development – Turn signal development has started but is still behind, so we’ve extended the dates to the coming Wednesday. This has resulted in downstream delays in all tasks, eating into our slack time. We now have 1 day of slack time left.
  • CAD design/prototyping –

Johnny Tian’s Status Report Mar 30th

Accomplishment

  1. Mandatory lab
  2. Button and magnetometer test software
  3. Button case, radar case, and LED case updates
  4. 3D printer setup

Schedule 

  1. Finished turn switch, radar, and LED casing CAD
  2. Started prints
  3. Need to finish display LED

Plan for Next Week

  1. Finalize all CAD
  2. Print out the LED, radar, switch, battery, and display casings and test fit


Jason Lu’s Status Report for 3/30

This Week’s Work

  • Completed initial implementation of main screen code (with faked sensor data)
  • Implemented button inputs for turn signal control

For details, please see the “State of the UI” section below.

Schedule

Out of the three deliverables from last week, I completed one (in green), started a task (in orange), and did not make any progress on the last task (in red).

  1. Complete initial implementation of main screen code (with faked sensor data)
  2. Complete implementation of turn signal control
  3. Complete radar tuning

For the turn signal controls, the remaining tasks are:

  1. Assign GPIO pins for the turn signal LED control transistors and add code to toggle the GPIO pins based on the turn signal state
  2. Implement the magnetometer. Unfortunately, there aren’t any prebuilt JavaScript libraries for the magnetometer we’re using, so I’m writing the code to operate it over I2C myself. A backup option is to use the Python script that Johnny wrote and add code to have it send the magnetometer readings over an IPC connection to the UI code, where they will be parsed.

I am behind on the turn signal and radar tuning tasks. For the necessary schedule adjustments, please see the team posts. Most likely this week I will continue to prioritize the turn signal implementation work.

Upcoming Deliverables

The deliverables for this week are similar to last week’s, except with the removal of the UI implementation task and changing “radar tuning” to “radar implementation”.

  1. Complete implementation of turn signal control
  2. Complete radar implementation

State of the UI

Since the update last week, the UI has received the following major updates:

  • Implemented turn signal button injection from the debug screen – we can simulate turn signal button presses now
  • Implemented both forward and rear collision warnings
  • Added ability to simulate a radar not detecting any vehicles
  • Implemented physical turn signal button input handling – we can control the turn signals by pressing the real buttons now!

Screenshot of the forward collision warning being triggered – the system takes into account the distance and relative velocity to the vehicle in front.
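The exact warning rule isn’t spelled out here, but one common way to combine distance and relative velocity is a time-to-collision check. The Python sketch below is purely illustrative (the UI itself is JavaScript), and the threshold is a made-up placeholder, not our tuned value.

```python
TTC_THRESHOLD_S = 3.0  # hypothetical warning threshold, in seconds


def forward_collision_warning(distance_m, closing_speed_mps):
    """Warn if the vehicle ahead would be reached within the threshold.

    closing_speed_mps > 0 means the gap is shrinking; if the gap is constant
    or growing, no warning is raised.
    """
    if closing_speed_mps <= 0:
        return False
    return distance_m / closing_speed_mps <= TTC_THRESHOLD_S
```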

There was quite an adventure trying to get GPIO to work with Node.js. After trying a few different libraries, I settled on using the onoff library.

There are a couple of nuances with using this library:

  • This library doesn’t support setting the pull-up/pull-down resistors for the GPIO pins – the README file in the onoff repository lists a few strategies, such as modifying the device tree or adding commands to the Pi’s config.txt. However, the Raspberry Pi’s GPIO pins numbered <= 8 are apparently set to pull-up by default. Therefore, I chose to use GPIO6 and GPIO7 so we don’t have to worry about the resistor settings. If we end up needing to change to a different GPIO pin, we can explore the options suggested.
  • GPIO numberings have changed on the RPi 5 and on older Raspberry Pis running kernels >= 6.6: instead of passing 7 as the GPIO number, we pass 519. Details are available here if you’re interested.

I’ve also started work on implementing the magnetometer (MMC5603). Since there’s no JavaScript library for this, I’m implementing the code using the datasheet and the i2c-bus library. Currently, I’m still working on the initialization code.

 

Jack Wang’s Status Report for 3/30/24

Personal Accomplishments:

  1. Radar and Integration (9 hrs): I continued working on the radar and on integration this week. While plotting the data points from the radar, I realized that the plot was updating with delays. I suspected this was due to the Pi’s performance and was caused by the plotting alone, not the actual detection speed. I checked this by setting the radar to different baud rates: after raising the baud rate, the lag during plotting still existed, but when testing with just print statements it seemed that we were getting data fast enough, which led me to believe the lag was just due to the Pi’s performance on plotting (a small timing sketch of this kind of check appears after this list). I also kept working on radar tuning, processing the data the way we want it. I discussed with my team again when to trigger the Blind Spot Warning and the Rear Collision Warning. We settled on a plan where we only trigger the RCW when there is an object directly behind the bike (i.e., its x-coordinate is within +/- 3 m of the bike); objects in all other areas trigger the BSW based on their location. This is simpler than the “drawing box” method we were considering last week and should also reduce my workload on tuning the parameters. I do not have a visual demo of this progress yet, but I have been modifying the code to play around with the radar’s output. I also adjusted the wiring of the hardware setup, since we now have more devices connected to the Pi.
  2. Interim Demo Plans (3 hrs): I spent some time thinking about my part of the interim demo coming up next week. I will be demoing the radar, so I planned a plotting setup similar to the one shown last week, but I will add some flags to let the user see which kind of warning is triggered. I also met with Jason and Johnny to discuss how to present our work during the demo. The focus will be on the individual parts of the project for now, meaning that we will start integration after the demo.
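To make the plotting-lag diagnosis in item 1 concrete, here is a small Python sketch of the kind of timing check involved. The read_frame argument is a stand-in for whatever call actually pulls one frame of targets from the radar; it is not a real function name from our code.

```python
import time


def measure_frame_rate(read_frame, num_frames=100):
    """Time how quickly radar frames arrive, independent of any plotting.

    If the gaps measured here stay short while the on-screen plot lags,
    the bottleneck is the plotting on the Pi, not the radar detection.
    """
    timestamps = []
    for _ in range(num_frames):
        read_frame()  # stand-in for the real serial read
        timestamps.append(time.monotonic())
    gaps = [b - a for a, b in zip(timestamps, timestamps[1:])]
    print(f"average gap between frames: {sum(gaps) / len(gaps) * 1000:.1f} ms")
```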

Progress:

The radar tuning took longer than I expected last week, so I am a bit behind on that. However, the basic implementation will be done before Monday; I might just need more time for the tests. The integration should also start this coming week, so I am not terribly behind schedule and can still manage my work given the current slack time. We have also adjusted the plan (please see the group’s status report) so that downstream tasks will not be affected by the delay.

Next Week:

  1. Continuation of radar tuning
  2. Radar & UI Integration

Johnny Tian’s Status Report Mar. 23rd

Accomplishment

  1. Ethics-related work (4h)
  2. Radome testing (0.5h)
  3. Continued design updates (7.5h)

Schedule 

  1. Finished part of all CAD sections; some still need to be fully completed
  2. Unable to start printing the finished parts

Plan for Next Week

  1. Start printing the finished parts
  2. Finish the rest of the parts
  3. Assemble the printed parts and test fit

 

Team Status Report for 3/23

Risk and Plans

The current risks we are encountering are still similar to those discussed in last week’s report. Specifically, one of the risks that might jeopardize our progress is running into issues we did not expect. This week, when we were trying to tune the radar, we realized it is more complicated than we originally thought to detect cars traveling in different lanes behind the bike. Since the bike is constantly moving left and right, it would be very hard to use a single distance cutoff to determine whether a car is directly behind the bike or to its left/right; we need to do more testing to find a proper metric. As with some of the issues we had last week, such unexpected problems can lead to lengthy delays. To mitigate this, we still have built-in slack time in the project timeline, and we will dynamically adjust the schedule and plan as needed, guided by our MVP requirements. We also need to strike a balance between the accuracy of the system and the time it takes to get it working, which will be motivated by our design requirements.

Changes in Design

  • We have settled on using Plexiglass as the radome material

Schedule Updates

Here is an image of our updated Gantt chart for this week, although the enclosure status may be inaccurate:

Gantt chart as of 3/23

Here are the schedule updates since last week:

Completed tasks 🎉:

  • Turn signal schematics signed off
  • Enclosure sealants have arrived

On track tasks ✅:

  • UI software development

Delayed tasks ⏰:

  • Radar tuning – Due to delays in the radar tuning, we’ve punted the dates for all tasks to be from 3/23 – 3/29. This is a significant risk as further delays here could potentially cause downstream delays, and this is a complicated task.
  • Turn signal development – We have not started on turn signal software development yet, so this has also been punted to 3/23 – 3/29.

As of now, this has not resulted in overall delays, but any further delay of more than 2 days in any of those tasks will cause downstream delays, so we need to be careful not to incur any more.

  • CAD design/prototyping – The initial CAD design has not yet been completed, but prototyping will begin on Monday, which is later than intended but still within the assigned time slot.

Jason Lu’s Status Report for 3/23

This Week’s Work

  • Ethics assignment related tasks
  • Continued UI development (see “State of the UI” section below)

Schedule

I was unable to complete any of this week’s tasks:

  1. Complete initial implementation of main screen code (with faked sensor data)
  2. Start integration with Jack’s radar code
  3. Start implementation of turn signal control
  4. Start radar tuning for forward collision warning

I’m close to completing the UI; the remaining tasks on my plate (at least what’s needed for integration) are:

  • Add UI elements for rear and forward collision warnings
  • Finish up code to simulate turn signal button presses (almost done at time of writing)
  • Allow overriding a radar zone’s data to show no car detected – currently we can only control the range and velocity of the car detected in a zone, not whether a car is present at all.

With respect to the UI itself, I am currently on track, although there is a risk of slippage depending on how long these remaining tasks take. According to the schedule, I have until tomorrow to complete the visualizations, so there isn’t much time left.

Because of the delay in starting the other tasks, I’ve shifted the dates for the radar tuning (forward collision tuning) and turn signal implementation to 3/23 – 3/29. This has not resulted in any delays yet, but further delays in these tasks (which are a high risk!) could delay the overall electrical/software integration and cause a cascade of delays downstream. I’m hoping that I can make good progress on these during the lab times this week.

Upcoming Deliverables

The main deliverables for this week are similar to last week’s, except for removing integration with Jack’s radar code, since that task doesn’t start until 3/30, and changing “start” to “complete” for the turn signal implementation and radar tuning, since their deadlines are soon:

  1. Complete initial implementation of main screen code (with faked sensor data)
  2. Complete implementation of turn signal control
  3. Complete radar tuning

State of the UI

Please see this video I recorded for the current state of the UI: Google Drive link (you will need a CMU account to see this).

Jack Wang’s Status Report for 3/23/24

Personal Accomplishments:

  1. Ethics Lecture & Activities (3 hrs): During lecture time on Monday, I attended the ethics session and participated in the in-class activities. We analyzed some of the worst-case scenarios for our projects and discussed them with other teams.
  2. Radar Work (9 hrs): The primary work this week was on radar tuning. After the software bring-up was done last week, my goal this week was to figure out how to tune the radar parameters and make them useful for our needs. I started by plotting the radar output using some code provided by RFBeam. Here is an example output:

There are a lot of parameters that can be tuned for the radar to achieve different detection outcomes. I researched how to tune them based on the datasheet. There is a Python driver that contains some helper functions for tuning these parameters, but I think we do not need to rely on it, as the tuning should be very simple to do directly from the datasheet. Another thing I explored this week was how to process the radar targets. I did some brief tests on Wednesday to figure out how the radar outputs its targets. In raw target mode, it reports each target in local x-y coordinates. Since the radar is mounted toward the back of the bike, the y-axis indicates how far the car is from the bike and the x-axis shows whether the car is to the left or right of the bike (see the small sketch below). This work will be continued next week.
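As a tiny illustration of how those local coordinates can be read for the rear radar, here is a Python sketch. The sign convention for x (negative = left, positive = right) is an assumption made for the example, not something confirmed from the datasheet here.

```python
import math


def describe_rear_target(x, y):
    """Turn a raw (x, y) target, in metres, into a human-readable description.

    y is taken as the distance behind the bike; the x sign convention
    (negative = left, positive = right) is assumed for illustration.
    """
    side = "left" if x < 0 else "right"
    return (f"target {y:.1f} m behind the bike, {abs(x):.1f} m to the {side} "
            f"(straight-line range {math.hypot(x, y):.1f} m)")
```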

Progress:

I am currently somewhat on track with the radar tuning, although more work may need to be done, so it might go over the time we originally budgeted. There were some delays in getting the tuning started, as it was complicated, but I am on track with the adjusted schedule. Please see the team status report for more details.

Next Week:

  1. Continuation of radar tuning
  2. Preparing for the interim demo.