Weekly Update 11/19 – 11/25

What we did as a group:

  • We all had a reduced workload because of Thanksgiving, and work was mostly done remotely. We continued working on the parking algorithm and the audio recordings, and did feasibility research for a new feature we are considering.

Hubert:

  • Feasibility research on an animation for instructing drivers to move closer to or further from the right
    • This would be used during the stage where we guide the driver into the position that is parallel with the reference car in front of the parking spot.

Zilei:

  • Progress on the algorithm
    • Still using a text overlay for now, pending integration with the audio in the upcoming week
    • Redesign of the different sub-states within a state (sketched below)
      • The current code is messy, but it covers the different scenarios fairly comprehensively
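
A rough Java sketch of how the state / sub-state split could be organized. The class, state, and instruction strings below are placeholders for illustration, not the actual code, which is still being cleaned up.

    public class ParkingAlgorithm {

        // Top-level states of the parking algorithm.
        enum State { IDLE, SEARCH, REVERSE_RIGHT, REVERSE_LEFT }

        // Sub-states capture the different scenarios within the Search state.
        enum SearchSubState { TOO_CLOSE, IN_RANGE, TOO_FAR, REFERENCE_LOST }

        private State state = State.IDLE;
        private SearchSubState searchSubState = SearchSubState.IN_RANGE;

        // Text currently shown in the overlay; audio integration comes later.
        public String currentInstruction() {
            switch (state) {
                case IDLE:
                    return "Press Start to begin parking.";
                case SEARCH:
                    switch (searchSubState) {
                        case TOO_CLOSE:      return "Move a little further from the right.";
                        case TOO_FAR:        return "Move a little closer to the right.";
                        case REFERENCE_LOST: return "Reference car lost; drive forward slowly.";
                        default:             return "Keep going until you are parallel with the reference car.";
                    }
                case REVERSE_RIGHT:
                    return "Turn the wheel all the way to the right and reverse.";
                case REVERSE_LEFT:
                    return "Turn the wheel all the way to the left and reverse.";
                default:
                    return "";
            }
        }
    }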

Yuqing:

  • Recording of the audio files
  • Playing audio files from the Android app (see the sketch below)
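
A minimal sketch of playing one of the pre-recorded clips from the app with Android's MediaPlayer. The class name and the R.raw resource name are placeholders for whatever we end up calling the files.

    import android.content.Context;
    import android.media.MediaPlayer;

    public class InstructionPlayer {

        private final Context context;
        private MediaPlayer player;

        public InstructionPlayer(Context context) {
            this.context = context;
        }

        // Plays one clip, releasing any clip that is still playing.
        public void play(int rawResId) {
            if (player != null) {
                player.release();
            }
            player = MediaPlayer.create(context, rawResId);  // e.g. R.raw.turn_full_right (placeholder name)
            player.setOnCompletionListener(MediaPlayer::release);
            player.start();
        }
    }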

 

Weekly Update 11/12 – 11/18

What we worked on as a group

  • Code implementation for parking algorithm
  • Preliminary testing for state detection accuracy

  • Preliminary draft for instruction generation
    • Verbal instruction
    • Plot of desired position on our obstacle outline

 

Hubert

  • Continued work on headless start
    • The emulator doesn't have cellular connectivity and cannot create a mobile hotspot
  • Plotting the desired location for the initial parallel position

  • Adding code infrastructure for debugging and for plotting destination car positions.

Zilei

  • Set up code structure for parking algorithm
  • State detection & handling different cases
    • Required environment: existence of a reference vehicle,
    • display debug statements to textfield
    • Idle state
      • idle, waiting for button to start
      • could potentially add beeping when vehicle gets too close to surrounding
    • Search state
      • the most complicated of all
      • Will rely on readings from
        • a designated parking sensor on the rear right end of the vehicle to determine whether the car is parallel with the reference car in wrt the y-axis
        • all right sensors & parking sensor for when the distance is too close, in range, too far, or the reference car completely out of sight
          • angle with respect to the reference car
    • Reverse right state
    • Reverse left state
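
A rough Java sketch of the Search-state readings described above. The thresholds and the exact interpretation of "parallel with respect to the y-axis" are placeholder assumptions for illustration, not measured values.

    public class SearchStateDetector {

        public enum LateralDistance { TOO_CLOSE, IN_RANGE, TOO_FAR, OUT_OF_SIGHT }

        // Placeholder thresholds in centimeters; real values will come from testing.
        private static final double MIN_GAP_CM = 40;
        private static final double MAX_GAP_CM = 90;
        private static final double OUT_OF_SIGHT_CM = 300;

        // Classifies the gap to the reference car from the right-side sensors & parking sensor.
        public LateralDistance classifyGap(double[] rightSensorCm, double parkingSensorCm) {
            double nearest = parkingSensorCm;
            for (double d : rightSensorCm) {
                nearest = Math.min(nearest, d);
            }
            if (nearest > OUT_OF_SIGHT_CM) return LateralDistance.OUT_OF_SIGHT;
            if (nearest < MIN_GAP_CM)      return LateralDistance.TOO_CLOSE;
            if (nearest > MAX_GAP_CM)      return LateralDistance.TOO_FAR;
            return LateralDistance.IN_RANGE;
        }

        // One possible reading of "parallel w.r.t. the y-axis": the rear-right parking
        // sensor currently sees the reference car within range.
        public boolean isAlongsideReference(double parkingSensorCm) {
            return parkingSensorCm <= OUT_OF_SIGHT_CM;
        }
    }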

Yuqing

  • Restart & end algorithm buttons
    • change text
    • communicate with the algorithm class

  • Researched packages available for playing verbal instructions
  • Integrated one sample sound into the app
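
A minimal sketch of how the restart / end buttons could hook into the algorithm class. ParkingControl and its restart()/stop() methods are assumed names standing in for the real algorithm class interface.

    import android.widget.Button;

    public class ControlButtons {

        // Placeholder for the algorithm class the buttons talk to.
        public interface ParkingControl {
            void restart();  // reset the algorithm back to its Idle state
            void stop();     // end the current parking session
        }

        public void wire(Button restartButton, Button endButton, ParkingControl algorithm) {
            restartButton.setOnClickListener(v -> {
                algorithm.restart();
                restartButton.setText("Restart");  // button text changes once parking has started
            });
            endButton.setOnClickListener(v -> {
                algorithm.stop();
                endButton.setText("Ended");
            });
        }
    }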

Weekly Update 11/04 – 11/11

What we did as a team:

  • Integrated all parts of the system and demoed the first version of the app (without parking instruction) on an Android device
  • Experimentally determined the exact reference points used when generating parallel parking instructions
    • We designated a parking spot in an empty lot
    • Yuqing drove the testing vehicle while Zilei and Hubert experimented with giving different instructions at different reference points
    • Iterated the process with different reference points and different instructions and decided on the first version of our reference points
    • Testing setup
    • What we found out:
      • The parking spot needs to have dimensions of at least 3m x 1.2m
      • The exact point at which the driver needs to turn the steering wheel all the way to the right
      • The exact point at which the driver needs to turn the steering wheel all the way to the left
      • Following the systematic approach above results in successful parallel parking into the allocated spot most of the time

Hubert

  • Updated the ultrasonic sensor driver to minimize the delay in outline plotting
  • Refined the gyroscope driver to handle errors that occur during accidental disconnection

Zilei

  • Drafted a flow chart for Parking Instruction Generation algorithm

  • Helped experimentally determine the reference points for the testing vehicle in real life
    • Removal of the “straight reverse” stage
    • Determined the approximate size of the spot we need
    • When to steer left from full right

Yuqing

  • Refined the path plotting algorithm to give a more realistic projection of the car's path by measuring the curvature offset of the path in the video image
  • Fixed multiple bugs in the current plotting algorithm

Weekly Update 10/29 – 11/03

What we did as a group

  • Reconnected jumper wires
  • Changed our network protocol so that no socket is reused and the RPI can handle the video streaming between socket reads/writes. This solved our previous problem of not being able to stream & draw at the same time (see the sketch after this list)
  • Changed the ultrasonic sensor pull rate so that (1) no outdated data is used to draw the contour and (2) the delay for each reading is lower. We improved our reading delay significantly.
  • Made progress on the path prediction shown on the outline graph and the path prediction overlay on the streamed video. Testing still needs to be done next week to fine-tune these.
  • Prepared for the midpoint demo: updated our schedule & distributed work for the next 5 weeks
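
A minimal Java sketch of the "no socket is reused" idea: each outline update opens a fresh connection, reads once, and closes, leaving the RPI free to run the streaming code in between. The host, port, and line-based message format are assumptions for illustration.

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.io.PrintWriter;
    import java.net.Socket;
    import java.nio.charset.StandardCharsets;

    public final class OutlineClient {

        private static final String RPI_HOST = "192.168.43.1"; // placeholder address
        private static final int RPI_PORT = 5000;              // placeholder port

        // Requests one batch of sensor readings; the socket is closed and never reused.
        public static String fetchReadings() throws Exception {
            try (Socket socket = new Socket(RPI_HOST, RPI_PORT);
                 PrintWriter out = new PrintWriter(socket.getOutputStream(), true);
                 BufferedReader in = new BufferedReader(
                         new InputStreamReader(socket.getInputStream(), StandardCharsets.UTF_8))) {
                out.println("GET_READINGS");   // hypothetical request message
                return in.readLine();          // e.g. a comma-separated list of distances
            }
        }
    }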

Hubert

  • Rearranged the socket code so that the RPI can execute stream-related code when the outline drawing isn't requesting data
  • Worked on the Arduino code for the ultrasonic sensor sampling algorithm (sketched below)
    • Changed continuous write to pull-based write
    • Changed the median method to sample fewer values, lowering delay without a noticeable decrease in accuracy
    • Worked around a serial race condition to decrease delay
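
The sampling code itself runs on the Arduino, but the pull-based, median-of-few-samples idea is easy to sketch; the Java below only illustrates the approach, and readDistanceOnceCm() is a hypothetical stand-in for a single ultrasonic ping.

    import java.util.Arrays;

    public class PullBasedSampler {

        private static final int SAMPLES_PER_PULL = 3; // fewer samples -> lower delay

        // Called only when a reading is pulled, instead of writing continuously.
        public double readDistanceCm() {
            double[] samples = new double[SAMPLES_PER_PULL];
            for (int i = 0; i < SAMPLES_PER_PULL; i++) {
                samples[i] = readDistanceOnceCm();
            }
            Arrays.sort(samples);
            return samples[SAMPLES_PER_PULL / 2]; // median rejects occasional outliers
        }

        // Placeholder for one ultrasonic ping; on the Arduino this times an echo pulse.
        private double readDistanceOnceCm() {
            return 0.0; // hypothetical; replaced by real sensor access
        }
    }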

Yuqing

  • Debugged the socket connection & analyzed the problem
  • Worked on front end and aesthetics
    • Buttons
    • Shaded area
  • Worked on the path prediction overlay & finished the path prediction graph on the outline

Zilei

  • Debugged the socket connection & analyzed the problem
  • Debugged the Arduino readings after the change from continuous write to pull-based
  • Prepared for the demo & revised the Gantt chart