Team report
This week, Elias worked mainly on tag detection. He finished refining the tag detection algorithm, and the accuracy of detecting the tag in a still image frame is now above 95%. In addition, with a probabilistic model that predicts the location of the tag in the next frame, the algorithm can now continuously track the tag in a moving video stream for more than 20 seconds.
After spring break, Elias will try to incorporate the Matthew-Baker tracking method to further improve tracking accuracy.
Meanwhile, Zhiqi has set up the PID controller in MATLAB. The basic code and visualization tools are in place, and he plans to tune the parameters with real-world numbers in the next few weeks. The controller regulates the speed of the motors, so we need measurements from the real motors to tune the system properly. Once speed control is working, we will also look at turning and braking.
Zhiqi also finished setting up the IMU sensor. Next, he will work on calibrating the sensor on the suitcase, which requires installing and integrating the sensors.
Right now we are still on schedule, and we hope to stay on schedule after spring break.
Zhiqi's report
Recall from last week that I have been designing the PID controller. This week, I continued researching the PID controller and its implementation. Recall also that last week we defined the term PID, which stands for proportional, integral, derivative.
These are the three components of the controller and of the code. Each component is computed as a function of the error reported by the sensor on the motor; for example, F_I(e) is the integral control function, which is a function of the error e. The error here is defined as the difference between the measured value and the set point (the ideal value), and it is caused by disturbances. In its standard form, the full control law can be expressed as follows:
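u(t) = Kp * e(t) + Ki * ∫ e(τ) dτ + Kd * de(t)/dt

Here u(t) denotes the controller output sent to the motor and e(t) is the error defined above; this is the standard textbook form of the PID law, and Kp, Ki, and Kd are the gains that appear in the code below.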
MATLAB provides a great library (the Control System Toolbox) for control theory and its applications. My main implementation will also be in MATLAB, which makes visualization and tuning easy.
% P_motor is the motor's transfer function, which still has to be identified from the real hardware.
Kp = 1;   % proportional gain (placeholder value, to be tuned)
Ki = 1;   % integral gain (placeholder value, to be tuned)
Kd = 1;   % derivative gain (placeholder value, to be tuned)
C = pid(Kp, Ki, Kd);              % build the PID controller object
sys_cl = feedback(C*P_motor, 1);  % close the loop with unity feedback
By assigning values to Kp, Ki, and Kd, we set the gain of each of the three components. pid(Kp, Ki, Kd) constructs the PID controller, and the feedback function closes the loop so that the error is fed back into the input. We can then use step(sys_cl, range) to visualize the step response of the closed-loop system. Recall from last week that increasing the proportional gain Kp reduces the steady-state error, but it often also increases the overshoot, and a large Ki greatly increases the overshoot as well.
This requires several rounds of parameter tuning so that we can find a set of parameters that works well for our use case. For example, the sketch below shows how such a comparison could look in MATLAB.
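Since we have not yet identified the real motor model, a placeholder second-order plant stands in for P_motor here, and the gains are illustrative only, not our actual parameters:

s = tf('s');
P_motor = 1/(s^2 + 10*s + 20);      % placeholder plant, NOT our real motor model
C1 = pid(300, 0, 0);                % proportional-only controller
C2 = pid(350, 300, 50);             % full PID controller with illustrative gains
step(feedback(C1*P_motor, 1), feedback(C2*P_motor, 1), 2);   % compare the two closed-loop step responses
legend('P only', 'PID');            % look at rise time, overshoot, and steady-state error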
Next, I am going to work on tuning the parameters on the real motors by setting them up and connecting them to the computer.
Elias’s report
This week, I finished the tag detection part of the project. At first, I tried to use the OpenCV AprilTag module to locate the tag in an image frame, but it turned out that the accuracy was not high enough to meet our design specification. Therefore, I searched for other methods to detect AprilTags and tried edge/contour detection: I first detect all rectangular contours in the frame and then match them against the AprilTag detections. This greatly increased the accuracy. I passed 40 images with different AprilTags to the algorithm and 38 were detected correctly, so we finally met our design requirement.
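To make the idea concrete, here is a rough sketch of the rectangle-filtering step, written in MATLAB only for consistency with the rest of this report; the actual implementation uses OpenCV, and the test image and thresholds are made up:

frame = imread('peppers.png');              % any RGB test image; in our case a camera frame
gray = rgb2gray(frame);
bw = imbinarize(gray);                      % threshold to a binary image
stats = regionprops(bw, 'BoundingBox', 'Area', 'Extent');
keep = [stats.Area] > 400 & [stats.Extent] > 0.8;   % large blobs that nearly fill their bounding box
candidates = stats(keep);                   % roughly rectangular regions that could be a tag
% Each candidate region is then cropped and passed to the AprilTag decoder,
% and only regions that decode to a valid tag are kept as detections.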
However, another problem appeared. When I input a stream of images, i.e. a video stream, the accuracy dropped: when the tag is moving at some speed, not every frame captured by the camera shows the whole tag clearly; some frames cut it off slightly and some are blurred by the motion. Contour/edge detection can fail in those frames, so I improved the algorithm by adding a probabilistic model that guesses where the tag could be in the next image frame, and I made the tracker more tolerant of gaps between good detections. For example, if there is no good detection in the next two frames, I keep updating the guessed region where the tag could be, and when a later frame does produce a detection, I match it against the probabilistic model's guess to decide whether it is actually the AprilTag we are following.
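A minimal sketch of the per-frame prediction logic is below, assuming a simple constant-velocity model; the variable names and numbers are hypothetical, and the real probabilistic model is more involved:

pos = [320 240]; vel = [4 0];               % last known tag center and per-frame velocity (example numbers)
baseRadius = 40; missedFrames = 0;          % base search radius and count of consecutive missed detections
detection = [325 241];                      % center reported by the contour/AprilTag step, or [] if none
predicted = pos + vel;                      % constant-velocity guess for where the tag should be next
searchRadius = baseRadius * (1 + missedFrames);   % widen the search area after every missed frame
if ~isempty(detection) && norm(detection - predicted) < searchRadius
    vel = detection - pos;                  % the detection matches the guess: update the motion estimate
    pos = detection;
    missedFrames = 0;
else
    pos = predicted;                        % no usable detection: keep propagating the guess
    missedFrames = missedFrames + 1;
end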
There is another thing I can work on. Currently, detection and tracking work fine with only one user in the frame, but what if one of the robo-suitcases sees multiple users? I could also check the tag ID so that, when there are multiple tags, we know which one belongs to our user. If we were to actually commercialize this project, we might design our own set of tags, since there could also be AprilTags with the same coding in the background.
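A tiny sketch of that check (the detections struct and userTagID are hypothetical names; AprilTag decoders already report an ID for every decoded tag):

detections = struct('id', {3, 7, 3});       % example: three tags decoded in one frame
userTagID = 7;                              % the ID assigned to our user's tag
ours = detections([detections.id] == userTagID);   % keep only the tag that belongs to our user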
After spring break, I will work on adding the Matthew-Baker alignment tracking method to get better tracking accuracy, and then we can start path planning.