Sunday, January 2, 2011

Project Session 4

Date:
02/01/2011

Duration of activity:
5 hours

Group members participating:
Christian & Lasse

Goals:
Get an understanding of the possibilities for more advanced sensor fusion [5], and implement a suitable solution.

Plan:
1. Use the final code from last session, and try to improve the result using calibration and tuning of system parameters.
2. Investigate the possibilities for using an established filter for sensor fusion.
3. Implement a new balancing routine using a complementary filter.

Results:

1. Using calibration and tuning
Our efforts from last session showed that resetting the gyro angle based on an accelerometer reading didn't improve the balancing much. The robot (and accelerometer) was too unsteady for the gyro angle to be reset often enough. We will give the strategy a last try by calibrating the offset at startup and tuning the PID values, thereby hopefully giving the accelerometer better conditions for resetting the angle.

To make the robot move more smoothly we removed the normalization step when regulating the motors. This makes it easier for the robot to make small angle adjustments instead of the more oscillating behaviour we saw with normalization. Furthermore, with normalization it is not possible for the robot to stand still.

We did some PID experiments by tuning the values manually. It was hard to observe any improvements because the gyro drift still influenced the resulting behaviour. Video 1 shows an example of a simple tuning where P is set to 35 and I and D are both set to 0. Here it is obvious that the P-value is too high, but the only way to find the right value is by experimenting. After a bit of tuning we concluded that a P-value around 30 and I and D values under 4 seemed to make the robot fairly steady, but unfortunately not steady enough for the accelerometer to reset the gyro angle frequently enough. So we gave up the idea of using this method for sensor fusion.



Video 1 - Robot with PID values: P=35, I=0, D=0
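For reference, the kind of PID regulation we were tuning can be sketched as below (Java; readAngle() and setMotorPower() are placeholders standing in for our own sensor and motor code, not real leJOS calls, and the constants are merely in the range we ended up with):

// Illustrative PID sketch; readAngle() and setMotorPower() are placeholders
// standing in for our own sensor and motor code.
class PidSketch {
    static double readAngle() { return 0; }      // placeholder: current tilt in degrees
    static void setMotorPower(double power) { }  // placeholder: drive both motors

    public static void main(String[] args) {
        double kP = 30, kI = 2, kD = 2;          // roughly the range that seemed to work
        double integral = 0, lastError = 0;
        while (true) {
            double error = readAngle();          // setpoint is 0 degrees (upright)
            integral += error;
            double derivative = error - lastError;
            lastError = error;
            setMotorPower(kP * error + kI * integral + kD * derivative);
        }
    }
}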

2. Filters for sensor fusion
We have found a great explanation of how to use an accelerometer and a gyro for balancing by means of filters [1]. The source includes a tutorial on filters for balancing plus C source code, which motivated us to use filters, since we (as IT engineers) have only minimal knowledge of digital filters. Below is a brief description of the most obvious ways of combining the accelerometer and the gyro for balancing. For further explanation, see the source mentioned above.

2.1 Accelerometer-assisted gyro

This was our self-invented method and our first attempt at combining the accelerometer and the gyro; it is sketched in Figure 1.

Figure 1 - The concept behind the first attempt to combine accelerometer and gyro.

Here the gyro provides both the angle and the angular velocity. The angle is derived by integration and is simply calculated as

angle += gyro_velocity * time_since_last_sample

which is analogous to simple physics, where distance travelled = speed * time taken; the calculated angle corresponds to the distance travelled.

As explained earlier, the accelerometer sets the angle to 0 when it detects that the robot is upright, which keeps the gyro angle from drifting away.
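A minimal sketch of this idea could look like the code below (the two sensor methods are placeholders for our own sensor code, not real leJOS calls):

// Sketch of the accelerometer-assisted gyro; both sensor methods are placeholders.
class AssistedGyroSketch {
    static double readGyroVelocity() { return 0; }              // angular velocity in deg/s
    static boolean accelerometerSaysUpright() { return false; } // placeholder upright check

    static double angle = 0;

    static void update(double dtSeconds) {
        angle += readGyroVelocity() * dtSeconds;   // integrate the angular velocity
        if (accelerometerSaysUpright()) {
            angle = 0;                             // reset to cancel the accumulated drift
        }
    }
}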

2.2 Complementary filter
The complementary filter aims to eliminate both the noise on the accelerometer and the drift in the gyro. This is done by using a low-pass filter on the accelerometer, thereby eliminating short-term fluctuations, while the gyro drift is cancelled out by a high-pass filter, as shown in Figure 2.


Figure 2 - An overview of the complementary filter used for balancing

The idea is to use both the accelerometer and the gyro for the angle calculation (combined with integration) and to be able to set how much we trust (weight) the gyro and the accelerometer, like:

angle = 0.98 * (angle + gyro_velocity * time_since_last_sample) + 0.02 * accelerometer_angle

Here the complementary part is illustrated by the weights 0.98 and 0.02, which add up to 1. The larger the value in front of the gyro term (the high-pass filter), the more drift is let through, and the less weight is put on the accelerometer-derived angle (the low-pass filter).
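In code, one update step of such a filter could look roughly like the sketch below (the read methods are placeholders for our own sensor classes):

// One update step of a complementary filter; the read methods are placeholders.
class ComplementaryFilterSketch {
    static double readGyroVelocity() { return 0; }         // deg/s from the gyro
    static double readAccelerometerAngle() { return 0; }   // degrees from the accelerometer

    static double angle = 0;

    static void update(double dtSeconds) {
        double gyroEstimate = angle + readGyroVelocity() * dtSeconds;  // integrated gyro (high-pass part)
        double accelEstimate = readAccelerometerAngle();               // absolute but noisy (low-pass part)
        angle = 0.98 * gyroEstimate + 0.02 * accelEstimate;            // the weights add up to 1
    }
}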

2.3 Kalman filter
Figure 3 shows a sketch of the third and most advanced filter, the Kalman filter. As mentioned in session 3, we haven't studied the filter in depth because of its complexity and the time it would take to understand it sufficiently.


Figure 3 - Overall view of the concept for fusing accelerometer and gyro with the use of the Kalman filter

The overall concept is to use measurements observed over time, which contain noise and other inaccuracies, to produce values that tend to be closer to the true values of the measurements and their associated calculated values. This is done by using an internal state model, shown in context in Figure 3 and in more detail in Figure 4.

Figure 4 - Model underlying the Kalman filter. Squares represent matrices. Ellipses represent multivariate normal distributions (with the mean and covariance matrix enclosed). Unenclosed values are vectors (image and text from Wikipedia [2]).
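Purely for illustration (this is not something we have implemented), a heavily simplified single-state Kalman filter fusing the two sensors could look like the sketch below. The gyro drives the predict step and the accelerometer angle is used as the measurement in the update step; a full version would also estimate the gyro bias, and the noise constants Q and R here are made-up values:

// Illustrative single-state Kalman filter sketch (not something we have implemented).
class KalmanSketch {
    static double angle = 0;        // state estimate in degrees
    static double P = 1;            // variance of the estimate
    static final double Q = 0.001;  // process noise - how much we trust the gyro
    static final double R = 0.03;   // measurement noise - how much we trust the accelerometer

    static void update(double gyroVelocity, double accelAngle, double dtSeconds) {
        // Predict: propagate the state with the gyro and let the uncertainty grow
        angle += gyroVelocity * dtSeconds;
        P += Q;
        // Update: weigh in the accelerometer measurement through the Kalman gain
        double K = P / (P + R);
        angle += K * (accelAngle - angle);
        P = (1 - K) * P;
    }
}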

2.4 Our choice
After studying the possibilities we ended up choosing the complementary filter because it seemed like a reasonable compromise between our failed self-invented method and the Kalman filter. The complementary filter is said to have nearly the same performance as the Kalman filter while being less processor intensive [1]. We therefore decided to try out the complementary filter; if its performance turns out to be unsatisfactory, the Kalman filter will be our last resort.

3. Using the complementary filter
In order to derive the complementary angle we needed to calculate the angle using the accelerometer. This was done with trigonometry involving measurements from two axes. Apart from that, variables for converting from radians to degrees and for compensating for the accelerometer offset were also needed in the class Accelerometer [4]. The angle_offset_scale was needed because there was a small offset in the angle calculation; its value was found through experiments.
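The real code is in the Accelerometer class [4]; a simplified sketch of the calculation could look like the code below (the axis-reading methods and the concrete value of angle_offset_scale are placeholders, not our calibrated values):

// Sketch of the accelerometer angle calculation; readX()/readY() and the
// offset value are placeholders for the real Accelerometer code [4].
class AccelerometerSketch {
    static final double RADIANS_TO_DEGREES = 180.0 / Math.PI;
    static final double angle_offset_scale = 1.0;   // placeholder; found through experiments

    static double readX() { return 0; }   // placeholder: acceleration along one axis
    static double readY() { return 0; }   // placeholder: acceleration along the other axis

    static double getAngle() {
        // Trigonometry on the two axes gives the tilt relative to gravity
        return Math.atan2(readX(), readY()) * RADIANS_TO_DEGREES * angle_offset_scale;
    }
}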

To actually join the two sensors in code we created a class named CumulativeSensor [3] with the responsibility of controlling the sampling of the gyro and accelerometer sensors, and making the complementary angle publicly accessible:
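The real implementation is in the linked source [3]; stripped to its essence, the class looks something like the sketch below (method and parameter names are ours for illustration, and the actual sensor sampling is left out):

// Simplified sketch of CumulativeSensor: it samples both sensors and keeps the
// complementary angle up to date; sensor access is passed in as parameters here.
class CumulativeSensorSketch {
    private double angle = 0;

    // Called once per control-loop iteration with fresh sensor readings.
    public void sample(double gyroVelocity, double accelerometerAngle, double dtSeconds) {
        angle = 0.98 * (angle + gyroVelocity * dtSeconds) + 0.02 * accelerometerAngle;
    }

    // The complementary angle, publicly accessible.
    public double getAngle() {
        return angle;
    }
}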




After having the structure in place we decided to remove unnecessary code, e.g. writing to the robot's LCD. We discovered that writing to the LCD panel consumes quite a lot of processor time; by reducing the number of writes from 5 to 2, the duration of the control loop was reduced from approximately 35 ms to approximately 25 ms.
The Bluetooth thread, however, didn't have any measurable influence on execution time.

Afterwards we added the accelerometer angle, the gyro velocity and the overall filtered angle to the PC application in order to monitor the variables. We noticed that the overall angle calculation was very slow to catch up with changes in the accelerometer angle. Figure 5 shows the result for the calculated accelerometer angle, the gyro velocity and the overall filtered angle.

Figure 5 - Monitoring of the three values: gyro velocity (yellow), calculated angle from the accelerometer (red), and the complementary angle from CumulativeSensor (green)

Consequently, one of the goals for the next session will be to improve the response time; as it is now, it is far too slow for the robot to be able to balance. We are quite sure that the lag isn't caused by limitations in the CPU speed of the NXT but by a correctable error in the filter. Apart from the slow response time, the accelerometer noise and the gyro drift seem less noticeable, so we are on the right track.

Conclusion:
This session has been very rewarding; we have dropped the self-invented method for sensor fusion and found a method that seems very promising. There are still some hurdles to overcome, but we are quite sure that it is possible to reach our goal using the chosen complementary filter. Furthermore, experiments still need to be done to reach an acceptable balance between gyro drift and accelerometer noise.

The code for this session is found here: NXT [6] and PC [7].

References:

[1] The DIY Segway, filter.pdf in http://web.mit.edu/first/segway/segspecs.zip, MIT, 2008
[2] Kalman Filter - http://en.wikipedia.org/wiki/Kalman_filter, Wikipedia
[5] Fred G. Martin, Robotic Explorations: A Hands-on Introduction to Engineering, Chapter 5, p. 207, Prentice Hall, 2001
[6] L. Rasmussen, F. Laulund & C. Jensen, Session 4 source code on NXT, http://dl.dropbox.com/u/2389829/Lego/Session%204/Session%204%20NXT%20source%20code.zip
[7] L. Rasmussen, F. Laulund & C. Jensen, Session 4 source code on PC, http://dl.dropbox.com/u/2389829/Lego/Session%204/Session%204%20PC%20source%20code.zip