A Kalman Filter for Robust Outlier Detection
Jo-Anne Ting, Evangelos Theodorou, Stefan Schaal
Computational Learning & Motor Control Lab, University of Southern California
IROS 2007, October 31, 2007
Outline
• Motivation
• Quick review of the Kalman filter
• Robust Kalman filtering with Bayesian weights
• Experimental results
• Conclusions
Motivation
• Consider real-time applications where storing data samples may not be a viable option due to the high frequency of sensory data
• In systems where high-quality sensory data is needed, reliable detection of outliers is essential for optimal performance (e.g., legged locomotion)
• The Kalman filter (Kalman, ’60) is commonly used for real-time tracking, but it is not robust to outliers!
Previous Methods (robust Kalman filter approaches and their drawbacks)

1) Use non-Gaussian distributions for random variables (Sorenson & Alspach ’71; West ’82)
   Drawback: complicated parameter estimation for systems with transient disturbances
2) Model observation & state noise as non-Gaussian, heavy-tailed distributions (Masreliez ’75)
   Drawback: difficult & involved filter implementation
3) Use resampling or numerical integration (Kitagawa ’87)
   Drawback: heavy computation, not suitable for real-time applications
4) Use a robust least-squares approach & model weights with heuristic functions (e.g., Durovic & Kovacevic ’99)
   Drawback: need to determine the optimal values of open parameters
A Quick Review of the Kalman Filter
• The system equations for the Kalman filter are as follows:

  z_k = C θ_k + v_k        (C: observation matrix)
  θ_k = A θ_{k-1} + s_k    (A: state transition matrix)

  Observation noise: v_k ~ Normal(0, R)
  State noise:       s_k ~ Normal(0, Q)
Standard Kalman Filter Equations

Propagation:
  θ'_k = A θ_{k-1}
  Σ'_k = A Σ_{k-1} A^T + Q

Update:
  S'_k = (C Σ'_k C^T + R)^{-1}
  K'_k = Σ'_k C^T S'_k
  θ_k  = θ'_k + K'_k (z_k − C θ'_k)
  Σ_k  = (I − K'_k C) Σ'_k

• Can use an ML framework to estimate the system dynamics (Myers & Tapley, 1976)
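The propagation and update equations above translate directly into a few lines of NumPy. The following is a minimal sketch (the function name and the scalar example are illustrative, not from the paper):

```python
import numpy as np

def kalman_step(theta, Sigma, z, A, C, Q, R):
    """One propagation + update step of the standard Kalman filter.
    theta, Sigma: posterior mean and covariance from step k-1."""
    # Propagation: theta'_k = A theta_{k-1},  Sigma'_k = A Sigma A^T + Q
    theta_p = A @ theta
    Sigma_p = A @ Sigma @ A.T + Q
    # Update: S'_k, K'_k, theta_k, Sigma_k as on the slide
    S = np.linalg.inv(C @ Sigma_p @ C.T + R)
    K = Sigma_p @ C.T @ S
    theta_k = theta_p + K @ (z - C @ theta_p)
    Sigma_k = (np.eye(len(theta)) - K @ C) @ Sigma_p
    return theta_k, Sigma_k

# Scalar example: A = C = 1, no state noise, unit observation noise.
theta_k, Sigma_k = kalman_step(np.array([0.0]), np.array([[1.0]]),
                               np.array([2.0]), np.array([[1.0]]),
                               np.array([[1.0]]), np.array([[0.0]]),
                               np.array([[1.0]]))
```

In the scalar example the prior and observation have equal variance, so the posterior mean lands halfway between prior mean and observation.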
Robust Kalman Filtering with Bayesian Weights
• Use a weighted least-squares approach & learn the optimal weights:

  z_k | θ_k, w_k ~ Normal(C θ_k, R / w_k)
  θ_k | θ_{k-1}  ~ Normal(A θ_{k-1}, Q)
  w_k            ~ Gamma(a_{w_k}, b_{w_k})

[Figure: graphical models. Left: the standard Kalman filter, with states θ_{k-1}, θ_k and observations z_{k-1}, z_k. Right: the robust Kalman filter with Bayesian weights, adding per-time-step parameters A_k, Q_k, C_k, R_k and weights w_{k-1}, w_k on the observations.]
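A small sampler makes the role of the weights concrete: dividing R by a Gamma-distributed w_k inflates the noise variance whenever w_k is small, so the marginal observation noise is heavier-tailed than a Gaussian (in fact Student-t). The parameter values below are arbitrary, chosen only for this demonstration; note NumPy's Gamma takes a scale, so rate b becomes scale 1/b:

```python
import numpy as np

rng = np.random.default_rng(0)
a, b, R = 3.0, 3.0, 0.1                      # illustrative prior parameters
w = rng.gamma(shape=a, scale=1.0 / b, size=100_000)   # w_k ~ Gamma(a, b)
noise = rng.normal(0.0, np.sqrt(R / w))      # v_k | w_k ~ Normal(0, R / w_k)
# Small w_k -> large variance R / w_k: these samples are the "outliers".
```

The empirical variance of `noise` exceeds R, reflecting the extra spread contributed by the low-weight samples.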
Inference Procedure
• We can treat this as an EM learning problem (Dempster & Laird, ’77):

  Maximize log ∏_{i=1}^N p(θ_i, z_i, w_i)

• We use a variational factorial approximation of the true posterior distribution:

  Q(w, θ) = ∏_{i=1}^N Q(w_i) ∏_{i=1}^N Q(θ_i | θ_{i-1}) Q(θ_0)

to get analytically tractable inference (e.g., Ghahramani & Beal, ’00).
Robust Kalman Filter Equations

Robust Kalman filter:
Propagation:
  θ'_k = A_k θ_{k-1}
  Σ'_k = Q_k
Update:
  S'_k = (C_k Σ'_k C_k^T + R_k / w_k)^{-1}
  K'_k = Σ'_k C_k^T S'_k
  θ_k  = θ'_k + K'_k (z_k − C_k θ'_k)
  Σ_k  = (I − K'_k C_k) Σ'_k
  w_k  = (a_{wk0} + 1/2) / (b_{wk0} + (1/2) (z_k − C_k θ_k)^T R_k^{-1} (z_k − C_k θ_k))

Compare to the standard Kalman filter:
Propagation:
  θ'_k = A θ_{k-1}
  Σ'_k = A Σ_{k-1} A^T + Q
Update:
  S'_k = (C Σ'_k C^T + R)^{-1}
  K'_k = Σ'_k C^T S'_k
  θ_k  = θ'_k + K'_k (z_k − C θ'_k)
  Σ_k  = (I − K'_k C) Σ'_k
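Because the weight w_k depends on the state estimate θ_k and vice versa, a practical implementation alternates the two updates for a few iterations within each time step. The sketch below assumes fixed A, C, Q, R and Gamma prior parameters a0, b0 (the function name and iteration count are illustrative):

```python
import numpy as np

def robust_kalman_step(theta_prev, z, A, C, Q, R, a0=1.0, b0=1.0, n_iter=10):
    """One step of the outlier-robust Kalman filter (sketch)."""
    theta_p = A @ theta_prev              # theta'_k
    Sigma_p = Q.copy()                    # Sigma'_k = Q_k
    R_inv = np.linalg.inv(R)
    w = 1.0                               # initial weight guess
    for _ in range(n_iter):
        S = np.linalg.inv(C @ Sigma_p @ C.T + R / w)   # S'_k
        K = Sigma_p @ C.T @ S                          # K'_k
        theta_k = theta_p + K @ (z - C @ theta_p)
        r = z - C @ theta_k                            # residual
        w = (a0 + 0.5) / (b0 + 0.5 * r @ R_inv @ r)    # weight update
    Sigma_k = (np.eye(len(theta_prev)) - K @ C) @ Sigma_p
    return theta_k, Sigma_k, w

# An outlying observation receives a much smaller weight than an inlier.
A = C = Q = R = np.eye(1)
_, _, w_in = robust_kalman_step(np.array([0.0]), np.array([0.1]), A, C, Q, R)
_, _, w_out = robust_kalman_step(np.array([0.0]), np.array([50.0]), A, C, Q, R)
```

The large residual of the outlier drives its weight toward zero, so it is effectively down-weighted in the state update rather than rejected by a hard threshold.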
Important Things to Note
• Our robust Kalman filter:
1) Has the same computational complexity as the standard Kalman filter
2) Is principled & easy to implement (no heuristics)
3) Offers a natural framework to incorporate prior knowledge of the presence of outliers
Real-time Outlier Detection on LittleDog

[Figure: sensory data trace from LittleDog, with outliers marked]

• Our robust KF performs as well as a hand-tuned KF (which required prior knowledge and, hence, is near-optimal)
Conclusions
• We have introduced an outlier-robust Kalman filter that:
1) Is principled & easy to implement
2) Has the same computational complexity as the Kalman filter
3) Provides a natural framework to incorporate prior knowledge of noise
• This framework can be extended to other, more complex nonlinear filters & methods in order to give them automatic outlier detection abilities.
Final Posterior EM Update Equations

E-step:
  Σ_k = (w_k C_k^T R_k^{-1} C_k + Q_k^{-1})^{-1}
  θ_k = Σ_k (Q_k^{-1} A_k θ_{k-1} + w_k C_k^T R_k^{-1} z_k)
  w_k = (a_{wk0} + 1/2) / (b_{wk0} + (1/2) (z_k − C_k θ_k)^T R_k^{-1} (z_k − C_k θ_k))

M-step:
  C_k = (∑_{i=1}^k w_i z_i θ_i^T) (∑_{i=1}^k w_i θ_i θ_i^T)^{-1}
  A_k = (∑_{i=1}^k θ_i θ_{i-1}^T) (∑_{i=1}^k θ_{i-1} θ_{i-1}^T)^{-1}
  r_km = (1/k) ∑_{i=1}^k w_i (z_{i,m} − C_k(m,:) θ_i)^2
  q_kn = (1/k) ∑_{i=1}^k (θ_{i,n} − A_k(n,:) θ_{i-1})^2

These are computed once for each time step k (e.g., Ghahramani & Hinton, 1996).
The M-step equations need to be written in incremental form.
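As a sanity check, the information-form E-step mean above is algebraically equivalent (by the matrix inversion lemma) to the gain-form update from the robust filter equations. The numbers below are arbitrary, chosen only to demonstrate the equivalence for a fixed weight w:

```python
import numpy as np

d = 2
A, C = np.eye(d), np.eye(d)
Q, R = np.diag([1.0, 2.0]), np.diag([0.5, 0.5])
theta_prev = np.array([1.0, -1.0])
z = np.array([1.5, -0.4])
w = 0.7                                    # fixed weight, for illustration

# Information form (E-step): Sigma_k = (w C^T R^-1 C + Q^-1)^-1, etc.
Q_inv, R_inv = np.linalg.inv(Q), np.linalg.inv(R)
Sigma_k = np.linalg.inv(w * C.T @ R_inv @ C + Q_inv)
theta_info = Sigma_k @ (Q_inv @ A @ theta_prev + w * C.T @ R_inv @ z)

# Gain form (robust Kalman filter equations, with Sigma'_k = Q)
theta_p = A @ theta_prev
S = np.linalg.inv(C @ Q @ C.T + R / w)
K = Q @ C.T @ S
theta_gain = theta_p + K @ (z - C @ theta_p)
```

Both expressions yield the same posterior mean, which is why the filter can be run in whichever form is numerically convenient.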
Incremental Version of M-step Equations
• Gather sufficient statistics to re-write the M-step equations in incremental form (i.e., only using values observed or calculated in the current time step, k):

M-step:
  C_k = σ_k^{wzθ} (σ_k^{wθθ})^{-1}
  A_k = σ_k^{θθ'} (σ_k^{θ'θ'})^{-1}
  r_km = (1/k) [ σ_km^{wzz} − 2 C_k(m,:) σ_km^{wzθ} + diag{ C_k(m,:) σ_k^{wθθ} C_k(m,:)^T } ]
  q_kn = (1/k) [ σ_kn^{θ2} − 2 A_k(n,:) σ_kn^{θθ'} + diag{ A_k(n,:) σ_k^{θ'θ'} A_k(n,:)^T } ]

where θ' denotes θ_{i-1} and, e.g., σ_k^{wzθ} = ∑_{i=1}^k w_i z_i θ_i^T.
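The incremental M-step for C_k can be sketched as follows: only the two running sufficient statistics are stored, never the samples, so memory is constant regardless of how long the filter runs. The class and variable names are illustrative; a pseudo-inverse is used so the estimate is defined even before the statistics reach full rank:

```python
import numpy as np

class IncrementalCEstimate:
    """Incremental estimate of C_k from sufficient statistics (sketch)."""
    def __init__(self, d_z, d_theta):
        self.s_wzt = np.zeros((d_z, d_theta))      # sum_i w_i z_i theta_i^T
        self.s_wtt = np.zeros((d_theta, d_theta))  # sum_i w_i theta_i theta_i^T
    def update(self, w, z, theta):
        # Only current-step quantities (w_k, z_k, theta_k) are needed.
        self.s_wzt += w * np.outer(z, theta)
        self.s_wtt += w * np.outer(theta, theta)
        return self.s_wzt @ np.linalg.pinv(self.s_wtt)

# Recover a known observation matrix from noiseless weighted samples.
rng = np.random.default_rng(0)
C_true = np.array([[2.0, 0.5], [-1.0, 3.0]])
est = IncrementalCEstimate(2, 2)
for _ in range(10):
    theta = rng.normal(size=2)
    C_k = est.update(1.0, C_true @ theta, theta)
```

With noiseless observations z_i = C θ_i the weighted least-squares solution recovers C exactly once the statistics have full rank, which the final estimate illustrates. The same accumulator pattern applies to A_k, r_km, and q_kn.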