Nonlinear Kalman Filtering With Divergence Minimization

Authors: San Gultekin, John W. Paisley
Journal/Conference Name: IEEE Transactions on Signal Processing
Paper Category: Signal Processing
Paper Abstract: We consider the nonlinear Kalman filtering problem using Kullback–Leibler (KL) and $\alpha$-divergence measures as optimization criteria. Unlike linear Kalman filters, nonlinear Kalman filters do not have closed-form Gaussian posteriors because of a lack of conjugacy due to the nonlinearity in the likelihood. In this paper, we propose novel algorithms to approximate this posterior by optimizing the forward and reverse forms of the KL divergence, as well as the $\alpha$-divergence that contains these two as limiting cases. Unlike previous approaches, our algorithms do not make approximations to the divergences being optimized, but use Monte Carlo techniques to derive unbiased algorithms for direct optimization. We assess performance on radar and sensor tracking, and options pricing, showing general improvement over the extended, unscented, and ensemble Kalman filters, as well as competitive performance with particle filtering.
Date of publication: 2017
Code Programming Language: Matlab
Comment:
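To illustrate the core idea in the abstract, the sketch below fits a Gaussian $q(x)=\mathcal{N}(\mu,\sigma^2)$ to the posterior of a one-dimensional nonlinear measurement model by minimizing a Monte Carlo estimate of the reverse KL divergence with reparameterized gradients. This is a minimal Python sketch of the general technique, not the authors' Matlab code or their exact algorithm; the `tanh` measurement function, the prior/noise values, and the plain-SGD optimizer are all assumptions chosen for illustration.

```python
import numpy as np

# Illustrative sketch (not the paper's implementation): Gaussian posterior
# approximation for a nonlinear measurement model via Monte Carlo reverse-KL
# minimization, using the reparameterization x = mu + sigma * eps.

rng = np.random.default_rng(0)

# Assumed model: prior x ~ N(m0, p0), observation y = tanh(x) + N(0, r).
m0, p0, r = 0.0, 1.0, 0.1
y = 0.5  # hypothetical observed measurement

def grad_neg_log_post(x):
    """Gradient of the (unnormalized) negative log posterior w.r.t. x."""
    sech2 = 1.0 - np.tanh(x) ** 2              # derivative of tanh
    return (x - m0) / p0 - (y - np.tanh(x)) * sech2 / r

# Variational parameters of q(x) = N(mu, sigma^2), sigma = exp(log_sigma).
mu, log_sigma = 0.0, 0.0
lr, n_steps, n_samples = 0.05, 2000, 32
for _ in range(n_steps):
    sigma = np.exp(log_sigma)
    eps = rng.standard_normal(n_samples)
    x = mu + sigma * eps                        # reparameterized samples
    g = grad_neg_log_post(x)                    # pathwise gradient samples
    grad_mu = g.mean()
    # The -1 comes from the entropy term -log(sigma) of the reverse KL.
    grad_log_sigma = (g * eps).mean() * sigma - 1.0
    mu -= lr * grad_mu
    log_sigma -= lr * grad_log_sigma

sigma = np.exp(log_sigma)
print(f"approximate posterior: N({mu:.3f}, {sigma:.3f}^2)")
```

Because the divergence itself is estimated by Monte Carlo rather than approximated analytically, the gradient estimates are unbiased, which is the property the abstract highlights relative to the extended and unscented filters.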