Nonlinear Kalman Filtering With Divergence Minimization




Authors San Gultekin, John W. Paisley
Journal/Conference Name IEEE Transactions on Signal Processing
Paper Category
Paper Abstract We consider the nonlinear Kalman filtering problem using Kullback–Leibler (KL) and $\alpha$-divergence measures as optimization criteria. Unlike linear Kalman filters, nonlinear Kalman filters do not have closed form Gaussian posteriors because of a lack of conjugacy due to the nonlinearity in the likelihood. In this paper, we propose novel algorithms to approximate this posterior by optimizing the forward and reverse forms of the KL divergence, as well as the $\alpha$-divergence that contains these two as limiting cases. Unlike previous approaches, our algorithms do not make approximations to the divergences being optimized, but use Monte Carlo techniques to derive unbiased algorithms for direct optimization. We assess performance on radar and sensor tracking, and options pricing, showing general improvement over the extended, unscented, and ensemble Kalman filters, as well as competitive performance with particle filtering.
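To illustrate the general idea the abstract describes (and not the paper's actual Matlab implementation), the following is a minimal sketch of fitting a Gaussian approximation to a non-conjugate posterior by minimizing the reverse KL divergence with Monte Carlo reparameterization gradients. The 1-D model, the `sin` measurement function, and all parameter values are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 1-D setup (illustrative only): prior x ~ N(mu0, s0^2),
# nonlinear measurement y = sin(x) + noise with std r.
mu0, s0 = 0.0, 1.0
r = 0.2
x_true = 0.8
y = np.sin(x_true) + r * rng.normal()

def dlogp(x):
    """Gradient of the unnormalized log-posterior log p(x) + log p(y|x)."""
    return -(x - mu0) / s0**2 + (y - np.sin(x)) * np.cos(x) / r**2

# Fit q = N(m, s^2) by minimizing reverse KL(q || p) with Monte Carlo
# reparameterization gradients (x = m + s*eps), optimized by plain SGD.
m, log_s = 0.0, 0.0
lr, n_samples = 0.01, 64
for _ in range(2000):
    s = np.exp(log_s)
    eps = rng.normal(size=n_samples)
    g = dlogp(m + s * eps)
    # Gradients of E_q[log q - log p]; the Gaussian entropy term is closed form.
    grad_m = -g.mean()
    grad_log_s = -1.0 - s * (eps * g).mean()
    m -= lr * grad_m
    log_s -= lr * grad_log_s
```

After the loop, `(m, exp(log_s))` is the fitted Gaussian posterior approximation; because the likelihood here is much tighter than the prior, the fitted standard deviation ends up well below the prior's. The paper goes further by also handling the forward KL and the general $\alpha$-divergence without approximating the objective itself.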
Date of publication 2017
Code Programming Language Matlab
