AAAI 2017: The kernel Kalman rule: efficient nonparametric inference with recursive least squares

Nonparametric inference techniques provide promising tools for probabilistic reasoning in high-dimensional nonlinear systems. Most of these techniques embed distributions into reproducing kernel Hilbert spaces (RKHS) and rely on the kernel Bayes’ rule (KBR) to manipulate the embeddings. However, the computational demands of the KBR scale poorly with the number of samples and the KBR often suffers from numerical instabilities. In this paper, we present the kernel Kalman rule (KKR) as an alternative to the KBR. The derivation of the KKR is based on recursive least squares, inspired by the derivation of the Kalman innovation update. We apply the KKR to filtering tasks where we use RKHS embeddings to represent the belief state, resulting in the kernel Kalman filter (KKF). We show on a nonlinear state estimation task with high dimensional observations that our approach provides a significantly improved estimation accuracy while the computational demands are significantly decreased.

  • G. H. W. Gebhardt, A. Kupcsik, and G. Neumann, “The kernel Kalman rule: efficient nonparametric inference with recursive least squares,” in Thirty-First AAAI Conference on Artificial Intelligence, 2017.

    @inproceedings{lirolem26739,
    author = {G. H. W. Gebhardt and A. Kupcsik and G. Neumann},
    publisher = {AAAI},
    month = {February},
    title = {The kernel Kalman rule: efficient nonparametric inference with recursive least squares},
    booktitle = {Thirty-First AAAI Conference on Artificial Intelligence},
    year = {2017},
    url = {http://eprints.lincoln.ac.uk/26739/},
    abstract = {Nonparametric inference techniques provide promising tools for probabilistic reasoning in high-dimensional nonlinear systems. Most of these techniques embed distributions into reproducing kernel Hilbert spaces (RKHS) and rely on the kernel Bayes' rule (KBR) to manipulate the embeddings. However, the computational demands of the KBR scale poorly with the number of samples and the KBR often suffers from numerical instabilities. In this paper, we present the kernel Kalman rule (KKR) as an alternative to the KBR. The derivation of the KKR is based on recursive least squares, inspired by the derivation of the Kalman innovation update. We apply the KKR to filtering tasks where we use RKHS embeddings to represent the belief state, resulting in the kernel Kalman filter (KKF). We show on a nonlinear state estimation task with high dimensional observations that our approach provides a significantly improved estimation accuracy while the computational demands are significantly decreased.},
    }