Motivation & Goals

I wanted to learn about Kalman filtering and, in the process of doing so, I have tried out several resources.

1st attempt

The highly recommended article

An Introduction to the Kalman Filter; authors: Greg Welch, Gary Bishop

did not work for me due to missing background information.


2nd attempt

I then found the highly readable article

An Elementary Introduction to Kalman Filtering; authors: Yan Pei, Donald S. Fussell, Swarnendu Biswas, Keshav Pingali

I tried to understand most parts of the article and reproduced some of its content in a number of IPython notebooks:

kalman_introduction_part1.ipynb

kalman_introduction_part2.ipynb

kalman_introduction_part3.ipynb

The article covers topics like:

  • weighted addition of random variables; mean and variance of these sums
  • random vectors, covariance matrix, weighted addition of random vectors with weighting factors as matrices, covariance matrix of weighted addition
  • state update or state evolution equation and covariance of the state update vector as a function of the covariance matrix of the estimated state vector.
  • review of matrix derivatives
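The first bullet can be illustrated with a short simulation (a sketch, assuming NumPy; the distributions and weights below are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
a, b = 0.3, 0.7
sigma_x, sigma_y = 2.0, 1.0

# independent samples of the random variables X and Y
x = rng.normal(0.0, sigma_x, size=1_000_000)
y = rng.normal(5.0, sigma_y, size=1_000_000)

z = a * x + b * y  # weighted addition of the two random variables

# theory: Var(aX + bY) = a^2 Var(X) + b^2 Var(Y) for independent X, Y
var_theory = a**2 * sigma_x**2 + b**2 * sigma_y**2
print(z.var(), var_theory)  # sample and theoretical variance agree closely
```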

But I got stuck in the final part of that article…


3rd attempt

Here I tried to understand the article:

A Kalman Filtering Tutorial for Undergraduate Students ; authors: Matthew B. Rhudy, Roger A. Salguero, Keaton Holappa; International Journal of Computer Science & Engineering Survey Vol.8, No.1 February 2017

and reproduced some content from this article in an IPython notebook:

kalman_simulation.ipynb

The notebook only covers part of the material presented in the article; the derivation of the Kalman gain is missing.

The notebook reproduces a simulation from the article.


4th attempt

I found the on-line book

Kalman and Bayesian Filters in Python; author: Roger R. Labbe Jr.

and worked through the first chapters to learn about Kalman filtering.

I read about the g-h or alpha-beta filter and prepared an IPython notebook:

tracking_filter.ipynb

Although the book is very readable, it is also very light on math.
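The g-h (alpha-beta) update mentioned above can be sketched in a few lines. This is a minimal sketch only; the parameter values and the noiseless ramp are made up for illustration:

```python
def gh_filter(measurements, x0, v0, dt, g, h):
    """One-state g-h (alpha-beta) tracker: g scales the position correction,
    h scales the velocity correction."""
    x, v = x0, v0
    estimates = []
    for z in measurements:
        # predict: constant-velocity extrapolation
        x_pred = x + v * dt
        residual = z - x_pred
        # update: blend prediction with measurement
        x = x_pred + g * residual
        v = v + h * residual / dt
        estimates.append(x)
    return estimates

# noiseless ramp: the filter should lock onto the true slope
zs = [1.0, 2.0, 3.0, 4.0, 5.0]
est = gh_filter(zs, x0=0.9, v0=1.0, dt=1.0, g=0.5, h=0.3)
```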

So I looked for another resource which covers the derivation of the Kalman filter while not leaving out the underlying mathematical aspects (linear algebra, statistics, estimation theory, linearisation).


5th attempt

I recently found a very readable account of the Kalman filter:

https://www.kalmanfilter.net

and bought the book

Kalman Filter from the Ground Up; author: Alex Becker. The book covers topics that go beyond what is available on the website.

It seems that the book does not omit the underlying mathematical concepts as many other resources do. I started from chapter 3 and documented my learning progress with a series of Jupyter notebooks. For each chapter there are one or more notebooks (also available on GitHub).

Note

For the most important notebooks the GitHub repository provides PDF files for better readability. However, I found that the conversion utilities of the Jupyter notebook environment are unable to convert some LaTeX formulas and Markdown tables into a visually appealing PDF document. The issue occurs not only in my programming environment on Windows but also on my Linux PC.

chapter 3

kalmanfilter_chapter_3.ipynb

https://github.com/michaelbiester/KalmanFilter/blob/master/kalmanfilter_chapter_3.pdf

The notebook covers these topics:

  • recursive averaging of measured values
  • alpha-beta or g-h filter
  • numerical examples
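The recursive averaging idea can be sketched as follows: the running mean is updated as x̂ₙ = x̂ₙ₋₁ + (zₙ − x̂ₙ₋₁)/n, so no past measurements need to be stored. The measurement values below are made up for illustration:

```python
import numpy as np

def recursive_average(measurements):
    """Running mean computed recursively: x_n = x_{n-1} + (z_n - x_{n-1}) / n."""
    estimate = 0.0
    for n, z in enumerate(measurements, start=1):
        estimate = estimate + (z - estimate) / n
    return estimate

z = [49.0, 51.0, 50.5, 49.5]
print(recursive_average(z))   # identical to the batch mean of all measurements
print(float(np.mean(z)))
```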

chapter 4

kalmanfilter_chapter_4.ipynb

https://github.com/michaelbiester/KalmanFilter/blob/master/kalmanfilter_chapter_4.pdf

The notebook covers these topics:

  • one-dimensional Kalman filter
  • recursive state estimation (one dimensional)
  • minimising the variance
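A minimal sketch of the one-dimensional Kalman filter for a static state (the variable names and numerical values are my own choices, not taken from the book):

```python
def kalman_1d(zs, x0, p0, q, r):
    """Scalar Kalman filter: state estimate x with variance p,
    process noise variance q, measurement noise variance r."""
    x, p = x0, p0
    history = []
    for z in zs:
        # predict (static model): state unchanged, uncertainty grows by q
        p = p + q
        # update: the Kalman gain weighs prediction against measurement
        K = p / (p + r)
        x = x + K * (z - x)
        p = (1.0 - K) * p
        history.append((x, p))
    return history

# measurements of a roughly constant quantity, poor initial guess x0 = 60
hist = kalman_1d([50.4, 50.7, 49.9, 50.1], x0=60.0, p0=100.0, q=0.01, r=0.25)
```

The large initial variance p0 makes the first gain close to 1, so the filter quickly discards the poor initial guess; afterwards the variance shrinks with every update.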

chapter 5

kalmanfilter_chapter_5.ipynb

https://github.com/michaelbiester/KalmanFilter/blob/master/kalmanfilter_chapter_5.pdf

The notebook covers these topics:

  • how process noise affects the estimation process
  • some numerical experiments

chapter 6

kalmanfilter_chapter_6_2.ipynb

https://github.com/michaelbiester/KalmanFilter/blob/master/kalmanfilter_chapter_6_2.pdf

The notebook covers these topics:

  • covariance
  • numerical example of uncorrelated and correlated data
  • multivariate normal distribution
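The relation between the covariance matrix and correlated data can be checked numerically (a sketch, assuming NumPy; the mean and covariance values are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)

# draw correlated 2-D samples from a multivariate normal distribution
mean = np.array([1.0, -2.0])
cov = np.array([[2.0, 1.2],
                [1.2, 1.0]])      # the off-diagonal term encodes the correlation
samples = rng.multivariate_normal(mean, cov, size=200_000)

# the sample covariance matrix reproduces `cov` up to sampling error
cov_est = np.cov(samples, rowvar=False)
print(cov_est)
```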

A separate notebook multivariate_gaussian_1.ipynb deals with some details of the properties of the multivariate normal distribution.

chapter 7

The content of chapter 7 of the book Kalman Filter from the Ground Up is explored in a series of notebooks (instead of putting everything into a single notebook):

kalmanfilter_chapter_7_covariance.ipynb

https://github.com/michaelbiester/KalmanFilter/blob/master/kalmanfilter_chapter_7_covariance.pdf

with a detailed discussion of the covariance matrix in 2D and a review of properties of the multivariate normal distribution.

kalmanfilter_chapter_7_1.ipynb

kalmanfilter_chapter_7_2.ipynb

https://github.com/michaelbiester/KalmanFilter/blob/master/kalmanfilter_chapter_7_1.pdf

https://github.com/michaelbiester/KalmanFilter/blob/master/kalmanfilter_chapter_7_2.pdf

The Kalman filter is extended to multiple dimensions and some mathematics of expectations / variances are reviewed.

chapter 8

kalmanfilter_chapter_8.ipynb deals with the equations for the Kalman filtering process:

  • state extrapolation equation
  • covariance extrapolation equation
  • process noise
  • measurement equation
  • state update equation with Kalman gain as a weighting factor

https://github.com/michaelbiester/KalmanFilter/blob/master/kalmanfilter_chapter_8.pdf
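The equations listed above fit into one predict/update cycle. The following is a minimal sketch under my own naming conventions (the constant-velocity model and all numbers are made-up illustrations, not examples from the book):

```python
import numpy as np

def kf_step(x, P, z, F, Q, H, R):
    """One predict/update cycle of a linear Kalman filter.
    x: state estimate, P: its covariance, F: state transition matrix,
    Q: process noise covariance, H: measurement matrix, R: measurement noise covariance."""
    # state and covariance extrapolation
    x = F @ x
    P = F @ P @ F.T + Q
    # Kalman gain
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    # state update: the gain weighs the measurement residual
    x = x + K @ (z - H @ x)
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P

# constant-velocity model in 1-D: state = [position, velocity]
dt = 1.0
F = np.array([[1.0, dt], [0.0, 1.0]])
H = np.array([[1.0, 0.0]])            # only the position is measured
Q = 0.01 * np.eye(2)
R = np.array([[0.5]])

x = np.array([0.0, 0.0])
P = 10.0 * np.eye(2)
for z in [1.0, 2.1, 2.9, 4.2, 5.0]:   # noisy ramp, true velocity ~ 1
    x, P = kf_step(x, P, np.array([z]), F, Q, H, R)
```

Although only the position is measured, the filter infers the velocity through the coupling in F.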

chapter 9

kalmanfilter_chapter_9_example1.ipynb has a numerical example of a vehicle moving in 2 dimensions with external forces:

  • motion equations -> state extrapolation equation
  • covariance extrapolation equation
  • process noise
  • measurement equation
  • state update equation

https://github.com/michaelbiester/KalmanFilter/blob/master/kalmanfilter_chapter_9_example1.pdf

chapter 10 & 11

kalmanfilter_chapter_11.ipynb provides some mathematical background:

  • square root of a matrix
  • Cholesky decomposition of a symmetric positive definite matrix
  • numerical example

https://github.com/michaelbiester/KalmanFilter/blob/master/kalmanfilter_chapter_11.pdf
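The Cholesky decomposition can be tried out directly with NumPy (a sketch; the matrix below is an arbitrary symmetric positive definite example):

```python
import numpy as np

# a symmetric positive definite matrix (e.g. a covariance matrix)
P = np.array([[4.0, 2.0],
              [2.0, 3.0]])

# Cholesky decomposition: P = L @ L.T with L lower triangular,
# so L acts as a "matrix square root" of P
L = np.linalg.cholesky(P)
print(L)
print(L @ L.T)   # reconstructs P
```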

chapter 12

kalmanfilter_chapter_12.ipynb deals with linear vs. non-linear systems and non-linear measurements.

https://github.com/michaelbiester/KalmanFilter/blob/master/kalmanfilter_chapter_12.pdf

chapter 13

In preparation for chapter 13 (extended Kalman filter) I had to review some math about the linearisation of one-dimensional and multi-dimensional functions. While the book provides a good overview, I found it insufficient for gaining more thorough knowledge.

kalmanfilter_linearisation_methods.ipynb covers the mathematics of linearisation techniques. It makes heavy use of these resources:

https://ocw.mit.edu/courses/18-02-multivariable-calculus-fall-2007/pages/readings/supp_notes

https://www.research-collection.ethz.ch/bitstream/handle/20.500.11850/82620/eth-8432-01.pdf

These two articles helped me a lot to understand:

  • linear approximation in one dimension
  • linear approximation in two dimensions
  • linear approximation in > 2 dimensions; numerical example of 3D linear approximation
  • propagation of uncertainty (how mean and covariance of the input affect the mean and covariance of the output)
    • one input -> one output
    • many inputs -> one output
    • many inputs -> many outputs

https://github.com/michaelbiester/KalmanFilter/blob/master/kalmanfilter_linearisation_methods.pdf
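For the "many inputs -> many outputs" case, first-order propagation of uncertainty amounts to cov_y ≈ J cov_x Jᵀ with J the Jacobian of the map. A sketch comparing this against Monte Carlo (the polar-like map, means and covariances are my own illustration, not taken from the resources above):

```python
import numpy as np

rng = np.random.default_rng(2)

def g(x):
    """Nonlinear map R^2 -> R^2 (polar-like transformation)."""
    return np.array([x[0] * np.cos(x[1]), x[0] * np.sin(x[1])])

def jacobian_g(x):
    """Analytic Jacobian of g at x."""
    r, phi = x
    return np.array([[np.cos(phi), -r * np.sin(phi)],
                     [np.sin(phi),  r * np.cos(phi)]])

mean_x = np.array([10.0, 0.5])
cov_x = np.diag([0.01, 0.0004])       # small input uncertainties

# first-order (linearised) propagation: cov_y ~= J cov_x J^T
J = jacobian_g(mean_x)
cov_y_lin = J @ cov_x @ J.T

# Monte Carlo reference
xs = rng.multivariate_normal(mean_x, cov_x, size=200_000)
ys = np.column_stack([xs[:, 0] * np.cos(xs[:, 1]),
                      xs[:, 0] * np.sin(xs[:, 1])])
cov_y_mc = np.cov(ys, rowvar=False)
```

For small input uncertainties the two covariance estimates agree closely; the linearised result degrades as the input spread grows.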

Currently there are three notebooks on the extended Kalman filter:

kalmanfilter_chapter_13_ekf_1.ipynb

derives and summarises the equations required for the extended Kalman filter.

kalmanfilter_chapter_13_ekf_2.ipynb

reproduces an example from the book and includes a simulation.

kalmanfilter_chapter_13_ekf_3.ipynb

reproduces another example from chapter 13 of the book.

https://github.com/michaelbiester/KalmanFilter/blob/master/kalmanfilter_chapter_13_ekf_1.pdf

https://github.com/michaelbiester/KalmanFilter/blob/master/kalmanfilter_chapter_13_ekf_2.pdf

https://github.com/michaelbiester/KalmanFilter/blob/master/kalmanfilter_chapter_13_ekf_3.pdf
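One EKF cycle replaces the linear model by nonlinear functions and uses their Jacobians for the covariance and gain computations. The following is a minimal sketch with a toy model of my own (constant-velocity motion with a range-only measurement), not an example from the book:

```python
import numpy as np

def ekf_step(x, P, z, f, F_jac, h, H_jac, Q, R):
    """One cycle of an extended Kalman filter: the nonlinear functions f and h
    are used for the predictions, their Jacobians F_jac and H_jac for the
    covariance extrapolation and the gain."""
    # predict through the nonlinear state transition
    F = F_jac(x)
    x = f(x)
    P = F @ P @ F.T + Q
    # update with a nonlinear measurement, linearised at the predicted state
    H = H_jac(x)
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - h(x))
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P

# toy model: state [position, velocity], range-only measurement z = sqrt(p^2 + 1)
dt = 1.0
f = lambda x: np.array([x[0] + dt * x[1], x[1]])
F_jac = lambda x: np.array([[1.0, dt], [0.0, 1.0]])
h = lambda x: np.array([np.hypot(x[0], 1.0)])
H_jac = lambda x: np.array([[x[0] / np.hypot(x[0], 1.0), 0.0]])

x, P = np.array([0.5, 0.8]), np.eye(2)
Q, R = 0.01 * np.eye(2), np.array([[0.1]])
for z in [np.hypot(p, 1.0) for p in [1.0, 2.0, 3.0, 4.0]]:  # noise-free ranges
    x, P = ekf_step(x, P, np.array([z]), f, F_jac, h, H_jac, Q, R)
```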

chapter 14

In the book, chapter 14 covers the unscented Kalman filter (UKF).

The book presents the unscented Kalman filter in a cookbook style. Topics like sigma points are just stated without much background information.

As a preparatory step I had to review the concept of statistical linear regression, which is explained in the annex of the book.

statistical_linear_regression.ipynb

is a notebook on statistical linear regression. It basically follows the steps described in annex F of the book.

https://github.com/michaelbiester/KalmanFilter/blob/master/statistical_linear_regression.pdf
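The core idea of statistical linear regression in the scalar case: fit y ≈ a·x + b so that the expected squared error over the distribution of x (not over fixed data points) is minimised; the optimum is a = Cov(x, y)/Var(x), b = E[y] − a·E[x]. A sketch with my own example function and distribution (not the one from annex F):

```python
import numpy as np

rng = np.random.default_rng(3)

g = np.sin                                # nonlinear function to be linearised
xs = rng.normal(0.5, 0.3, size=100_000)   # samples of the input distribution
ys = g(xs)

x_mean, y_mean = xs.mean(), ys.mean()
# optimal slope: a = Cov(x, y) / Var(x); intercept: b = E[y] - a E[x]
a = np.cov(xs, ys)[0, 1] / xs.var(ddof=1)
b = y_mean - a * x_mean
```

By construction the regression line passes through the point (E[x], E[y]), which is what makes this linearisation attractive for filtering.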

I would have liked a more detailed discussion of the concept of sigma points than what is provided by the book.


Preliminary Summary & Conclusion

I found the article

A General Method for Approximating Nonlinear Transformations of Probability Distributions; authors: Simon Julier, Jeffrey Uhlmann

helpful. The notebook

unscented_transform_and_sigma_points.ipynb

https://github.com/michaelbiester/KalmanFilter/blob/master/unscented_transform_and_sigma_points.pdf

reviews parts of the article and provides a numerical example which computes the covariance matrix after a nonlinear transformation using different approaches:

  • evaluate covariance by simulation
  • evaluate covariance using sigma points
  • evaluate covariance using linearisation techniques (similar to the techniques used in the extended Kalman filter (EKF))
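The sigma-point approach can be sketched for a scalar nonlinearity (assuming the standard Julier/Uhlmann sigma points with tuning parameter kappa; the quadratic test function and the numbers are my own illustration):

```python
import numpy as np

def unscented_transform_1d(mean, var, g, kappa=2.0):
    """Propagate a scalar Gaussian through g using 3 sigma points
    (Julier/Uhlmann weighting with tuning parameter kappa)."""
    n = 1
    spread = np.sqrt((n + kappa) * var)
    sigma_points = np.array([mean, mean + spread, mean - spread])
    weights = np.array([kappa / (n + kappa),
                        0.5 / (n + kappa),
                        0.5 / (n + kappa)])
    y = g(sigma_points)
    y_mean = weights @ y
    y_var = weights @ (y - y_mean) ** 2
    return y_mean, y_var

# compare against Monte Carlo for g(x) = x^2 with x ~ N(1, 0.2)
rng = np.random.default_rng(4)
m, v = 1.0, 0.2
ut_mean, ut_var = unscented_transform_1d(m, v, lambda x: x ** 2)
xs = rng.normal(m, np.sqrt(v), size=1_000_000)
mc_mean, mc_var = (xs ** 2).mean(), (xs ** 2).var()
# for this quadratic the sigma points reproduce the exact moments
# E[x^2] = m^2 + v and Var(x^2) = 4 m^2 v + 2 v^2
```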

What is still missing is a notebook which

  • summarises the main facts of the unscented Kalman filter
  • provides a numerical example