In machine learning, Gaussian Processes (GPs) are commonly used modelling tools for Bayesian non-parametric inference. With a suitable choice of kernel, GPs are universal function approximators. A drawback of GPs is that exact inference scales cubically, O(n^3), in the number of observations n. For example, fitting a dataset of 20,000 observations (hardly large by today's standards) can take hours. This makes exact GPs unsuitable for Big Data. Various general sparse approximation schemes have been proposed to address this problem. We have developed several approaches that approximate GPs with O(n) complexity.
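As a rough illustration of the inducing-point idea behind such sparse approximations (this is a generic subset-of-regressors sketch, not the specific methods in the publications below; the kernel, data, and variable names are illustrative), the predictive mean can be computed by solving an m x m system with m << n inducing points, so the dominant cost is O(n m^2) rather than O(n^3):

```python
import numpy as np

def rbf(a, b, ls=1.0):
    # Squared-exponential kernel matrix between 1-D point sets a and b
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / ls) ** 2)

# Toy 1-D regression data (n observations)
rng = np.random.default_rng(0)
n, m, noise = 2000, 20, 0.1            # m inducing points, m << n
X = np.sort(rng.uniform(-3, 3, n))
y = np.sin(X) + noise * rng.normal(size=n)
Z = np.linspace(-3, 3, m)              # inducing-point locations (assumed fixed here)

# Subset-of-regressors predictive mean: only an m x m solve is needed,
# so the cost is dominated by forming Kxz.T @ Kxz, i.e. O(n m^2)
Kzz = rbf(Z, Z) + 1e-6 * np.eye(m)     # jitter for numerical stability
Kxz = rbf(X, Z)                        # n x m cross-covariance
A = noise**2 * Kzz + Kxz.T @ Kxz       # m x m system matrix
w = np.linalg.solve(A, Kxz.T @ y)
Xs = np.linspace(-3, 3, 5)             # test inputs
mean = rbf(Xs, Z) @ w                  # approximate predictive mean
```

With m fixed, the cost grows linearly in n, which is the trade-off the sparse schemes above exploit.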

- Correlated Product of Experts for Sparse Gaussian Process Regression. In: Machine Learning, 2022.
- Sparse Information Filter for Fast Gaussian Process Regression. In: European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases (ECML PKDD), 2021.
- Recursive Estimation for Sparse Gaussian Process Regression. In: Automatica, vol. 120, pp. 109-127, 2020, ISSN: 0005-1098.
- Orthogonally Decoupled Variational Fourier Features. In: arXiv preprint arXiv:2007.06363, 2020.