
C. E. Rasmussen, C. K. I. Williams. Gaussian Processes for Machine Learning. MIT Press, 2006.

Rasmussen and Williams provide a comprehensive (and freely available) textbook on using Gaussian processes for regression and classification problems. Based on the details provided, Gaussian process regression is easy to implement, and classification can also be handled following Section 3.3 using Monte Carlo integration.
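To illustrate how little code basic Gaussian process regression requires, here is a minimal sketch of the standard Cholesky-based prediction recipe (Algorithm 2.1 in the book) with a squared-exponential kernel; the function names and default hyperparameters are my own choices for illustration:

```python
import numpy as np

def rbf_kernel(A, B, length_scale=1.0):
    """Squared-exponential kernel matrix between the rows of A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / length_scale ** 2)

def gp_regression(X, y, X_star, noise=1e-4, length_scale=0.2):
    """GP regression via a Cholesky factorization of the kernel matrix.

    Returns the predictive mean and variance at the test inputs X_star.
    """
    K = rbf_kernel(X, X, length_scale) + noise * np.eye(len(X))
    L = np.linalg.cholesky(K)                        # K = L L^T, the O(N^3) step
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    K_star = rbf_kernel(X, X_star, length_scale)     # N x M cross-covariances
    mean = K_star.T @ alpha                          # predictive mean
    v = np.linalg.solve(L, K_star)
    var = rbf_kernel(X_star, X_star, length_scale).diagonal() - (v ** 2).sum(0)
    return mean, var
```

With a small noise level, the predictive mean interpolates the training targets closely, while the predictive variance shrinks toward the noise level at the training inputs and grows away from them.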

As Gaussian process regression is computationally expensive — computing the predictive distribution requires factorizing the kernel matrix, which scales as $\mathcal{O}(N^3)$ where $N$ is the number of training samples — literature on speeding up kernel methods is of interest. For example, Williams and Seeger use the Nyström approximation [1], while Rahimi and Recht consider random Fourier features (MATLAB code available) [2]. A quick overview can be found in the second part of Byron Boots' slides for his class "Statistical Techniques in Robotics" in spring 2015.

  • [1] Christopher K. I. Williams, Matthias Seeger. Using the Nyström Method to Speed Up Kernel Machines. Advances in Neural Information Processing Systems, 2001.
  • [2] Ali Rahimi, Benjamin Recht. Random Features for Large-Scale Kernel Machines. Advances in Neural Information Processing Systems, 2007.
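As a concrete example of the second approach, the random Fourier feature construction of Rahimi and Recht maps inputs to a low-dimensional feature space whose inner products approximate the RBF kernel, so that kernel machines reduce to linear ones. A minimal sketch (function name and defaults are mine):

```python
import numpy as np

def random_fourier_features(X, D=500, length_scale=1.0, rng=None):
    """Map X (n x d) to D random cosine features z(x) such that
    z(x) . z(y) approximates the RBF kernel
    k(x, y) = exp(-||x - y||^2 / (2 * length_scale^2))."""
    rng = np.random.default_rng(rng)
    n, d = X.shape
    # Sample frequencies from the kernel's spectral density (a Gaussian)
    W = rng.normal(scale=1.0 / length_scale, size=(d, D))
    # Random phase shifts, uniform on [0, 2*pi)
    b = rng.uniform(0.0, 2.0 * np.pi, size=D)
    return np.sqrt(2.0 / D) * np.cos(X @ W + b)
```

The approximation error decays as $\mathcal{O}(1/\sqrt{D})$, so a few thousand features already give kernel estimates accurate to a couple of decimal places, while training and prediction become linear in $N$.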

What is your opinion on the summarized work? Or do you know related work that is of interest? Let me know your thoughts in the comments below!