16th June 2016

M. Kiefel, V. Jampani, P. V. Gehler. *Permutohedral Lattice CNNs*. Computing Research Repository, abs/1412.6618, 2014.

What is **your opinion** on the summarized work? Or do you know related work that is of interest? **Let me know** your thoughts in the comments below:

Kiefel et al. introduce convolutional neural networks based on the permutohedral lattice. The main idea is to replace the regular two- or three-dimensional grid underlying standard convolutions with a permutohedral lattice. Concretely, each convolutional layer is replaced by the following operations:

- **Splat**: the (possibly sparse) input samples are projected onto the points of the permutohedral lattice;
- **Convolution**: the convolution is carried out over the lattice points;
- **Slice**: the filtered values are interpolated back to the original input positions.

An illustration can be found in Figure 1. According to Kiefel et al., the main advantage of this construction is the ability to train convolutional neural networks on sparse input data (in the paper, this is demonstrated on the MNIST dataset [2]). In another application scenario, the permutohedral lattice makes it possible to incorporate more complex invariances such as rotation (in contrast to convolutions on a regular grid, which are usually only invariant to translation).
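To make the splat-convolve-slice pipeline concrete, the following sketch implements it in simplified form. It substitutes a regular 1D grid for the permutohedral lattice (the actual lattice uses barycentric weights over lattice simplices) but follows the same three steps; all function names are illustrative and not from the paper's implementation.

```python
import numpy as np

def splat(positions, values, grid_size):
    # Distribute sparse (position, value) samples onto a regular 1D grid
    # via linear interpolation weights; the permutohedral lattice does the
    # same with barycentric weights over simplex vertices.
    grid = np.zeros(grid_size)
    for p, v in zip(positions, values):
        i = int(np.floor(p))
        w = p - i
        grid[i] += (1.0 - w) * v
        if i + 1 < grid_size:
            grid[i + 1] += w * v
    return grid

def convolve(grid, kernel):
    # Convolution over the lattice points (here: a plain 1D convolution).
    return np.convolve(grid, kernel, mode="same")

def slice_(grid, positions):
    # Interpolate the filtered lattice values back to the original,
    # possibly non-integer, sample positions.
    out = []
    for p in positions:
        i = int(np.floor(p))
        w = p - i
        hi = grid[i + 1] if i + 1 < len(grid) else 0.0
        out.append((1.0 - w) * grid[i] + w * hi)
    return np.array(out)

# Sparse samples at arbitrary (non-integer) positions:
positions = [1.5, 3.2, 7.8]
values = [1.0, 2.0, 0.5]
kernel = np.array([0.25, 0.5, 0.25])
blurred = slice_(convolve(splat(positions, values, 10), kernel), positions)
```

Note that, unlike a regular-grid convolution, no dense image is required: the filtered output is defined only at the sample positions, which is what enables training on sparse inputs.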

Figure 1 (click to enlarge): Illustration of the splat, convolution and slice operations; taken from the paper.

Unfortunately, Kiefel et al. do not give details on training permutohedral lattice convolutional neural networks, as they omit the corresponding backpropagation rules. Furthermore, the experiments are not discussed in detail.

[1] A. Adams, J. Baek, M. A. Davis. Fast High-Dimensional Filtering Using the Permutohedral Lattice. Computer Graphics Forum, 29(2), 2010.

[2] Y. LeCun, L. Bottou, Y. Bengio, P. Haffner. Gradient-based learning applied to document recognition. Proceedings of the IEEE, 86(11):2278-2324, November 1998.