
M. Kiefel, V. Jampani, P. V. Gehler. Permutohedral Lattice CNNs. Computing Research Repository, abs/1412.6618, 2014.

Kiefel et al. introduce convolutional neural networks based on the permutohedral lattice. The main idea is to replace the regular two- or three-dimensional grid used for convolutions by a permutohedral lattice. Accordingly, each convolutional layer is replaced by the following three operations:

  • splat: the input is mapped onto the lattice, see [1] for details;
  • convolution: the convolution is performed within the lattice;
  • slice: the filtered values are mapped back from the lattice to the input positions (a toy sketch of all three steps follows after this list).
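
To make the pipeline concrete, the following is a minimal NumPy sketch of the general splat-convolve-slice pattern on a toy one-dimensional lattice. It is not the permutohedral construction itself, which embeds d-dimensional features into a (d+1)-dimensional lattice (see [1]); all function names and parameters here are hypothetical and only illustrate the data flow:

```python
import numpy as np

# Hypothetical one-dimensional toy version of the splat/convolve/slice
# pattern. The real permutohedral lattice (see [1]) embeds d-dimensional
# features into a (d+1)-dimensional lattice; here, scalar positions are
# simply quantized onto a regular 1-D lattice for illustration only.

def splat(positions, values, num_vertices):
    """Distribute each value onto its two enclosing lattice vertices
    with linear (barycentric-style) weights."""
    lattice = np.zeros(num_vertices)
    lower = np.floor(positions).astype(int)
    frac = positions - lower
    np.add.at(lattice, lower, (1.0 - frac) * values)
    np.add.at(lattice, lower + 1, frac * values)
    return lattice

def convolve(lattice, kernel):
    """Filter the accumulated lattice values; in a learned setting,
    the kernel weights would be trained by backpropagation."""
    return np.convolve(lattice, kernel, mode="same")

def slice_(lattice, positions):
    """Interpolate the filtered lattice values back to the inputs."""
    lower = np.floor(positions).astype(int)
    frac = positions - lower
    return (1.0 - frac) * lattice[lower] + frac * lattice[lower + 1]

# Sparse, irregularly placed inputs -- the setting the lattice handles well.
positions = np.array([0.3, 2.7, 5.1, 5.9])
values = np.array([1.0, -0.5, 2.0, 0.25])

lattice = splat(positions, values, num_vertices=8)
filtered = convolve(lattice, kernel=np.array([0.25, 0.5, 0.25]))
outputs = slice_(filtered, positions)
print(outputs)
```

Note that splat and slice only touch lattice vertices near actual inputs, so the cost of the convolution scales with the number of occupied vertices rather than with the size of a dense grid; this is the property that makes the construction attractive for sparse input data.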

An illustration can be found in Figure 1. According to Kiefel et al., the main advantage of this construction is the ability to train convolutional neural networks on sparse input data (in their paper, this is demonstrated on the MNIST dataset [2]). In another application scenario, the permutohedral lattice makes it possible to incorporate more complex invariances such as rotation invariance (in contrast to convolutions on a regular grid, which are usually only invariant to translation).

Figure 1: Illustration of the splat, convolution and slice operations; taken from the paper.

Unfortunately, Kiefel et al. give few details on how permutohedral lattice convolutional neural networks are trained; in particular, the corresponding backpropagation rules are omitted. Furthermore, the experiments are not discussed in detail.

  • [1] A. Adams, J. Baek, M. A. Davis. Fast High-Dimensional Filtering Using the Permutohedral Lattice. Computer Graphics Forum, 29(2), 2010.
  • [2] Y. LeCun, L. Bottou, Y. Bengio, P. Haffner. Gradient-Based Learning Applied to Document Recognition. Proceedings of the IEEE, 86(11):2278-2324, November 1998.

What is your opinion on this article? Let me know your thoughts on Twitter @davidstutz92 or LinkedIn in/davidstutz92.