
Juncheng Li, Frank R. Schmidt, J. Zico Kolter. Adversarial camera stickers: A physical camera-based attack on deep learning systems. ICML 2019: 3896-3904.

Li et al. propose adversarial camera stickers: translucent dots that, when computed adversarially and physically attached to the camera lens, lead to misclassification of the photographed scene. As illustrated in Figure 1, these stickers are realized as circular patches of uniform color. The individual dots are computed via gradient descent by optimizing their location, color, and radius. The optical influence of the camera on the stickers, i.e., how the dots appear blurred and translucent in the captured image, is modeled realistically so that the attack remains effective in the physical setup.

Figure 1: Illustration of adversarial stickers on the camera (left) and the effect on the taken photo (right).
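To make the optimization concrete, below is a minimal sketch (not the authors' implementation) of how such a sticker could be computed by gradient descent: a soft circular mask is alpha-blended onto the image as a crude stand-in for the paper's more careful camera/translucency model, and the dot's position, radius, and color are updated to push a classifier toward a target class. The ResNet-18 classifier, the sigmoid mask sharpness, the fixed opacity, and the target class are all illustrative assumptions.

```python
# Sketch only: optimize one translucent circular "sticker" composited onto an
# image so that a classifier's prediction moves toward a chosen target class.
import torch
import torch.nn.functional as F
import torchvision.models as models

def soft_circle_mask(h, w, cx, cy, radius, sharpness=50.0):
    """Differentiable soft mask of a circle centered at (cx, cy) in [0, 1] coordinates."""
    ys = torch.linspace(0, 1, h).view(h, 1).expand(h, w)
    xs = torch.linspace(0, 1, w).view(1, w).expand(h, w)
    dist = torch.sqrt((xs - cx) ** 2 + (ys - cy) ** 2)
    # Sigmoid edge instead of a hard threshold keeps the mask differentiable.
    return torch.sigmoid(sharpness * (radius - dist))

def composite(image, color, mask, opacity=0.4):
    """Alpha-blend a uniform-color dot onto the image (crude stand-in for the camera model)."""
    mask = (opacity * mask).unsqueeze(0)              # 1 x H x W
    return (1 - mask) * image + mask * color.view(3, 1, 1)

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT).eval()
image = torch.rand(3, 224, 224)                       # stand-in for a real photo
target = torch.tensor([207])                          # hypothetical target class

# Sticker parameters: position, radius, and color, all updated by gradient descent.
params = {
    "cx": torch.tensor(0.5, requires_grad=True),
    "cy": torch.tensor(0.5, requires_grad=True),
    "radius": torch.tensor(0.1, requires_grad=True),
    "color": torch.tensor([0.8, 0.2, 0.2], requires_grad=True),
}
opt = torch.optim.Adam(params.values(), lr=0.01)

for step in range(100):
    mask = soft_circle_mask(224, 224, params["cx"], params["cy"], params["radius"])
    perturbed = composite(image, params["color"].clamp(0, 1), mask)
    logits = model(perturbed.unsqueeze(0))
    loss = F.cross_entropy(logits, target)            # push the prediction toward the target
    opt.zero_grad()
    loss.backward()
    opt.step()
```

In the paper, several such dots are optimized jointly and their appearance is constrained by a realistic model of how a sticker on the lens shows up in the image; the single-dot alpha-compositing above is only meant to illustrate the gradient-descent idea.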

Also find this summary on ShortScience.org.
What is your opinion on this article? Let me know your thoughts on Twitter @davidstutz92 or LinkedIn in/davidstutz92.