# DAVIDSTUTZ

## Random and Adversarial Bit Error Training

### Abstract

The design of deep neural network (DNN) accelerators, i.e., specialized hardware for inference, has received considerable attention in recent years due to savings in cost, chip area, and energy compared to mainstream hardware. We consider the problem of random and adversarial bit errors in quantized DNN weights stored on accelerator memory. Random bit errors arise when optimizing accelerators for energy efficiency by operating at low voltage. Here, the bit error rate increases exponentially with voltage reduction, causing devastating accuracy drops in DNNs. Additionally, recent work demonstrates attacks on voltage controllers to adversarially reduce voltage. Adversarial bit errors have also been shown to be realistic through attacks targeting individual bits in accelerator memory. Besides describing these error models in detail, we take first steps towards DNNs robust to random and adversarial bit errors by explicitly taking bit errors into account during training. Our random or adversarial bit error training improves robustness significantly, potentially leading to more energy-efficient and secure DNN accelerators.
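The random bit error model above can be illustrated with a small sketch: quantize the weights to fixed-point integers, flip each stored bit independently with probability p (the bit error rate), and dequantize for the forward pass. This is a hypothetical, simplified sketch assuming symmetric linear quantization to unsigned 8-bit integers; the paper's quantization scheme and injection procedure may differ in detail.

```python
import numpy as np

def inject_random_bit_errors(weights, bits=8, p=0.01, rng=None):
    """Flip each bit of the quantized weights independently with probability p.

    Illustrative sketch only: assumes simple linear quantization of the
    weight range to unsigned `bits`-bit integers (bits <= 8 here).
    """
    rng = np.random.default_rng() if rng is None else rng
    w_min, w_max = float(weights.min()), float(weights.max())
    scale = (w_max - w_min) / (2**bits - 1)
    # Quantize to integers in [0, 2^bits - 1].
    q = np.round((weights - w_min) / scale).astype(np.uint8)
    # Build a per-weight bit mask: each of the `bits` bits flips with prob. p.
    flips = rng.random(q.shape + (bits,)) < p
    mask = np.zeros(q.shape, dtype=np.uint8)
    for b in range(bits):
        mask |= flips[..., b].astype(np.uint8) << b
    # XOR injects the bit errors into the stored representation.
    q_err = q ^ mask
    # Dequantize the perturbed weights for the forward pass.
    return q_err.astype(weights.dtype) * scale + w_min
```

During random bit error training, such a perturbation would be applied to the (quantized) weights in each forward pass, so the network learns to tolerate the resulting weight noise; the adversarial variant instead searches for the worst-case set of bit flips.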

### Paper

The paper is available on arXiv:

```
@article{Stutz2020ARXIV,
    author    = {David Stutz and Nandhini Chandramoorthy and Matthias Hein and Bernt Schiele},
    title     = {On Mitigating Random and Adversarial Bit Errors},
    journal   = {CoRR},
    volume    = {abs/2006.13977},
    year      = {2020}
}
```


Coming soon!