Recorded FOCA’20 Talk “Bit Error Robustness for Energy-Efficient DNN Accelerators”

In October this year, I was invited to talk at IBM's FOCA workshop about my latest research on bit error robustness of (quantized) DNN weights. Here, the goal is to develop DNN accelerators capable of operating at low voltage. However, lowering the voltage induces bit errors in the accelerators' memory. While such bit errors can be avoided through hardware mechanisms, these mechanisms are usually costly in terms of energy and chip area. Thus, training DNNs to be robust to such bit errors would enable low-voltage operation, and thereby reduced energy consumption, without the need for hardware techniques. In this 5-minute talk, I give a short overview.

Introduction

Every year, IBM Research organizes a workshop on the Future of Computing Architectures (FOCA). I was fortunate to be invited to this year's edition to present my latest research on bit error robustness of deep neural networks (DNNs). DNN accelerators, i.e., specialized hardware for DNN inference, have become increasingly popular because of their reduced cost and energy consumption compared to mainstream GPUs. To further reduce energy consumption, these accelerators are commonly operated at lower voltages. This, however, induces bit errors in the included memories (e.g., SRAM or DRAM). These bit errors directly affect the stored (quantized) DNN weights. Thus, training DNNs that are robust to such bit errors in their weights is crucial for enabling energy-efficient DNN accelerators through low-voltage operation. In my latest pre-print, we propose a combination of robust fixed-point quantization, weight clipping as regularization, and random bit error training to improve robustness to bit errors. We show that this enables significant energy savings. In the 5-minute FOCA talk, I give a brief overview of these approaches; the paper can also be found below.
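To make the bit error model a bit more tangible, below is a minimal NumPy sketch of how random bit errors can be injected into fixed-point quantized weights. This is an illustrative sketch only, assuming a simple symmetric quantization scheme; the function names and the per-bit error probability p are placeholders for illustration and not taken from the paper.

import numpy as np

# Symmetric fixed-point quantization into signed m-bit integers.
def quantize(weights, bits=8):
    scale = np.abs(weights).max() / (2 ** (bits - 1) - 1)
    q = np.round(weights / scale)
    q = np.clip(q, -(2 ** (bits - 1)), 2 ** (bits - 1) - 1)
    return q.astype(np.int64), scale

# Flip each bit of the quantized weights independently with probability p,
# mimicking random bit errors in low-voltage accelerator memory.
def inject_bit_errors(q, p=0.01, bits=8, seed=None):
    rng = np.random.default_rng(seed)
    pattern = q & ((1 << bits) - 1)  # two's-complement bit pattern
    flips = rng.random(q.shape + (bits,)) < p
    masks = (flips * (1 << np.arange(bits))).sum(axis=-1)
    pattern ^= masks
    # Re-interpret the perturbed bit pattern as a signed integer.
    return np.where(pattern >= 1 << (bits - 1), pattern - (1 << bits), pattern)

# Example: perturb a random weight matrix with 1% random bit errors.
w = np.random.randn(64, 64).astype(np.float32)
q, scale = quantize(w)
w_perturbed = inject_bit_errors(q, p=0.01) * scale

Roughly speaking, during random bit error training such perturbed weights would be used in the forward pass so that the network learns to tolerate the perturbations, while weight clipping constrains the weights to a fixed range during training, limiting the absolute impact of individual bit flips; see the paper for the actual procedure.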

Paper on arXiv

Talk

All other participants of the FOCA'20 workshop can be found here; however, not all of the talks are publicly available.

What is your opinion on this article? Let me know your thoughts on Twitter @davidstutz92 or LinkedIn in/davidstutz92.