Preprint / Working Paper, Year: 2020

BitPruning: Learning Bitlengths for Aggressive and Accurate Quantization

Abstract

Neural networks have demonstrably achieved state-of-the-art accuracy using low-bitlength integer quantization, yielding both execution-time and energy benefits on existing hardware designs that support short bitlengths. However, the question of finding the minimum bitlength for a desired accuracy remains open. We introduce a training method for minimizing inference bitlength at any granularity while maintaining accuracy. Furthermore, we propose a regularizer that penalizes large-bitlength representations throughout the architecture and show how it can be modified to minimize other quantifiable criteria, such as the number of operations or the memory footprint. We demonstrate that our method learns thrifty representations while maintaining accuracy. On ImageNet, the method produces average per-layer bitlengths of 4.13 and 3.76 bits on AlexNet and ResNet18 respectively, while remaining within 2.0% and 0.5% of the baseline TOP-1 accuracy.
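To make the idea of a learnable, penalized bitlength concrete, here is a minimal PyTorch sketch. It assumes one common way of relaxing a discrete bitlength: a fractional bitlength n is interpolated between uniform quantization at floor(n) and ceil(n) bits, with a straight-through estimator for rounding, and a regularizer that sums the learned bitlengths. All names (LearnedBitQuant, bitlength_penalty) are illustrative; this is a sketch of the general approach under those assumptions, not the paper's exact formulation.

```python
import torch
import torch.nn as nn

class LearnedBitQuant(nn.Module):
    """Quantizer with a learnable bitlength (illustrative sketch).

    A fractional bitlength n is relaxed by linearly interpolating between
    uniform quantization at floor(n) and floor(n)+1 bits, so gradients can
    flow to the bitlength parameter through the interpolation weight.
    """

    def __init__(self, init_bits: float = 8.0):
        super().__init__()
        # One learnable bitlength per quantizer instance (e.g. per layer).
        self.bits = nn.Parameter(torch.tensor(init_bits))

    @staticmethod
    def _quantize(x: torch.Tensor, b: torch.Tensor) -> torch.Tensor:
        # Uniform quantization of x (assumed normalized to [0, 1]) to 2^b
        # levels; straight-through estimator for the non-differentiable round.
        levels = 2.0 ** b - 1.0
        xq = torch.round(x * levels) / levels
        return x + (xq - x).detach()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        lo = torch.floor(self.bits).clamp(min=1.0)
        frac = self.bits - lo  # gradient w.r.t. self.bits flows through frac
        return (1.0 - frac) * self._quantize(x, lo) + frac * self._quantize(x, lo + 1.0)

def bitlength_penalty(model: nn.Module, weight: float = 0.01) -> torch.Tensor:
    # Regularizer: penalize the sum of learned bitlengths so training trades
    # a small accuracy loss for shorter representations. The per-term weight
    # could instead be scaled by operation count or footprint per layer.
    return weight * sum(m.bits for m in model.modules()
                        if isinstance(m, LearnedBitQuant))
```

In training, the penalty would simply be added to the task loss, e.g. loss = criterion(model(x), y) + bitlength_penalty(model); weighting each term by a layer's operation count or parameter count would steer the regularizer toward compute or memory objectives, as the abstract suggests.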

Dates and versions

hal-02487489, version 1 (21-02-2020)

Identifiers

HAL Id: hal-02487489, version 1

Cite

Miloš Nikolić, Ghouthi Boukli Hacene, Ciaran Bannon, Alberto Delmas Lascorz, Matthieu Courbariaux, et al. BitPruning: Learning Bitlengths for Aggressive and Accurate Quantization. 2020. ⟨hal-02487489⟩