Energy Efficient Learning With Low Resolution Stochastic Domain Wall Synapse for Deep Neural Networks
Abstract
We demonstrate that extremely low-resolution quantized (nominally 5-state) synapses with large stochastic variations in synaptic weights can be energy efficient and achieve reasonably high test accuracies compared to Deep Neural Networks (DNNs) of similar sizes using floating-point-precision synaptic weights. Specifically, voltage-controlled domain wall (DW) devices exhibit stochastic behavior and can encode only a limited number of states; however, they are extremely energy efficient during both training and inference. In this study, we propose both in-situ and ex-situ training algorithms, based on a modification of the algorithm proposed by Hubara et al. (2017), which works well with quantized synaptic weights, and train several 5-layer DNNs on the MNIST dataset using 2-, 3- and 5-state DW devices as synapses. For in-situ training, a separate high-precision memory unit preserves and accumulates the weight gradients, which prevents accuracy loss due to weight quantization. For ex-situ training, a precursor DNN is first trained based on weight quantization and the DW device model. Moreover, a noise tolerance margin is included in both training methods to account for intrinsic device noise. The highest inference accuracies we obtain after in-situ and ex-situ training are ∼96.67% and ∼96.63%, respectively, which are very close to the baseline accuracy of ∼97.1% obtained from a DNN of similar topology with floating-point-precision weights and no stochasticity. The large interstate intervals due to quantized weights, together with the noise tolerance margin, enable in-situ training with significantly fewer programming attempts. Our proposed approach offers the possibility of at least two orders of magnitude in energy savings compared to a floating-point approach implemented in CMOS. This approach is particularly attractive for low-power intelligent edge devices, where ex-situ learning can be used for energy-efficient non-adaptive tasks and in-situ learning provides the ability to adapt and learn in a dynamically evolving environment.
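The training scheme summarized in the abstract (quantized weights in the forward pass, a separate high-precision copy that accumulates the gradient updates, and a noise tolerance margin that suppresses unnecessary device writes) can be illustrated with a short PyTorch sketch. The uniform level spacing in [-1, 1], the `N_STATES` and `NOISE_MARGIN` values, and the `quantize`, `QuantLinear`, and `needs_reprogramming` names are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

N_STATES = 5        # nominal number of DW device states (from the abstract)
NOISE_MARGIN = 0.1  # hypothetical noise tolerance margin, as a fraction of one interstate interval


def quantize(w, n_states=N_STATES):
    """Map full-precision weights onto n_states evenly spaced levels in [-1, 1] (assumed spacing)."""
    levels = torch.linspace(-1.0, 1.0, n_states, device=w.device)
    idx = (w.clamp(-1.0, 1.0).unsqueeze(-1) - levels).abs().argmin(dim=-1)
    return levels[idx]


class QuantLinear(nn.Module):
    """Fully connected layer whose forward pass uses quantized (device-level) weights,
    while a full-precision copy of the weights accumulates the gradients."""

    def __init__(self, in_features, out_features):
        super().__init__()
        self.weight = nn.Parameter(torch.empty(out_features, in_features).uniform_(-0.5, 0.5))
        self.bias = nn.Parameter(torch.zeros(out_features))

    def forward(self, x):
        w_q = quantize(self.weight)
        # Straight-through estimator: quantized weights in the forward pass,
        # gradients flow back into the full-precision copy (self.weight).
        w_ste = self.weight + (w_q - self.weight).detach()
        return F.linear(x, w_ste, self.bias)


def needs_reprogramming(read_back, target_level, n_states=N_STATES, margin=NOISE_MARGIN):
    """Skip a device write when the read-back state already lies within the
    noise tolerance margin of the target level, reducing programming attempts."""
    interval = 2.0 / (n_states - 1)
    return (read_back - target_level).abs() > margin * interval
```

In this sketch of the in-situ case, each optimizer step updates `self.weight` in high precision and only devices flagged by `needs_reprogramming` would be rewritten; for the ex-situ case, the same quantizer would be applied while training the precursor DNN, with the resulting levels transferred to the device array once.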