
R&D: Accurate Deep Neural Network Inference Using Computational Phase-Change Memory

Hardware results on CIFAR-10 with ResNet-32 demonstrate an accuracy above 93.5% retained over a one-day period, where each of the 361,722 synaptic weights is programmed on just two PCM devices organized in a differential configuration.

Nature Communications has published an article written by Vinay Joshi (IBM Research – Zurich, Säumerstrasse 4, 8803 Rüschlikon, Switzerland, and King's College London, Strand, London, WC2R 2LS, UK), Manuel Le Gallo (IBM Research – Zurich), Simon Haefeli (IBM Research – Zurich, and ETH Zurich, Rämistrasse 101, 8092 Zurich, Switzerland), Irem Boybat (IBM Research – Zurich, and Ecole Polytechnique Federale de Lausanne (EPFL), 1015 Lausanne, Switzerland), S. R. Nandakumar (IBM Research – Zurich), Christophe Piveteau and Martino Dazzi (IBM Research – Zurich, and ETH Zurich), Bipin Rajendran (King's College London), and Abu Sebastian and Evangelos Eleftheriou (IBM Research – Zurich).

Abstract: In-memory computing using resistive memory devices is a promising non-von Neumann approach for making energy-efficient deep learning inference hardware. However, due to device variability and noise, the network needs to be trained in a specific way so that transferring the digitally trained weights to the analog resistive memory devices will not result in significant loss of accuracy. Here, we introduce a methodology to train ResNet-type convolutional neural networks that results in no appreciable accuracy loss when transferring weights to phase-change memory (PCM) devices. We also propose a compensation technique that exploits the batch normalization parameters to improve the accuracy retention over time. We achieve a classification accuracy of 93.7% on CIFAR-10 and a top-1 accuracy of 71.6% on ImageNet benchmarks after mapping the trained weights to PCM. Our hardware results on CIFAR-10 with ResNet-32 demonstrate an accuracy above 93.5% retained over a one-day period, where each of the 361,722 synaptic weights is programmed on just two PCM devices organized in a differential configuration.
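The differential weight mapping and accuracy retention described in the abstract can be sketched numerically. The snippet below is a simplified illustration, not the paper's actual method: each weight is mapped onto a pair of PCM conductances (G+, G−), conductances drift over time following the commonly used power law G(t) ∝ (t/t0)^(−ν), and a global scale correction stands in for the paper's batch-normalization-based compensation. All device parameters (g_max, noise sigma, drift exponent ν) are assumed values chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def program_differential(weights, g_max=25.0, sigma=0.8):
    """Map each weight onto two conductances (G+, G-) in a differential
    configuration: positive part on G+, negative part on G-.
    Additive Gaussian noise models programming variability
    (parameter values are illustrative, not from the paper)."""
    scale = g_max / np.max(np.abs(weights))
    g_pos = np.clip(weights, 0, None) * scale + rng.normal(0, sigma, weights.shape)
    g_neg = np.clip(-weights, 0, None) * scale + rng.normal(0, sigma, weights.shape)
    return np.clip(g_pos, 0, g_max), np.clip(g_neg, 0, g_max), scale

def read_weights(g_pos, g_neg, scale, t=1.0, t0=1e-3, nu=0.05):
    """Read back effective weights at time t; conductances follow the
    power-law drift G(t) = G(t0) * (t/t0)^(-nu)."""
    drift = (t / t0) ** (-nu)
    return (g_pos - g_neg) * drift / scale

def drift_compensate(w_read, w_ref):
    """Global scale correction: rescale the drifted read-back so its mean
    absolute weight matches a reference measured at programming time.
    A stand-in for the paper's batch-norm-statistics-based compensation."""
    return w_read * (np.mean(np.abs(w_ref)) / np.mean(np.abs(w_read)))

# Program a random weight matrix, read it back after one day, compensate.
w = rng.normal(0, 0.1, (64, 64))
gp, gn, s = program_differential(w)
w_ref = read_weights(gp, gn, s, t=1e-3)      # immediately after programming
w_day = read_weights(gp, gn, s, t=86400.0)   # after one day of drift
w_comp = drift_compensate(w_day, w_ref)
```

With these assumed parameters, one day of drift shrinks the read-back weights to roughly 40% of their programmed values, and the single global rescaling recovers them up to the programming noise; the paper's compensation instead adapts batch normalization parameters to achieve the reported accuracy retention.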
