Phase-change memory models for deep learning training and inference
Document Type
Conference Proceeding
Publication Date
11-1-2019
Abstract
Non-volatile analog memory devices such as phase-change memory (PCM) enable the design of dedicated connectivity matrices for the hardware implementation of deep neural networks (DNNs). In this in-memory computing approach, the analog conductance states of the memory devices can be gradually updated to train DNNs on-chip, or software-trained connection strengths can be programmed once into the devices to create efficient inference engines. Reliable and computationally simple models that capture the non-ideal programming and temporal evolution of the devices are needed to evaluate the training and inference performance of deep learning hardware based on in-memory computing. In this paper, we present statistically accurate models for PCM, based on the characterization of more than 10,000 devices, that capture the state-dependent nature and variability of the conductance update, conductance drift, and read noise. Integrating these computationally simple device models with deep learning frameworks such as TensorFlow allows us to realistically evaluate the training and inference performance of PCM-array-based hardware implementations of DNNs.
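The three non-idealities named in the abstract (state-dependent conductance update, power-law drift, and read noise) can be illustrated with a minimal statistical sketch. This is not the paper's fitted model: all parameter values and function names below are hypothetical placeholders, whereas the paper's model is calibrated against measurements from more than 10,000 devices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical illustrative parameters (NOT the paper's fitted values).
G_MAX = 25.0      # saturation conductance, in microsiemens
DRIFT_NU = 0.05   # drift exponent
T0 = 1.0          # reference time after programming, in seconds

def conductance_update(g, mean_step=1.0, sigma=0.3):
    """One SET pulse: the mean update shrinks as g approaches saturation
    (state dependence), with Gaussian programming variability."""
    step = mean_step * (1.0 - g / G_MAX)
    return np.clip(g + rng.normal(step, sigma), 0.0, G_MAX)

def conductance_drift(g, t, nu=DRIFT_NU):
    """Power-law temporal drift: G(t) = G(t0) * (t / t0) ** (-nu)."""
    return g * (t / T0) ** (-nu)

def read_noise(g, sigma_read=0.2):
    """Additive Gaussian read noise (a common simplification)."""
    return g + rng.normal(0.0, sigma_read, size=np.shape(g))

# Program a device with 20 pulses, then read it after 1000 s of drift.
g = 0.0
for _ in range(20):
    g = conductance_update(g)
g_read = read_noise(conductance_drift(g, t=1000.0))
```

Because each component is an elementwise operation on a conductance array, a model of this shape can be applied to entire weight matrices inside a framework such as TensorFlow, which is the kind of integration the abstract describes.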
Identifier
85075017281 (Scopus)
ISBN
9781728109961
Publication Title
2019 26th IEEE International Conference on Electronics, Circuits and Systems (ICECS 2019)
External Full Text Location
https://doi.org/10.1109/ICECS46596.2019.8964852
First Page
727
Last Page
730
Recommended Citation
Nandakumar, S. R.; Boybat, Irem; Joshi, Vinay; Piveteau, Christophe; Le Gallo, Manuel; Rajendran, Bipin; Sebastian, Abu; and Eleftheriou, Evangelos, "Phase-change memory models for deep learning training and inference" (2019). Faculty Publications. 7233.
https://digitalcommons.njit.edu/fac_pubs/7233
