Paper References
[1] 2020 Nature Nanotechnology, Memory devices and applications for in-memory computing
[2] 2020 Nature Communications, Accurate deep neural network inference using computational phase-change memory
[3] 2016 Frontiers in Neuroscience, Acceleration of Deep Neural Network Training with Resistive Cross-Point Devices: Design Considerations
[4] 2020 Frontiers in Neuroscience, Mixed-precision deep learning based on computational memory
[5] 2018 Nature, Equivalent-accuracy accelerated neural-network training using analogue memory
[6] 2018 Nature Communications, Signal and noise extraction from analog memory elements for neuromorphic computing
[7] 2019 IEEE Symposium on VLSI Technology, Capacitor-based Cross-point Array for Analog Neural Network with Record Symmetry and Linearity
[8] 2018 International Electron Devices Meeting (IEDM), ECRAM as Scalable Synaptic Cell for High-Speed, Low-Power Neuromorphic Computing
[9] 2016 Frontiers in Neuroscience, Acceleration of Deep Neural Network Training with Resistive Cross-Point Devices: Design Considerations
[10] 2020 Frontiers in Neuroscience, Algorithm for Training Neural Networks on Resistive Device Arrays
[11] 2023 APL Machine Learning, Using the IBM analog in-memory hardware acceleration kit for neural network training and inference
[12] 2023 Nature Communications, Hardware-aware training for large-scale and diverse deep learning inference workloads using in-memory computing-based accelerators
[13] 2023 Nature Electronics, A 64-core mixed-signal in-memory compute chip based on phase-change memory for deep neural network inference
[14] 2023 Nature, An analog-AI chip for energy-efficient speech recognition and transcription