Journal Article


Efficient and low overhead memristive activation circuit for deep learning neural networks

Abstract

An efficient memristive MIN-function-based activation circuit for memristive neuromorphic systems is presented, using only two memristors and a comparator. The circuit approximates the ReLU activation function, whose simplicity and effectiveness in deep neural networks significantly reduce the time and computational cost of training in neuromorphic systems. A multilayer neural network is simulated using this activation circuit together with traditional memristor crossbar arrays. The results show that the proposed circuit performs training effectively, with significant savings in time and area in memristor-crossbar-based neural networks.
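
As a rough illustration of the idea rather than the authors' actual circuit, the sketch below assumes the MIN primitive can realize ReLU through the identity relu(x) = x - min(x, 0), applied to the column currents of a crossbar-style matrix-vector multiply; the layer sizes, conductance values, and function names are hypothetical.

import numpy as np

def min_based_relu(x):
    # ReLU expressed via a MIN primitive: relu(x) = x - min(x, 0).
    # In the paper's circuit the MIN is built from two memristors and a
    # comparator; here NumPy's elementwise minimum stands in for it.
    return x - np.minimum(x, 0.0)

# Hypothetical crossbar layer: the conductance matrix G acts as the
# weight matrix, so the column currents compute G @ v in the analogue
# domain (Ohm's law plus Kirchhoff's current law).
rng = np.random.default_rng(0)
G = rng.uniform(0.0, 1.0, size=(4, 8))   # 4x8 crossbar conductances
v = rng.uniform(-1.0, 1.0, size=8)       # input voltages

currents = G @ v                          # multiply-accumulate stage
activations = min_based_relu(currents)    # activation stage

# The identity reproduces the usual max(0, x) form of ReLU exactly.
assert np.allclose(activations, np.maximum(currents, 0.0))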

Attached files

  • Type: PDF Document
  • Filename: s10.pdf
  • Size: 1.4 MB
  • Views (since Sept 2022): 289

Authors

Bala, Anu
Yang, Xiaohan
Adeyemo, Adedotun
Jabir, Abusaleh

Oxford Brookes departments

Faculty of Technology, Design and Environment / School of Engineering, Computing and Mathematics

Dates

Year of publication: 2019
Date of RADAR deposit: 2019-04-08


License

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.


Related resources

This RADAR resource is identical to the journal article Efficient and low overhead memristive activation circuit for deep learning neural networks.

Details

  • Owner: Joseph Ripp
  • Collection: Outputs
  • Version: 1
  • Status: Live
  • Views (since Sept 2022): 347