IEEE-INNS-ENNS International Joint Conference on Neural Networks

Abstract

The multilayer feedforward neural network (MFNN) trained by the backpropagation (BP) algorithm is one of the most significant models in artificial neural networks. MFNNs have been used in many areas of signal and image processing due to their broad applicability. Although they have been implemented as analog, mixed analog-digital, and fully digital VLSI circuits, it remains difficult to realize hardware implementations that include the BP learning function. This paper describes the BP algorithm for the logic oriented neural network (LOGO-NN), which we have previously proposed as a kind of MFNN with quantized weights and multilevel threshold neurons. Because both weights and neuron outputs are quantized to integer values in LOGO-NNs, it is expected that LOGO-NNs with BP learning can be implemented more effectively in hardware than common MFNNs. Finally, simulations show that the proposed BP algorithm performs well for LOGO-NNs.
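The abstract does not spell out how BP is adapted to integer weights and staircase activations. As a rough illustration only, the sketch below shows one generic scheme for training such a network: real-valued "shadow" weights accumulate gradients, weights are rounded to integers on the forward pass, and a straight-through surrogate replaces the staircase activation's zero-almost-everywhere derivative. The level counts, the shadow-weight scheme, and the surrogate gradient are all assumptions for illustration, not the paper's method.

```python
import numpy as np

# Hedged sketch, not the paper's algorithm: the abstract does not give the
# LOGO-NN update rules, so this shows one generic way to run BP through a
# network with integer weights and multilevel threshold neurons.

rng = np.random.default_rng(0)

W_HALF = 3       # weight levels: integers in [-3, 3]    (assumed)
ACT_MAX = 3      # activation levels: integers in [0, 3] (assumed)

def quantize_weights(w):
    # Round real-valued shadow weights to the nearest allowed integer level.
    return np.clip(np.round(w), -W_HALF, W_HALF)

def multilevel_threshold(x):
    # Staircase activation with integer outputs 0..ACT_MAX.
    return np.clip(np.floor(x), 0, ACT_MAX)

def surrogate_grad(x):
    # Derivative of a linear ramp standing in for the staircase
    # (straight-through estimator, assumed here).
    return ((x > 0) & (x < ACT_MAX)).astype(float)

# Tiny 2-4-1 network on XOR, purely to exercise the mechanics.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(0.0, 1.0, (2, 4))
W2 = rng.normal(0.0, 1.0, (4, 1))
lr = 0.05

for _ in range(2000):
    Q1, Q2 = quantize_weights(W1), quantize_weights(W2)
    a1 = X @ Q1
    h = multilevel_threshold(a1)        # integer-valued hidden outputs
    err = h @ Q2 - y                    # linear output layer for simplicity

    # Backprop through the quantized forward pass; update shadow weights.
    W2 -= lr * (h.T @ err)
    W1 -= lr * (X.T @ ((err @ Q2.T) * surrogate_grad(a1)))

pred = multilevel_threshold(X @ quantize_weights(W1)) @ quantize_weights(W2)
print("predictions:", pred.ravel(), "targets:", y.ravel())
```

The key design point this sketch illustrates is that only the forward pass uses integer quantities; the weight updates are applied to hidden real-valued copies, which keeps small gradient steps from being rounded away between iterations.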