Proceedings 2002 Pacific Rim International Symposium on Dependable Computing

Abstract

We propose an efficient method for making multi-layered neural networks (MLNs) fault-tolerant to all multiple weight faults within an interval by intentionally injecting the interval's two extreme values during the learning phase. The degree of fault tolerance to a multiple weight fault is measured by the number of essential multiple links. First, we analytically discuss how to choose the multiple links to be injected effectively, and present a learning algorithm that makes MLNs fault-tolerant to all multiple (i.e., simultaneous) faults in the interval defined by two multi-dimensional extreme points. We then show that once the learning algorithm finishes successfully, the MLNs are fault-tolerant to all multiple faults in the interval. The time per weight-modification cycle is almost linear in the fault multiplicity. Simulation results show that, compared with [1], the computing time decreases drastically as the multiplicity increases.
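The abstract describes injecting extreme weight values during learning so the remaining weights learn to compensate. As a rough illustration only (not the paper's algorithm), the following minimal sketch shows the general fault-injection-learning idea on a tiny network: in each weight-update cycle a chosen link is clamped to one of two assumed interval extremes `W_MIN`/`W_MAX` before the gradient step. All names, the network size, and the task are hypothetical choices for illustration.

```python
import numpy as np

# Hedged sketch of fault-injection learning, NOT the paper's exact method:
# each update cycle clamps one randomly chosen hidden->output link to an
# extreme fault value, so the surviving weights learn to tolerate it.

rng = np.random.default_rng(0)
W_MIN, W_MAX = -1.0, 1.0  # assumed fault-interval extremes (hypothetical)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Tiny 2-4-1 network on XOR; redundancy in the hidden layer gives
# the network room to compensate for a faulted output link.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
y = np.array([[0], [1], [1], [0]], float)
W1 = rng.normal(0, 0.5, (2, 4)); b1 = np.zeros(4)
W2 = rng.normal(0, 0.5, (4, 1)); b2 = np.zeros(1)

def forward(W2f):
    h = sigmoid(X @ W1 + b1)
    o = sigmoid(h @ W2f + b2)
    return h, o

lr = 2.0
for epoch in range(8000):
    # choose one link and one extreme value to inject this cycle
    i = rng.integers(4)
    fault_val = W_MIN if rng.integers(2) == 0 else W_MAX
    W2f = W2.copy(); W2f[i, 0] = fault_val      # inject the extreme fault
    h, o = forward(W2f)
    # backprop through the faulted network; update only the clean weights
    d_o = (o - y) * o * (1 - o)
    d_h = (d_o @ W2f.T) * h * (1 - h)
    gW2 = h.T @ d_o; gW2[i, 0] = 0.0            # faulted link not updated
    W2 -= lr * gW2; b2 -= lr * d_o.sum(0)
    W1 -= lr * (X.T @ d_h); b1 -= lr * d_h.sum(0)

# After training, evaluate the worst-case error under each single
# extreme fault on the hidden->output layer.
worst = 0.0
for i in range(4):
    for v in (W_MIN, W_MAX):
        W2f = W2.copy(); W2f[i, 0] = v
        _, o = forward(W2f)
        worst = max(worst, float(np.abs(o - y).max()))
print("worst-case error under extreme faults:", round(worst, 3))
```

This sketch handles only single faults at the two extremes; the paper's contribution is extending tolerance to all *multiple* simultaneous faults anywhere in the interval, with per-cycle cost almost linear in the fault multiplicity.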