A New Error Back-Propagation Method for Multilayer Neural Networks and Its Algebraic Properties
山本 祥弘
pp. 201-209
DOI: 10.5687/iscie.9.201

Abstract
The back-propagation method is well known as a supervised learning rule for neural networks.
In this paper, a new learning rule is proposed in which the output error vector is driven to zero by correcting two kinds of quantities: the weighting vectors (i.e., the weight matrix) of a layer and the input vector of that layer. The corrected input vector plays the role of a tentative teacher for the following layer in the backward sweep. In this way, the output error is propagated backward and is partly corrected by the weighting matrix of each layer.
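The abstract does not state the update equations, so the following is only a minimal sketch of one plausible reading of the rule, assuming linear layers y_k = W_k x_k, a Moore-Penrose pseudoinverse standing in for the paper's own inversion scheme, and a hypothetical split factor alpha that divides the error between the weight update and the input correction (alpha is not a parameter named in the abstract):

```python
import numpy as np

def backward_correction(weights, inputs, target, alpha=0.5):
    """One backward sweep of the layerwise correction rule (a sketch).

    weights: list of weight matrices W_k, one per layer (y_k = W_k @ x_k)
    inputs:  list of layer input vectors x_k recorded in the forward pass
    target:  desired output vector of the final layer
    alpha:   hypothetical fraction of the error absorbed by the weight
             update; the remainder is absorbed by the input correction
    """
    t = target
    for k in reversed(range(len(weights))):
        W, x = weights[k], inputs[k]
        e = t - W @ x                              # output error of layer k
        # Split e so that (W + dW) @ (x + dx) ~= t to first order:
        # dW @ x contributes alpha*e, W @ dx contributes (1-alpha)*e.
        dW = alpha * np.outer(e, x) / (x @ x)
        dx = (1 - alpha) * np.linalg.pinv(W) @ e
        weights[k] = W + dW
        # The corrected input becomes the tentative teacher, i.e. the
        # target output, for the next layer visited in the backward sweep.
        t = x + dx
    return weights
```

With alpha = 1 the rule degenerates to a pure weight update, and with alpha = 0 the whole error is pushed onto the input; the sketch also ignores any activation nonlinearity, which the actual method would have to account for.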
A computational method is also presented for the matrix inversion required by the proposed rule, and the nonsingularity of the matrix involved is discussed.
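The abstract does not reproduce that computational method. As a stand-in, a common safeguard is to test nonsingularity through the condition number and fall back to the Moore-Penrose pseudoinverse, which coincides with the true inverse when the matrix is square and nonsingular (NumPy assumed):

```python
import numpy as np

def invert_if_nonsingular(W, tol=1e-10):
    """Invert W, falling back to the pseudoinverse near singularity.

    A stand-in for the paper's own inversion scheme, not a
    reproduction of it.
    """
    if W.shape[0] == W.shape[1] and np.linalg.cond(W) < 1.0 / tol:
        return np.linalg.inv(W)
    return np.linalg.pinv(W)
```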
Simulation results on the Exclusive-OR problem demonstrate the effectiveness of the proposed method.