Hamid Beigy
Computer Engineering Department, Amirkabir University of Technology, Tehran, Iran
M. R. Meybodi
Computer Engineering Department, Amirkabir University of Technology, Tehran, Iran
M. B. Menhaj
Electrical Engineering Department, Amirkabir University of Technology, Tehran, Iran
ABSTRACT
The error backpropagation (BP) training algorithm, an iterative gradient descent procedure, is a simple way to train multilayer feedforward neural networks. Despite the popularity and effectiveness of this algorithm, its convergence is extremely slow. The main objective of this paper is to incorporate an acceleration technique into the BP algorithm to achieve a faster rate of convergence. By interconnecting Fixed Structure Learning Automata (FSLA) with the feedforward neural network, we apply a Learning Automata (LA) scheme that adjusts the learning rate based on observation of the network's random responses. The feasibility of the proposed method is shown through simulations on three learning problems: exclusive-or (XOR), approximation of the function sin(x) and digit recognition. These problems are chosen because they possess different error surfaces and collectively provide an environment suitable for determining the effect of the proposed method. The simulation results show that adapting the learning rate with this method not only increases the convergence rate of learning but also increases the possibility of bypassing local minima.
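To make the adaptation concrete, the sketch below illustrates one plausible reading of the scheme: a two-action fixed structure automaton whose actions increase or decrease the learning rate, and whose environment response is whether the epoch error decreased. The automaton depth, the rate multipliers (1.1 and 0.9) and the linear model standing in for the feedforward network are illustrative assumptions, not the paper's specification.

```python
import numpy as np

class FixedStructureLA:
    """Two-action fixed structure learning automaton with `depth`
    memory states per action (an L_{2N,2}-style structure, assumed
    here for illustration). Action 0 increases the learning rate,
    action 1 decreases it."""

    def __init__(self, depth=3):
        self.depth = depth
        self.action = 0   # currently selected action
        self.state = 0    # 0 = boundary state, depth-1 = deepest state

    def choose(self):
        return self.action

    def update(self, favourable):
        if favourable:
            # Reward: move deeper into the current action's states.
            self.state = min(self.state + 1, self.depth - 1)
        elif self.state > 0:
            # Penalty: retreat toward the boundary state.
            self.state -= 1
        else:
            # Penalty at the boundary: switch to the other action.
            self.action = 1 - self.action

def epoch_error(w, X, y):
    """Mean squared error of a linear model, standing in for the
    feedforward network's epoch error (assumption for brevity)."""
    return float(np.mean((X @ w - y) ** 2))

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
y = X @ np.array([1.0, -2.0, 0.5])
w = np.zeros(3)

la = FixedStructureLA(depth=3)
lr, prev_err = 0.01, epoch_error(w, X, y)

for epoch in range(200):
    # Act on the automaton's chosen action before the gradient step.
    lr *= 1.1 if la.choose() == 0 else 0.9
    grad = 2 * X.T @ (X @ w - y) / len(y)   # gradient of the MSE
    w -= lr * grad
    err = epoch_error(w, X, y)
    # Environment response: reward iff the error decreased.
    la.update(favourable=(err < prev_err))
    prev_err = err

print(f"final error {prev_err:.2e}, final learning rate {lr:.4f}")
```

Because a penalty in the boundary state flips the action, the automaton tracks whichever direction of change has recently lowered the error, growing the learning rate on smooth stretches of the error surface and shrinking it when steps overshoot.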
How to cite this article
Hamid Beigy, M. R. Meybodi and M. B. Menhaj, 2002. Utilization of Fixed Structure Learning Automata for Adaptation of Learning Rate in Backpropagation Algorithm. Journal of Applied Sciences, 2: 437-443.
DOI: 10.3923/jas.2002.437.443
URL: https://scialert.net/abstract/?doi=jas.2002.437.443