TY - JOUR
AU - Burse, Kavita
AU - Manoria, Manish
AU - Kirar, Vishnu P. S.
TI - Communications in Computer and Information Science: Improved Back Propagation Algorithm to Avoid Local Minima in Multiplicative Neuron Model
AB - World Academy of Science, Engineering and Technology, International Journal of Electrical and Computer Engineering, Vol. 4, No. 12, 2010. Improved Back Propagation Algorithm to Avoid Local Minima in Multiplicative Neuron Model. Kavita Burse, Manish Manoria, Vishnu P. S. Kirar. Abstract—The back propagation algorithm calculates the weight changes of artificial neural networks, and a common approach is to use a training algorithm consisting of a learning rate and a momentum factor. The major drawbacks of the above learning algorithm are the problems of local minima and slow convergence speed. The addition of an extra term, called a proportional factor, reduces the convergence of the back propagation algorithm. We have applied the three-term back propagation to multiplicative neural network learning. The algorithm is tested ... [From the introduction:] ... proposed in 2003 by Zweiri and has outperformed standard two-term BP in terms of low complexity and computational cost [1]. BP is a method for calculating the first derivatives, or gradient, of the cost function required by some optimization methods. It is certainly not the only method for estimating the gradient. However, it is the most efficient [2]. The major limitations of this algorithm are the existence of temporary local minima resulting from the saturation behavior of the ...
DA - 2011-01-01
UR - https://www.deepdyve.com/lp/unpaywall/communications-in-computer-and-information-science-improved-back-FuHsWlNBt9
DP - DeepDyve
ER -