ABSTRACT
Title: Optimal Feed Forward MLP Architecture for Off-Line Cursive Numeral Recognition
Authors: Amit Choudhary, Rahul Rishi, Savita Ahlawat, Vijaypal Singh Dhaka
Keywords: Numeral Recognition, MLP, Hidden Layers, Backpropagation, Conjugate Gradient Descent, Activation Functions.
Issue Date: Jan 2010
Abstract: The purpose of this work is to analyze the performance of the back-propagation feed-forward algorithm with various activation functions for the neurons of the hidden and output layers, while varying the number of neurons in the hidden layer. For sample creation, 250 numerals were gathered from 35 people of different ages, both male and female. After binarization, these numerals were clubbed together to form training patterns for the neural network. The network was trained by adjusting the connection strengths at every iteration, and the conjugate gradient descent for each presented training pattern was calculated to locate the minima on the error surface. Experiments were performed by selecting different combinations of two activation functions out of the three functions logsig, tansig, and purelin for the neurons of the hidden and output layers. The results revealed that as the number of neurons in the hidden layer is increased, the network trains in fewer epochs, and the recognition accuracy of the network increases up to a certain level and then starts decreasing once the number of hidden neurons exceeds that level.
Page(s): 1-7
ISSN: 0975-3397
Source: Vol. 2, Issue 1 (Supplementary)
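
To make the setup described in the abstract concrete, below is a minimal Python sketch of a single-hidden-layer MLP whose hidden and output activations can be any pair drawn from logsig, tansig, and purelin, trained by nonlinear conjugate gradient. The paper itself gives no code; the data shapes, hidden-layer size, and optimizer options here are illustrative assumptions, and the MATLAB-style activation names are mapped to their standard definitions.

import numpy as np
from scipy.optimize import minimize

# MATLAB-style activation functions named in the abstract.
ACT = {
    "logsig": lambda z: 1.0 / (1.0 + np.exp(-z)),  # logistic sigmoid
    "tansig": np.tanh,                             # hyperbolic tangent
    "purelin": lambda z: z,                        # linear (identity)
}

def unpack(theta, n_in, n_hid, n_out):
    # Split the flat parameter vector into weight matrices and bias vectors.
    i = 0
    W1 = theta[i:i + n_in * n_hid].reshape(n_in, n_hid); i += n_in * n_hid
    b1 = theta[i:i + n_hid]; i += n_hid
    W2 = theta[i:i + n_hid * n_out].reshape(n_hid, n_out); i += n_hid * n_out
    b2 = theta[i:i + n_out]
    return W1, b1, W2, b2

def mse_loss(theta, X, T, n_hid, hid_act, out_act):
    # Mean squared error of the two-layer network over all training patterns.
    W1, b1, W2, b2 = unpack(theta, X.shape[1], n_hid, T.shape[1])
    H = ACT[hid_act](X @ W1 + b1)   # hidden-layer activations
    Y = ACT[out_act](H @ W2 + b2)   # output-layer activations
    return np.mean((Y - T) ** 2)

# Illustrative stand-in data: 250 binarized numeral images flattened to
# 64-pixel vectors, with one-hot targets for the 10 digit classes.
rng = np.random.default_rng(0)
X = rng.integers(0, 2, size=(250, 64)).astype(float)
T = np.eye(10)[rng.integers(0, 10, size=250)]

n_hid = 8  # vary this to reproduce the hidden-layer-size experiment
n_params = 64 * n_hid + n_hid + n_hid * 10 + 10
theta0 = rng.normal(scale=0.1, size=n_params)

# Nonlinear conjugate-gradient training of the full weight vector.  No
# analytic gradient is supplied, so SciPy approximates it numerically;
# that is slow but adequate for a sketch of this size.
res = minimize(mse_loss, theta0, args=(X, T, n_hid, "tansig", "logsig"),
               method="CG", options={"maxiter": 30})
print("final training MSE:", res.fun)

Sweeping n_hid over a range and recording the final error (or held-out recognition accuracy) for each activation-function pairing mirrors the experiment the abstract describes.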
|