https://doi.org/10.1140/epjp/i2015-15246-6
Regular Article
Application of different entropy formalisms in a neural network for novel word learning
Department of Physics, College of Sciences, Yasouj University, 75914-353, Yasouj, Iran
* e-mail: rezakh2025@yahoo.com
Received: 26 August 2015
Accepted: 8 November 2015
Published online: 14 December 2015
In this paper, novel word learning in adults is studied. To this end, four entropy formalisms are employed to include some degree of non-locality in a neural network: the Tsallis, Landsberg-Vedral, Kaniadakis, and Abe entropies. First, we analytically obtain non-extensive cost functions for all the entropies. Then, we use a generalization of the gradient descent dynamics as a learning rule in a simple perceptron. The Langevin equations are solved numerically, and the error function (learning curve) is obtained as a function of time for different values of the parameters. The influence of the index q and the number of neurons N on learning is investigated for all the entropies. It is found that the learning curve is a decreasing function of time for all the entropies. The rate of learning for the Landsberg-Vedral entropy is slower than for the other entropies, and its variation with time is not appreciable when the number of neurons increases. It is suggested that entropy formalisms can be used as a means for studying learning.
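As a rough illustration of the scheme summarized above, the following minimal sketch integrates a Langevin-type (noisy) gradient-descent learning rule for a simple teacher-student perceptron. The cost here is a q-deformed quadratic error E_q = E^q, chosen only to show how a deformation index q can enter the dynamics; the paper's actual non-extensive cost functions, derived from the Tsallis, Landsberg-Vedral, Kaniadakis and Abe entropies, are not reproduced here, and all parameter values are hypothetical.

```python
# Minimal sketch: Langevin-type gradient descent for a simple perceptron
# with a q-deformed cost E_q = E**q, E = 0.5*delta**2.  Illustrative only;
# not the cost functions derived in the paper.
import numpy as np

rng = np.random.default_rng(0)

N = 100        # number of input neurons (hypothetical value)
q = 1.5        # deformation index (q -> 1 recovers ordinary gradient descent)
eta = 0.1      # learning rate
T = 1e-3       # Langevin noise strength ("temperature")
dt = 0.05      # integration time step
steps = 4000

B = rng.normal(size=N) / np.sqrt(N)   # teacher weights
w = rng.normal(size=N) / np.sqrt(N)   # student weights

errors = []
for _ in range(steps):
    x = rng.normal(size=N)            # random input pattern
    delta = (B - w) @ x               # teacher output minus student output
    E = 0.5 * delta**2                # ordinary quadratic error
    # Gradient of the deformed cost E**q with respect to w:
    #   d(E**q)/dw = q * E**(q-1) * dE/dw,   with dE/dw = -delta * x
    grad = -q * (E + 1e-12) ** (q - 1.0) * delta * x
    noise = np.sqrt(2.0 * T * dt) * rng.normal(size=N)   # Langevin noise term
    w += -eta * grad * dt + noise
    errors.append(E)

# The averaged error (learning curve) decreases with time for q near 1.
print("initial error ~", np.mean(errors[:100]))
print("final error   ~", np.mean(errors[-100:]))
```

Averaging the recorded error over many such runs gives a smooth learning curve whose decay rate can be compared across values of q and N, in the spirit of the analysis described in the abstract.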
© Società Italiana di Fisica and Springer-Verlag Berlin Heidelberg, 2015