A Dendritic Neuron Model with Adaptive Synapses Trained by Differential Evolution Algorithm

王 喆 富山大学

2020.09.28

Abstract

The human brain consists of billions of neurons, and a single neuron cell is constituted by a cell body, an axon, a cell membrane, and dendrites. Dendrites occupy more than 90 percent of the nerve cell organization and play a pivotal role in human learning. The first artificial neuron was proposed by McCulloch and Pitts in 1943. It is an abstract, simplified model constructed according to the structure and working principle of a biological neuron membrane, based on mathematics and algorithms called threshold logic.
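Threshold logic is easy to make concrete. Below is a minimal Python sketch of a McCulloch-Pitts neuron; the weights and threshold are illustrative choices, not values from the thesis.

```python
# A McCulloch-Pitts threshold neuron: fire iff the weighted input sum
# reaches the threshold. Weights and threshold here are illustrative only.
def mcculloch_pitts(inputs, weights, threshold):
    total = sum(w * x for w, x in zip(weights, inputs))
    return 1 if total >= threshold else 0

# Example: a two-input AND gate expressed in threshold logic.
for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", mcculloch_pitts([a, b], [1, 1], threshold=2))
```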

However, researchers have argued that the use of McCulloch and Pitts's neuron is inadvisable because it disregards the dendritic structure of a real biological neuron. In recent years, several dendritic computing models that consider the functions of dendrites in a neuron have been proposed in the literature, including a dendritic neuron model (DNM) with nonlinear synapses. Unlike dendrite morphological neural networks (DMNNs), the DNM considers only a single neuron rather than a network of several neurons, and it has shown great information-processing capacity.
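For context, the DNM in the cited literature (e.g., [13], [25]) is usually formulated as four layers: a sigmoid synaptic layer, a multiplicative dendritic layer, a summing membrane layer, and a sigmoid soma. The sketch below follows that published structure; the parameter names and values are illustrative assumptions, not taken from this thesis.

```python
import numpy as np

def dnm_forward(x, W, Theta, k=5.0, k_soma=5.0, theta_soma=0.5):
    """Forward pass of a four-layer dendritic neuron model (DNM).

    x     : input features, shape (n_inputs,)
    W     : synaptic weights, shape (n_dendrites, n_inputs)
    Theta : synaptic thresholds, shape (n_dendrites, n_inputs)
    """
    # Synaptic layer: a sigmoid connection on every input-dendrite pair.
    Y = 1.0 / (1.0 + np.exp(-k * (W * x - Theta)))
    # Dendritic layer: multiplicative (nonlinear) interaction on each branch.
    Z = Y.prod(axis=1)
    # Membrane layer: sum the branch outputs.
    V = Z.sum()
    # Soma: squash the membrane potential into a (0, 1) output.
    return 1.0 / (1.0 + np.exp(-k_soma * (V - theta_soma)))

# Toy usage with random parameters (3 dendritic branches, 4 inputs).
rng = np.random.default_rng(0)
x = rng.random(4)
W, Theta = rng.normal(size=(3, 4)), rng.normal(size=(3, 4))
print(dnm_forward(x, W, Theta))
```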

In this study, we use a dendritic neuron model with adaptive synapses (DMAS). Recent advances in neurobiology have highlighted the importance of dendritic computation. In 2019, Beaulieu-Laroche and his team discovered that dendrites are always active when the cell body of a neuron is active, which implies that dendritic synapses play a role in the neural computing process. Based on this biophysical hypothesis, we develop a neuron model with adaptive synapses in which no parameter needs to be adjusted by hand: all synaptic-layer parameters are trained by the learning algorithm. The effectiveness of adaptive synapses will be demonstrated. Because every parameter is learned, we also have to consider additional aspects in the choice of learning algorithm.
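The learning algorithm named in the title is differential evolution (DE) [27]. As a hedged illustration of how DE could train such a parameter vector, here is a minimal DE/rand/1/bin loop; the hyperparameters (population size, F, CR, generations) are illustrative defaults, not the thesis's settings.

```python
import numpy as np

def de_optimize(loss, dim, pop_size=30, F=0.5, CR=0.9, n_gen=200, seed=0):
    """Minimize `loss` over R^dim with DE/rand/1/bin (Storn & Price)."""
    rng = np.random.default_rng(seed)
    pop = rng.uniform(-1.0, 1.0, size=(pop_size, dim))
    fit = np.array([loss(ind) for ind in pop])
    for _ in range(n_gen):
        for i in range(pop_size):
            # Mutation: difference of two random individuals added to a third.
            others = [j for j in range(pop_size) if j != i]
            a, b, c = pop[rng.choice(others, 3, replace=False)]
            mutant = a + F * (b - c)
            # Binomial crossover, forcing at least one gene from the mutant.
            mask = rng.random(dim) < CR
            mask[rng.integers(dim)] = True
            trial = np.where(mask, mutant, pop[i])
            # One-to-one greedy selection.
            f_trial = loss(trial)
            if f_trial <= fit[i]:
                pop[i], fit[i] = trial, f_trial
    return pop[fit.argmin()]

# Toy usage: minimize the sphere function.
print(de_optimize(lambda v: float(np.sum(v**2)), dim=5, n_gen=100))
```

In the training setting, each individual would encode the flattened synaptic parameters (for example, W and Theta from the sketch above), and loss would decode them back into a model and return its classification error on the training set.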

Five realistic classification problems are considered in our research to validate our model (DE-DMAS): Iris, BUPA liver disorders, breast cancer, glass, and Australian credit approval (ACA). All five datasets are preprocessed as binary-classification problems, including outlier repair to fill in missing values. We compare the experimental results of DE-DMAS, BP-DNM, and BPNN on these five datasets. The results show that DE-DMAS outperforms its peers in terms of test accuracy, sensitivity, specificity, receiver operating characteristic (ROC), and cross-validation.
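As a small illustration of the reported metrics, the following hypothetical helper computes accuracy, sensitivity, and specificity from binary labels; it is a sketch of the standard confusion-matrix definitions, not code from the thesis.

```python
def binary_metrics(y_true, y_pred):
    """Accuracy, sensitivity, and specificity from binary labels (0/1)."""
    pairs = list(zip(y_true, y_pred))
    tp = sum(t == 1 and p == 1 for t, p in pairs)
    tn = sum(t == 0 and p == 0 for t, p in pairs)
    fp = sum(t == 0 and p == 1 for t, p in pairs)
    fn = sum(t == 1 and p == 0 for t, p in pairs)
    accuracy = (tp + tn) / len(pairs)
    sensitivity = tp / (tp + fn) if tp + fn else 0.0  # true positive rate
    specificity = tn / (tn + fp) if tn + fp else 0.0  # true negative rate
    return accuracy, sensitivity, specificity

# Example: returns (0.6, 0.666..., 0.5) for these labels.
print(binary_metrics([1, 0, 1, 1, 0], [1, 0, 0, 1, 1]))
```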

References

[1] W. S. McCulloch and W. Pitts, “A logical calculus of the ideas immanent in nervous activity,” The Bulletin of Mathematical Biophysics, vol. 5, no. 4, pp. 115–133, 1943.

[2] N. Rochester, J. Holland, L. Haibt, and W. Duda, “Tests on a cell assembly theory of the action of the brain, using a large digital computer,” IRE Transactions on Information Theory, vol. 2, no. 3, pp. 80–93, 1956.

[3] F. Rosenblatt, “The perceptron: a probabilistic model for information storage and organization in the brain.” Psychological Review, vol. 65, no. 6, p. 386, 1958.

[4] M. Minsky and S. A. Papert, Perceptrons: An introduction to computational geometry. MIT Press, 2017.

[5] D. E. Rumelhart, G. E. Hinton, and R. J. Williams, “Learning internal representations by error propagation,” California Univ San Diego La Jolla Inst for Cognitive Science, Tech. Rep., 1985.

[6] C. Koch, “Computation and the single neuron,” Nature, vol. 385, no. 6613, p. 207, 1997.

[7] C. Koch and I. Segev, “The role of single neurons in information processing,” Nature Neuroscience, vol. 3, no. 11s, p. 1171, 2000.

[8] J. L. Davidson and F. Hummer, “Morphology neural networks: An introduction with applications,” Circuits, Systems and Signal Processing, vol. 12, no. 2, pp. 177–210, 1993.

[9] G. X. Ritter and P. Sussner, “An introduction to morphological neural networks,” in Proceedings of 13th International Conference on Pattern Recognition, vol. 4. IEEE, 1996, pp. 709–717.

[10] H. Sossa and E. Guevara, “Efficient training for dendrite morphological neural networks,” Neurocomputing, vol. 131, pp. 132–142, 2014.

[11] ——, “Modified dendrite morphological neural network applied to 3D object recognition,” in Mexican Conference on Pattern Recognition. Springer, 2013, pp. 314–324.

[12] Y. Todo, H. Tamura, K. Yamashita, and Z. Tang, “Unsupervised learnable neuron model with nonlinear interaction on dendrites,” Neural Networks, vol. 60, pp. 96–103, 2014.

[13] J. Ji, S. Gao, J. Cheng, Z. Tang, and Y. Todo, “An approximate logic neuron model with a dendritic structure,” Neurocomputing, vol. 173, pp. 1775–1783, 2016.

[14] Y. Tang, J. Ji, Y. Zhu, S. Gao, Z. Tang, and Y. Todo, “A differential evolution-oriented pruning neural network model for bankruptcy prediction,” Complexity, vol. 2019, Article ID 8682124, 2019.

[15] X. Qian, Y. Wang, S. Cao, Y. Todo, and S. Gao, “Mr2dnm: A novel mutual information-based dendritic neuron model,” Computational Intelligence and Neuroscience, vol. 2019, Article ID 7362931, 2019.

[16] T. Zhou, S. Gao, J. Wang, C. Chu, Y. Todo, and Z. Tang, “Financial time series prediction using a dendritic neuron model,” Knowledge-Based Systems, vol. 105, pp. 214–224, 2016.

[17] W. Chen, J. Sun, S. Gao, J. Cheng, J. Wang, and Y. Todo, “Using a single dendritic neuron to forecast tourist arrivals to Japan,” IEICE Transactions on Information and Systems, vol. 100, no. 1, pp. 190–202, 2017.

[18] J. Ji, S. Song, Y. Tang, S. Gao, Z. Tang, and Y. Todo, “Approximate logic neuron model trained by states of matter search algorithm,” Knowledge-Based Systems, vol. 163, pp. 120–130, 2018.

[19] Y. Yu, Y. Wang, S. Gao, and Z. Tang, “Statistical modeling and prediction for tourism economy using dendritic neural network,” Computational Intelligence and Neuroscience, vol. 2017, Article ID 7436948, 2017.

[20] L. Luo and D. D. O’Leary, “Axon retraction and degeneration in development and disease,” Annual Review of Neuroscience, vol. 28, pp. 127–156, 2005.

[21] P. Hagmann, O. Sporns, N. Madan, L. Cammoun, R. Pienaar, V. J. Wedeen, R. Meuli, J.-P. Thiran, and P. Grant, “White matter maturation reshapes structural connectivity in the late developing human brain,” Proceedings of the National Academy of Sciences, vol. 107, no. 44, pp. 19067–19072, 2010.

[22] C. Koch, T. Poggio, and V. Torre, “Retinal ganglion cells: a functional interpretation of dendritic morphology,” Philosophical Transactions of the Royal Society of London. B, Biological Sciences, vol. 298, no. 1090, pp. 227–263, 1982.

[23] ——, “Nonlinear interactions in a dendritic tree: localization, timing, and role in information processing,” Proceedings of the National Academy of Sciences, vol. 80, no. 9, pp. 2799–2802, 1983.

[24] L. Beaulieu-Laroche, E. H. Toloza, N. J. Brown, and M. T. Harnett, “Widespread and highly correlated somato-dendritic activity in cortical layer 5 neurons,” Neuron, vol. 103, no. 2, pp. 235–241, 2019.

[25] S. Gao, M. Zhou, Y. Wang, J. Cheng, H. Yachi, and J. Wang, “Dendritic neural model with effective learning algorithms for classification, approximation, and prediction,” IEEE Transactions on Neural Networks and Learning Systems, vol. 30, no. 2, pp. 601–614, 2019.

[26] X. Wang, Z. Tang, H. Tamura, M. Ishii, and W. Sun, “An improved backpropagation algorithm to avoid the local minima problem,” Neurocomputing, vol. 56, pp. 455–460, 2004.

[27] R. Storn and K. Price, “Differential evolution–a simple and efficient heuristic for global optimization over continuous spaces,” Journal of Global Optimization, vol. 11, no. 4, pp. 341–359, 1997.

[28] K.-S. Tang, K.-F. Man, S. Kwong, and Q. He, “Genetic algorithms and their applications,” IEEE Signal Processing Magazine, vol. 13, no. 6, pp. 22–37, 1996.

[29] D. J. Montana and L. Davis, “Training feedforward neural networks using genetic algorithms,” in International Joint Conferences on Artificial Intelligence (IJCAI), vol. 89, 1989, pp. 762–767.

[30] S. Gao, S. Song, J. Cheng, Y. Todo, and M. Zhou, “Incorporation of solvent effect into multi-objective evolutionary algorithm for improved protein structure prediction,” IEEE/ACM Transactions on Computational Biology and Bioinformatics, vol. 15, no. 4, pp. 1365–1378, 2018.

[31] S. Song, S. Gao, X. Chen, D. Jia, X. Qian, and Y. Todo, “AIMOES: Archive information assisted multi-objective evolutionary strategy for ab initio protein structure prediction,” Knowledge-Based Systems, vol. 146, pp. 58–72, 2018.

[32] J. Kennedy, “Particle swarm optimization,” Encyclopedia of Machine Learning, pp. 760–766, 2010.

[33] J. Vesterstrom and R. Thomsen, “A comparative study of differential evolution, particle swarm optimization, and evolutionary algorithms on numerical benchmark problems,” in Proceedings of the 2004 Congress on Evolutionary Computation, vol. 2. IEEE, 2004, pp. 1980–1987.

[34] S. Gao, Y. Yu, Y. Wang, J. Wang, J. Cheng, and M. Zhou, “Chaotic local search-based differential evolution algorithms for optimization,” IEEE Transactions on Systems, Man and Cybernetics: Systems, 2019, doi: 10.1109/TSMC.2019.2956121.

[35] J. Sun, S. Gao, H. Dai, J. Cheng, M. Zhou, and J. Wang, “Bi-objective elite differential evolution for multivalued logic networks,” IEEE Transactions on Cybernetics, vol. 50, no. 1, pp. 233–246, 2020.

[36] B. Subudhi and D. Jena, “A differential evolution based neural network approach to nonlinear system identification,” Applied Soft Computing, vol. 11, no. 1, pp. 861–871, 2011.

[37] E. Bas, “The training of multiplicative neuron model based artificial neural networks with differential evolution algorithm for forecasting,” Journal of Artificial Intelligence and Soft Computing Research, vol. 6, no. 1, pp. 5–11, 2016.

[38] F. Arce, E. Zamora, H. Sossa, and R. Barrón, “Differential evolution training algorithm for dendrite morphological neural networks,” Applied Soft Computing, vol. 68, pp. 303–313, 2018.

[39] B. Farley and W. Clark, “Simulation of self-organizing systems by digital computer,” Transactions of the IRE Professional Group on Information Theory, vol. 4, no. 4, pp. 76–84, 1954.

[40] T. H. Cormen, C. E. Leiserson, R. L. Rivest, and C. Stein, Introduction to Algorithms. MIT Press.

[41] F. Clautiaux, S. Hanafi, R. Macedo, M.-É. Voge, and C. Alves, “Iterative aggregation and disaggregation algorithm for pseudo-polynomial network flow models with side constraints,” European Journal of Operational Research, vol. 258, no. 2, pp. 467–477, 2017.

[42] B. Engquist and W. Schmid, Mathematics unlimited-2001 and beyond. Springer, 2017.

[43] M. J. Tyre and E. Von Hippel, “The situated nature of adaptive learning in organizations,” Organization Science, vol. 8, no. 1, pp. 71–83, 1997.

[44] Z. Sha, L. Hu, Y. Todo, J. Ji, S. Gao, and Z. Tang, “A breast cancer classifier using a neuron model with dendritic nonlinearity,” IEICE Transactions on Information and Systems, vol. 98, no. 7, pp. 1365–1376, 2015.

[45] T. Jiang, S. Gao, D. Wang, J. Ji, Y. Todo, and Z. Tang, “A neuron model with synaptic nonlinearities in a dendritic tree for liver disorders,” IEEJ Transactions on Electrical and Electronic Engineering, vol. 12, no. 1, pp. 105–115, 2017.

[46] E. Mezura-Montes, J. Velázquez-Reyes, and C. A. Coello Coello, “A comparative study of differential evolution variants for global optimization,” in Proceedings of the 8th Annual Conference on Genetic and Evolutionary Computation. ACM, 2006, pp. 485–492.

[47] J. Ilonen, J.-K. Kamarainen, and J. Lampinen, “Differential evolution training algorithm for feed-forward neural networks,” Neural Processing Letters, vol. 17, no. 1, pp. 93–105, 2003.

[48] S. Das and P. N. Suganthan, “Differential evolution: A survey of the state-of-the-art,” IEEE Transactions on Evolutionary Computation, vol. 15, no. 1, pp. 4–31, 2010.

[49] Y. Yu, S. Gao, Y. Wang, and Y. Todo, “Global optimum-based search differential evolution,” IEEE/CAA Journal of Automatica Sinica, vol. 6, no. 2, pp. 379–394, 2019.

[50] R. Gämperle, S. D. Müller, and P. Koumoutsakos, “A parameter study for differential evolution,” Advances in Intelligent Systems, Fuzzy Systems, Evolutionary Computation, vol. 10, no. 10, pp. 293–298, 2002.

[51] J. Ronkkonen, S. Kukkonen, and K. V. Price, “Real-parameter optimization with differential evolution,” in 2005 IEEE Congress on Evolutionary Computation, vol. 1. IEEE, 2005, pp. 506–513.

[52] D. Zaharie, “Influence of crossover on the behavior of differential evolution algorithms,” Applied Soft Computing, vol. 9, no. 3, pp. 1126–1138, 2009.

[53] S. Das and P. N. Suganthan, “Differential evolution: A survey of the state-of-the-art,” IEEE Transactions on Evolutionary Computation, vol. 15, no. 1, pp. 4–31, 2011.

[54] R. Jugulum, S. Taguchi et al., Computer-based robust engineering: essentials for DFSS. ASQ Quality Press, 2004.

[55] R. A. Fisher, “The use of multiple measurements in taxonomic problems,” Annals of Eugenics, vol. 7, no. 2, pp. 179–188, 1936.

[56] R. O. Duda, P. E. Hart, and D. G. Stork, Pattern classification and scene analysis. Wiley New York, 1973, vol. 3.

[57] B. V. Dasarathy, “Nosing around the neighborhood: A new system structure and classification rule for recognition in partially exposed environments,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. PAMI-2, no. 1, pp. 67–71, 1980.

[58] J. McDermott and R. S. Forsyth, “Diagnosing a disorder in a classification benchmark,” Pattern Recognition Letters, vol. 73, pp. 41–43, 2016.

[59] O. L. Mangasarian and W. H. Wolberg, “Cancer diagnosis via linear programming,” University of Wisconsin-Madison Department of Computer Sciences, Tech. Rep., 1990.

[60] W. H. Wolberg and O. L. Mangasarian, “Multisurface method of pattern separation for medical diagnosis applied to breast cytology,” Proceedings of the National Academy of Sciences, vol. 87, no. 23, pp. 9193–9196, 1990.

[61] I. W. Evett and J. S. Ernest, “Rule induction in forensic science,” Central Research Establishment, Home Office Forensic Science Service, Aldermaston, Reading, Berkshire RG7 4PN, Tech. Rep., 1987.

[62] J. R. Quinlan, “Simplifying decision trees,” International Journal of Man-Machine Studies, vol. 27, no. 3, pp. 221–234, 1987.

[63] Y. Tang, J. Ji, S. Gao, H. Dai, Y. Yu, and Y. Todo, “A pruning neural network model in credit classification analysis,” Computational Intelligence and Neuroscience, vol. 2018, Article ID 9390410, 2018.

[64] J. F. Khaw, B. Lim, and L. E. Lim, “Optimal design of neural networks using the Taguchi method,” Neurocomputing, vol. 7, no. 3, pp. 225–245, 1995.

[65] W. Yang and Y. Tarng, “Design optimization of cutting parameters for turning operations based on the Taguchi method,” Journal of Materials Processing Technology, vol. 84, no. 1-3, pp. 122–129, 1998.

[66] M. Friedman, “The use of ranks to avoid the assumption of normality implicit in the analysis of variance,” Journal of the American Statistical Association, vol. 32, no. 200, pp. 675–701, 1937.

[67] C. H. Yu, “Exploratory data analysis,” Methods, vol. 2, pp. 131–160, 1977.

[68] N. R. Cook, “Statistical evaluation of prognostic versus diagnostic models: beyond the ROC curve,” Clinical Chemistry, vol. 54, no. 1, pp. 17–23, 2008.

[69] S. Ma, H. Qiu, S. Hu, Y. Pei, W. Yang, D. Yang, and M. Cao, “Quantitative assessment of landslide susceptibility on the Loess Plateau in China,” Physical Geography, pp. 1–28, 2019.
