
Designing of array neuron-equivalentors with a quasi-universal activation function for creating a self-learning equivalent-convolutional neural structures

 V. Krasylenko, O. Lazariev, O. Sheremeta
Information Processing Systems (Системи обробки інформації). — 2019. — No. 1(156), pp. 82-91.
UDC 4.93
Article language: English
Annotation: In the paper, we consider the urgent need to create highly efficient hardware accelerators for machine learning algorithms, including convolutional and deep neural networks, and for associative memory models, clustering, and pattern recognition. We give a brief overview of our related works and of the advantages of equivalent models (EMs) for designing bio-inspired systems. Such EM paradigms are very promising for processing, clustering, recognizing, and storing large, strongly correlated, highly noised images, and for creating unsupervised learning machines. Since the basic nodes of EMs are vector-matrix (matrix-tensor) procedures with such continuous-logic operations as the normalized vector operations "equivalence", "non-equivalence", etc., we consider in this paper new conceptual approaches to the design of full-scale arrays of such neuron-equivalentors (NEs) with extended functionality, including different activation functions. Our approach is based on the use of analog and mixed (with special coding) methods for implementing the required operations, and on building NEs (with the number of synapses from 8 up to 128 and more) and their base cells and nodes on photosensitive elements and current mirrors. Simulation results show that the energy efficiency of NEs is estimated at not less than 10^12 analog operations per second per watt and can be increased further. The results confirm the feasibility of creating NEs and MIMO structures on their basis.
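The normalized vector "equivalence" operation named in the annotation can be illustrated with a short sketch. This is only an interpretation built on the common continuous-logic definition e(a, b) = max(min(a, b), min(1 - a, 1 - b)) for signals normalized to [0, 1]; the function names are hypothetical and the authors' circuit-level (current-mirror) realization may differ:

```python
def cl_equivalence(a, b):
    """Continuous-logic equivalence of two scalars in [0, 1]:
    e(a, b) = max(min(a, b), min(1 - a, 1 - b))."""
    return max(min(a, b), min(1 - a, 1 - b))

def normalized_vector_equivalence(xs, ws):
    """Normalized 'equivalence' of two vectors: the mean of the
    element-wise equivalences. It reaches 1.0 when both vectors are
    identical binary vectors and decreases as the vectors diverge."""
    assert len(xs) == len(ws)
    return sum(cl_equivalence(a, b) for a, b in zip(xs, ws)) / len(xs)

def neuron_equivalentor(xs, ws, activation=lambda e: e):
    """Toy neuron-equivalentor: the normalized equivalence of the input
    vector with the weight vector, passed through an activation function
    (identity by default, standing in for the quasi-universal one)."""
    return activation(normalized_vector_equivalence(xs, ws))
```

For binary patterns this measure behaves like a normalized Hamming similarity, which matches the role the annotation assigns to it in associative memory and clustering.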


Keywords: self-learning equivalent-convolutional neural structure, neuron-equivalentor, current mirror, hardware accelerators, equivalent model, continuous-logic, activation function, nonlinear processing, recognition

References

1. Krasilenko, V.G., Saletsky, F.M., Yatskovsky, V.I. and Konate, K. (1998), “Neperervno-lohichni ekvivalentnistni modeli arkhitektur neironnykh merezh Khemminha z adaptyvno-koreliatsiinym zvazhuvanniam” [Continuous logic equivalence models of Hamming neural network architectures with adaptive-correlated weighting], Proceedings of SPIE, Vol. 3402, pp. 398-408.
2. Krasilenko, V.G. and Magas, A.T. (1997), “Bahatoportova optychna asotsiatyvna pamiat na osnovi matrychno-matrychnykh ekvivalentoriv” [Multiport optical associative memory based on matrix-matrix equivalentors], Proceedings of SPIE, Vol. 3055, pp. 137-146.
3. Krasilenko, V.G., Lazarev, A. and Grabovlyak, S. (2012), “Proektuvannia ta modeliuvannia hetero-asotsiatyvnoi pamiati bahatoportovoi neironnoi merezhi dlia optychnoho rozpiznavannia obraziv” [Design and simulation of a multiport neural network heteroassociative memory for optical pattern recognitions], Proceedings of SPIE, Vol. 8398, 83980N-1.
4. Krasilenko, V.G., Lazarev, A.A. and Nikitovich, D.V. (2014), “Експериментальні дослідження методів кластеризації та вибору фрагментів зображень з використанням просторових інваріантних еквівалентністних моделей” [Experimental research of methods for clustering and selecting image fragments using spatial invariant equivalent models], Proceedings of SPIE, Vol. 9286, 928650.
5. Krasilenko, V.G., Nikolskyy, A.I. and Flavitskaya, J.A. (2010), “Struktury optychnykh neironnykh merezh na osnovi novykh matrychnykh ekvivalentnistnykh modelei (MTEM) i rezultaty yikh modeliuvannia” [The Structures of Optical Neural Nets Based on New Matrix-Tensor Equivalently Models (MTEMs) and Results of Modeling], Optical Memory and Neural Networks (Information Optics), Vol. 19 (1), pp. 31-38.
6. LeCun, Y. and Bengio, Y. (1995), Convolutional networks for images, speech, and time-series, The Handbook of Brain Theory and Neural Networks, MIT Press.
7. Shafiee, A. (2016), ISAAC: A Convolutional Neural Network Accelerator with In-Situ Analog Arithmetic in Crossbars, ACM/IEEE 43rd Annual International Symposium on Computer Architecture (ISCA), Seoul, pp. 14-26. https://doi.org/10.1109/ISCA.2016.12.
8. Di Zang, Zhenliang Chai, Junqi Zhang, Dongdong Zhang and Jiujun Cheng (2015), Vehicle license plate recognition using visual attention model and deep learning, Journal of Electronic Imaging 24(3), 033001. http://dx.doi.org/10.1117/1.JEI.24.3.033001.
9. Krasilenko, V.G., Lazarev, A.A. and Nikitovich, D.V. (2017), “Modeliuvannia ta mozhlyva realizatsiia samonavchalnykh ekvivalentnistno-zghortkovykh neironnykh struktur dlia avtokoduvannia-dekoduvannia ta klasteryzatsii zobrazhen” [Modeling and possible implementation of self-learning equivalence-convolutional neural structures for auto-encoding-decoding and clusterization of images], Proceedings of SPIE, Vol. 10453, 104532N.
10. Krasilenko, V.G., Lazarev, A.A. and Nikitovich, D.V. (2018), “Modeliuvannia biolohichno motyvovanykh samonavchalnykh ekvivalentno-zghortkovykh rekurentno-bahatosharovykh neironnykh struktur (BLM_SL_EC_RMNS) dlia klasteryzatsii ta rozpiznavannia frahmentiv zobrazhennia” [Modeling of biologically motivated self-learning equivalent-convolutional recurrent-multilayer neural structures (BLM_SL_EC_RMNS) for image fragments clustering and recognition], Proceedings of SPIE, Vol. 10609, MIPPR 2017: Pattern Recognition and Computer Vision, 106091D. https://doi.org/10.1117/12.2285797.
11. Krasilenko, V.G., Lazarev, A.A. and Nikitovich, D.V. (2018), “Proektuvannia ta modeliuvannia optoelektronnykh neiron-ekvivalentoriv yak aparatnykh pryskoriuvachiv samonavchalnykh ekvivalentnistnykh zghortkovykh neiro-struktur (SNEZNS)” [Design and simulation of optoelectronic neuron equivalentors as hardware accelerators of self-learning equivalent convolutional neural structures (SLECNS)], Proceedings of SPIE, Vol. 10689, 106890C.
12. Krasilenko, V.G., Lazarev, A.A. and Nikitovich, D.V. (2018), “Proektuvannia ta modeliuvannia elementiv masyvu dlia transformatsii intensyvnosti zobrazhennia ta koduvannia, shcho vykorystovuiutsia v protsesakh obrobky zmishanykh zobrazhen ta neironnykh merezh” [Design and simulation of array cells for image intensity transformation and coding used in mixed image processors and neural networks], Proceedings of SPIE, Vol. 10751, 1075119.
13. Schlottmann, C.R. and Hasler, P.E. (2011), A Highly Dense, Low Power, Programmable Analog Vector-Matrix Multiplier: The FPAA Implementation, IEEE Journal on Emerging and Selected Topics in Circuits and Systems, Vol. 1, No. 3, pp. 403-411. https://doi.org/10.1109/JETCAS.2011.2165755.
14. Lu, J., Young, S., Arel, I. and Holleman, J. (2015), A 1 TOPS/W analog deep machine-learning engine with floating-gate storage in 0.13 μm CMOS, IEEE Journal of Solid-State Circuits, Vol. 50, available at ieeexplore.ieee.org.
15. Lu, J. and Holleman, J. (2013), A floating-gate analog memory with bidirectional sigmoid updates in a standard digital process, Proc. ISCAS’13, pp. 1600-1603.
16. Lu, J., Young, S., Arel, I. and Holleman, J. (2013), An analog online clustering in 130 nm CMOS, Proc. A-SSCC, pp. 177-180.
17. Krasilenko, V.G., Ogorodnik, K.V., Nikolskyy, A.I. and Dubchak, V.N. (2011), “Simeistvo optoelektronnykh foto-strumovykh rekonfihurovanykh universalnykh (abo bahatofunktsionalnykh) lohichnykh elementiv (OPR ULE) na osnovi neperervno-lohichnykh operatsii (NLO) i viddzverkaliuvachiv strumu (VdS)” [Family of optoelectronic photocurrent reconfigurable universal (or multifunctional) logical elements (OPR ULE) on the basis of continuous logic operations (CLO) and current mirrors (CM)], Proceedings of SPIE, Vol. 8001, 80012Q.
18. Krasilenko, V.G., Nikolskyy, A.I. and Lazarev, A.A. (2015), “Proektuvannia ta modeliuvannia intelektualnoho bahatofunktsionalnoho neperervno-lohichnoho prystroiu yak bazovoi komirky suchasnykh vysokoproduktyvnykh sensornykh system z MIMO-strukturoiu” [Designing and simulation smart multifunctional continuous logic device as a basic cell of advanced high-performance sensor systems with MIMO-structure], Proceedings of SPIE, Vol. 9450, 94500N.

Reference:
Krasylenko, V.H., Lazariev, O.O. and Sheremeta, O.P. (2019), Designing of array neuron-equivalentors with a quasi-universal activation function for creating a self-learning equivalent-convolutional neural structures, Information Processing Systems, Vol. 1(156), pp. 82-91. https://doi.org/10.30748/soi.2019.156.11.