Title
Implementation of the RBF neural chip with the back-propagation algorithm for on-line learning.
Abstract
This article presents the hardware implementation of a floating-point processor (FPP) to develop a radial basis function (RBF) neural network for general-purpose pattern recognition and nonlinear control. The floating-point processor is designed on a field programmable gate array (FPGA) chip to execute the nonlinear functions required in the parallel calculation of the back-propagation algorithm. Internal weights of the RBF network are updated by the on-line learning back-propagation algorithm. The on-line learning process of the RBF chip is compared numerically with the results of an RBF neural network learning process written in MATLAB. The performance of the designed RBF neural chip is tested for real-time pattern classification of the XOR logic. Performance is evaluated by comparing results with those from MATLAB through extensive experimental studies.
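For reference, the following is a minimal software sketch (in Python/NumPy, not the paper's FPGA/VHDL or MATLAB implementations) of the technique the abstract describes: an RBF network whose Gaussian centers, widths, and output weights are updated sample by sample through on-line back-propagation, applied to the XOR classification benchmark. The network size, initialization, and learning rate are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR input patterns and binary targets (the benchmark task named in the abstract)
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
T = np.array([0., 1., 1., 0.])

# Illustrative network size and learning rate (assumptions, not from the paper)
n_hidden = 4
centers = rng.uniform(0.0, 1.0, size=(n_hidden, 2))   # Gaussian centers c_j
sigmas  = np.full(n_hidden, 0.5)                       # Gaussian widths sigma_j
weights = rng.uniform(-0.5, 0.5, size=n_hidden)        # output weights w_j
bias, lr = 0.0, 0.1

def forward(x):
    """Return (x - c_j), Gaussian activations phi_j, and the network output."""
    diff = x - centers
    phi = np.exp(-np.sum(diff**2, axis=1) / (2.0 * sigmas**2))
    return diff, phi, weights @ phi + bias

for epoch in range(2000):
    for x, t in zip(X, T):
        diff, phi, y = forward(x)
        e = t - y                                  # output error for this sample
        # On-line back-propagation: per-sample gradient-descent updates
        w_grad = e * phi
        c_grad = (e * weights * phi / sigmas**2)[:, None] * diff
        s_grad = e * weights * phi * np.sum(diff**2, axis=1) / sigmas**3
        weights += lr * w_grad
        bias    += lr * e
        centers += lr * c_grad
        sigmas  += lr * s_grad

for x, t in zip(X, T):
    print(x, "->", round(float(forward(x)[2]), 3), "target", t)
```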
Year
2015
DOI
10.1016/j.asoc.2014.12.018
Venue
Appl. Soft Comput.
Keywords
fpga, floating point processor
Field
Radial basis function, MATLAB, Floating-point unit, Computer science, Hierarchical RBF, Field-programmable gate array, XOR gate, Chip, Artificial intelligence, Artificial neural network, Machine learning
DocType
Journal
Volume
29
Issue
C
ISSN
1568-4946
Citations
7
PageRank
0.55
References
24
Authors
2
Name          Order  Citations  PageRank
Junseok Kim   1      293        38.33
Seul Jung     2      520        66.90