An Addition to Backpropagation for Computing Functional Roots


Contact
lkindermann [ at ] awi-bremerhaven.de

Abstract

Many processes are composed of an n-fold repetition of some simpler process. If the whole process can be modeled with a neural network, we present a method to derive a model of the basic process as well, thus performing not only a system identification but also a decomposition into basic blocks. Mathematically this is equivalent to the problem of computing iterative or functional roots: given the equation F(x) = f(f(x)) and an arbitrary function F(x), we seek a solution for f(x). A special topology of multilayer perceptrons and a simple addition to the delta rule of backpropagation allow most NN tools to compute good approximations. Applications range from data analysis within chaos theory to the optimization of industrial processes, where production lines such as steel mills often consist of several identical machines in a row.
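The paper's shared-weight perceptron topology is not reproduced here, but the core idea of the abstract, fitting a model f by gradient descent so that its twofold composition f(f(x)) matches a given F(x), can be sketched in a minimal, hypothetical form. The example below uses a one-parameter linear model f(x) = a·x and the known case F(x) = 4x, whose functional root is f(x) = 2x; the model class, target function, and learning rate are illustrative choices, not taken from the paper.

```python
import numpy as np

# Target process: F(x) = 4x, whose functional (iterative) root is f(x) = 2x.
def F(x):
    return 4.0 * x

rng = np.random.default_rng(0)
x = rng.uniform(0.0, 1.0, size=256)  # training inputs

a = 1.5   # initial slope of the candidate root f(x) = a * x
lr = 0.05 # learning rate (illustrative)

for _ in range(500):
    # Residual of the composition: f(f(x)) - F(x) = a^2 * x - 4x
    resid = a * (a * x) - F(x)
    # Gradient of mean squared residual w.r.t. the shared parameter a.
    # Note a appears in both applications of f, as in the paper's
    # weight-sharing idea; hence the factor 2a from d(a^2)/da.
    grad = np.mean(2.0 * resid * 2.0 * a * x)
    a -= lr * grad

print(a)  # converges close to 2.0, the slope of the functional root
```

With a multilayer perceptron in place of the linear model, the same principle applies: two copies of the network share one weight set, and backpropagation accumulates gradients from both copies.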



Item Type
Conference (Conference paper)
Authors
Divisions
Programs
Publication Status
Published
Event Details
Proceedings of the International ICSC/IFAC Symposium on Neural Computation (NC'98), Vienna.
Eprint ID
10383
Cite as
Kindermann, L. (1998): An Addition to Backpropagation for Computing Functional Roots, Proceedings of the International ICSC/IFAC Symposium on Neural Computation (NC'98), Vienna.



Research Platforms
N/A

Campaigns
N/A
