A framework for solving functional equations with neural networks


Contact
lkindermann [ at ] awi-bremerhaven.de

Abstract

In his essay towards a calculus of functions from 1815, Charles Babbage introduced a branch of mathematics now known as the theory of functional equations. But since then, finding specific solutions for a given functional equation has remained a hard task in many cases. For one of his examples, the now famous Babbage equation g(g(x)) = x, whose solutions g are called the roots of identity, and the more general equation g(g(x)) = f(x), which defines a kind of square root of a given function f, we have previously shown that this type of equation can be solved approximately by neural networks with a special topology and learning rule. Here we extend that method towards a wider range of functional equations which can be mapped in similar ways to neural networks, too. The method is demonstrated on - but not limited to - multilayer perceptrons. We present a first sketch of these ideas on some important equations.
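The core idea sketched in the abstract - composing the same network with itself and training it so that the composition matches f - can be illustrated in miniature. The following is an assumed, simplified setup (not the authors' exact architecture): the "network" g is a single shared linear weight w, stacked twice so that g(g(x)) = w·(w·x), and trained by gradient descent against the target f(x) = 4x, whose functional square root is g(x) = 2x.

```python
# Hypothetical minimal sketch of the weight-sharing idea: train a shared
# parameter w so that g(g(x)) approximates f(x), where g(x) = w * x.
# For f(x) = 4x the functional square root is g(x) = 2x, so w should
# converge to 2.0.

def f(x):
    return 4.0 * x

w = 1.0            # shared weight, used in both copies of g
lr = 0.001         # learning rate
xs = [0.5, 1.0, 1.5, 2.0]  # a few training points

for epoch in range(2000):
    for x in xs:
        y = w * (w * x)            # g(g(x)) with the shared weight
        err = y - f(x)             # residual against the target f
        grad = err * 2.0 * w * x   # gradient of the squared error w.r.t. w
                                   # (constant factors absorbed into lr)
        w -= lr * grad

print(round(w, 3))  # converges to 2.0
```

Note that the gradient accumulates contributions from both occurrences of w in the composition; in a real multilayer perceptron the same effect is obtained by sharing the weight matrices between the two stacked copies of the network during backpropagation.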



Item Type
Conference (Conference paper)
Authors
Kindermann, L., Lewandowski, A. and Protzel, P.
Peer review
Not peer-reviewed
Publication Status
Published
Event Details
Proceedings of the Eighth International Conference on Neural Information Processing (ICONIP'2001), Fudan University Press, Shanghai.
Eprint ID
10410
Cite as
Kindermann, L., Lewandowski, A. and Protzel, P. (2001): A framework for solving functional equations with neural networks, Proceedings of the Eighth International Conference on Neural Information Processing (ICONIP'2001), Fudan University Press, Shanghai.