A framework for solving functional equations with neural networks
In his 1815 "Essay towards the Calculus of Functions", Charles Babbage introduced a branch of mathematics now known as the theory of functional equations. Since then, finding explicit solutions of a given functional equation has remained a hard task in many cases. Consider two of his examples: the now famous Babbage equation g(g(x)) = x, whose solutions g are called roots of the identity, and the more general equation g(g(x)) = f(x), which defines a kind of square root of a given function f. We have previously shown that this type of equation can be solved approximately by neural networks with a special topology and learning rule. Here we extend that method to a wider range of functional equations which can be mapped to neural networks in similar ways. The method is demonstrated on, but not limited to, multilayer perceptrons. We present a first sketch of these ideas on some important equations.
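To make the idea concrete, here is a minimal, hypothetical sketch (not the authors' specific topology or learning rule) of solving g(g(x)) = f(x) with a small network: a one-hidden-layer perceptron g is trained so that its self-composition approximates f(x) = x + 2, whose exact functional square root is g(x) = x + 1. For simplicity the gradient is estimated by finite differences rather than backpropagation through the composition; all names and hyperparameters are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def f(x):
    # target function; its exact functional square root is g(x) = x + 1
    return x + 2.0

H = 8                          # hidden units (illustrative choice)
n_params = 3 * H + 1           # W1 (H), b1 (H), W2 (H), b2 (1), flattened

def g(x, p):
    # one-hidden-layer MLP with a skip connection g(x) = x + MLP(x),
    # which makes the identity easy to represent at initialization
    W1, b1, W2, b2 = p[:H], p[H:2*H], p[2*H:3*H], p[3*H]
    h = np.tanh(np.outer(x, W1) + b1)   # shape (n, H)
    return h @ W2 + b2 + x

def loss(p, x):
    # squared residual of the functional equation g(g(x)) = f(x)
    return np.mean((g(g(x, p), p) - f(x)) ** 2)

x = np.linspace(-1.0, 1.0, 32)          # training grid
p = 0.1 * rng.standard_normal(n_params)

# plain finite-difference gradient descent; backpropagation through
# the composed network g(g(x)) would be used in a real implementation
eps, lr = 1e-5, 0.05
for _ in range(500):
    grad = np.zeros_like(p)
    for i in range(n_params):
        dp = np.zeros_like(p)
        dp[i] = eps
        grad[i] = (loss(p + dp, x) - loss(p - dp, x)) / (2 * eps)
    p -= lr * grad

print(loss(p, x))   # residual of g(g(x)) - f(x) after training
```

The skip connection is a design convenience: it initializes g near the identity, so the optimization only has to learn the residual that shifts g toward the functional square root of f.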