hdl:10013/epic.20886
Computing iterative roots with second order training methods
Kindermann, Lars and Protzel, P.
Contact
lkindermann [ at ] awi-bremerhaven.de
Abstract
Iterative roots are a valuable tool for modeling and analyzing dynamical systems. They provide a natural way to construct a continuous-time model from discrete-time data. However, in most cases they are extremely difficult to compute analytically. Previously we demonstrated how to use neural networks to calculate the iterative roots and fractional iterations of functions, using a special MLP topology together with weight sharing. This paper shows how adding a regularization term to the error function can direct any backpropagation-based training method to the same result, but in a fraction of the epochs when advanced second-order learning rules are used.
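To make the construction concrete, the sketch below illustrates the basic weight-sharing idea: a single MLP g is applied twice and the composition g(g(x)) is fitted to a target map f, so that the trained g approximates an iterative (square) root of f. This is only a minimal illustration, not the authors' implementation: it assumes PyTorch, an arbitrary example function f, and uses the quasi-Newton optimizer L-BFGS as a stand-in for the second-order training rules discussed in the paper; the paper's regularization term is not reproduced here.

import torch
import torch.nn as nn

def f(x):
    # Illustrative target map (an assumption, not taken from the paper).
    return 0.5 * x + 0.25 * torch.sin(x)

# One shared MLP g; applying it twice reuses the same parameters,
# which is the weight-sharing idea behind the composed network g(g(x)).
g = nn.Sequential(nn.Linear(1, 32), nn.Tanh(), nn.Linear(32, 1))

x = torch.linspace(-3.0, 3.0, 200).unsqueeze(1)
y = f(x)

# L-BFGS as an example of a (quasi-)second-order training method.
opt = torch.optim.LBFGS(g.parameters(), lr=0.5, max_iter=200)
loss_fn = nn.MSELoss()

def closure():
    opt.zero_grad()
    loss = loss_fn(g(g(x)), y)  # error of the composed network against f
    loss.backward()
    return loss

opt.step(closure)
print("final loss:", closure().item())
# After training, g is an approximate iterative root: g(g(x)) ~ f(x).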
Item Type
Conference (Conference paper)
Authors
Kindermann, Lars and Protzel, P.
Publication Status
Published
Event Details
Proceedings of the International Joint Conference on Neural Networks (IJCNN'2001), Washington, DC.
Eprint ID
10409
Cite as
Kindermann, L. and Protzel, P. (2001): Computing iterative roots with second order training methods, Proceedings of the International Joint Conference on Neural Networks (IJCNN'2001), Washington, DC.
Campaigns
N/A