Natural Interpolation of Time Series
To interpolate data sampled in finite, discrete time steps into a continuous signal, e.g. for resampling, a model normally has to be introduced for this purpose, such as linear interpolation, splines, etc. In this paper we attempt to derive a natural method of interpolation, where the correct model is derived from the data itself, using some general assumptions about the underlying process. Applying the formalism of generalized iteration, iteration semigroups and iterative roots, we characterize when such a natural interpolation exists for a given time series and give a method for its calculation: an exact one for linear autoregressive time series, and a neural network approximation for the general nonlinear case.
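To illustrate the idea of interpolation via iterative roots, consider a minimal sketch (my own hypothetical example, not taken from the paper): for a linear AR(1) process with one-step map f(x) = a·x, the iterative square root g(x) = √a·x satisfies g(g(x)) = f(x), so applying g once gives a "natural" value at the half step.

```python
import numpy as np

# Hypothetical illustration: for a linear AR(1) process x_{t+1} = a * x_t,
# the one-step map f(x) = a*x has the iterative square root g(x) = sqrt(a)*x,
# since g(g(x)) = a*x. Applying g once yields a half-step interpolant
# consistent with the dynamics, rather than an ad hoc model like a spline.

a = 0.81                       # assumed AR(1) coefficient, 0 < a < 1
f = lambda x: a * x            # one-step evolution map
g = lambda x: np.sqrt(a) * x   # iterative square root of f

x0 = 2.0
x1 = f(x0)        # value after one full time step
x_half = g(x0)    # interpolated value at the half step

# Consistency check: composing the root twice recovers the full step.
assert np.isclose(g(g(x0)), x1)
print(x_half)  # 1.8 (= 2.0 * 0.9)
```

The same construction generalizes to fractional iterates f^s for any s in [0, 1], which is the iteration-semigroup viewpoint the abstract refers to; for nonlinear maps no closed form exists in general, motivating the approximation approach.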