A Generic Multilevel Architecture for Time Series Prediction

Abstract—Rapidly evolving businesses generate massive amounts of time-stamped data
sequences, creating demand for both univariate and multivariate time series forecasting.
For such data, traditional predictive models based on autoregression are often not sufficient
to capture complex nonlinear relationships between multidimensional features and the time
series outputs. In order to exploit these relationships for improved time series forecasting
while also better dealing with a wider variety of prediction scenarios, a forecasting system
requires a flexible and generic architecture to accommodate and tune various individual
predictors as well as combination methods. In response to this challenge, an architecture for
combined, multilevel time series prediction is proposed, which is suitable for many different
universal regressors and combination methods. The key strength of this architecture is its
ability to build a diversified ensemble of individual predictors that form an input to a
multilevel selection and fusion process before the final optimized output is obtained.
Excellent generalization ability is achieved through the strong complementarity of the
individual models, further reinforced by cross-validation-linked training on exclusive
data subsets and by postprocessing of the ensemble output. In a sample configuration
with basic neural network predictors and a mean combiner, the proposed system has been
evaluated in different scenarios and has shown a clear gain in prediction performance.
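The sample configuration described above (individual predictors trained on exclusive cross-validation subsets, fused by a mean combiner) can be illustrated with the following minimal sketch. This is not the authors' implementation: for brevity it substitutes a least-squares AR(1) fit for the neural network predictors, and the function names are assumptions.

```python
def fit_ar1(series):
    # Least-squares AR(1) fit: y[t] ~ a * y[t-1] + b
    # (stand-in for the paper's neural network predictors)
    xs, ys = series[:-1], series[1:]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    a = cov / var if var else 0.0
    b = my - a * mx
    return lambda last: a * last + b

def ensemble_forecast(series, n_folds=3):
    # Level 1: train each individual predictor on an exclusive data subset,
    # which promotes diversity (complementarity) across the ensemble
    fold = len(series) // n_folds
    predictors = [fit_ar1(series[i * fold:(i + 1) * fold])
                  for i in range(n_folds)]
    # Level 2: fuse the individual one-step-ahead outputs with a mean combiner
    preds = [p(series[-1]) for p in predictors]
    return sum(preds) / len(preds)
```

In the full architecture, further selection and fusion levels would operate on these intermediate outputs before the final prediction is produced.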
INTRODUCTION
Recent e-revolution has led to a situation in which many businesses and organizations
continuously generate massive amounts of data which constitute univariate and
multivariate time series. Predicting future values of such time series is vital for gaining
competitive advantage in the case of businesses. Time series forecasting is a very
challenging signal processing problem, as in real situations, it is typically a function of a
large number of variables most of which are unknown or inaccessible at the time of
prediction. Although these series usually appear as very noisy, nonstationary and nonlinear
signals, their histories may carry significant evidence that can be used to build a predictive
model [7], [30], [6].
www.frontlinetechnologies.org
[email protected]
+91 7200247247
Architecture Diagram: [figure not reproduced]
CONCLUSION
This work promotes a new architecture for time series prediction, tackling the recently
arising challenge of a generally increasing volume of time series data that exhibits
complex nonlinear relationships between its multidimensional features and outputs. It
combines a multilevel architecture of highly robust and diversified individual prediction
models with operators for fusion and selection that can be applied at any level of the
structure. Additionally, the system applies an intelligent smoothing algorithm as an
example of a postprediction step that often leads to significant performance gains,
particularly when the predicted time series contains a significant noise component.
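As an illustration of such a postprediction smoothing step, a simple exponential smoother applied to the raw ensemble outputs might look as follows. This is a minimal sketch, not the paper's actual smoothing algorithm; the function name and the fixed smoothing factor alpha are assumptions.

```python
def exp_smooth(preds, alpha=0.3):
    # Simple exponential smoothing of a prediction sequence:
    #   s[t] = alpha * x[t] + (1 - alpha) * s[t-1]
    # Smaller alpha suppresses more of the noise component
    # at the cost of a slower response to genuine changes.
    smoothed = [preds[0]]
    for x in preds[1:]:
        smoothed.append(alpha * x + (1 - alpha) * smoothed[-1])
    return smoothed
```

In practice, alpha would be tuned on validation data so that smoothing removes noise without washing out the signal the predictors have captured.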
REFERENCES
1. M. Aiolfi and A. Timmermann, "Persistence in Forecasting Performance and
Conditional Combination Strategies," J. Econometrics, vol. 127, nos. 1/2, pp. 31-53,
2006.
2. V. Assimakopoulos and K. Nikolopoulos, "The Theta Model: A Decomposition
Approach to Forecasting," Int'l J. Forecasting, vol. 16, no. 4, pp. 521-530, 2000.
3. D.W. Bunn, "A Bayesian Approach to the Linear Combination of Forecasts,"
Operational Research Quarterly, vol. 26, no. 2, pp. 325-329, 1975.
4. M. Casdagli, "Nonlinear Prediction of Chaotic Time Series," Physica D, vol. 35, pp.
335-356, 1989.
5. E.S. Gardner, "Exponential Smoothing: The State of the Art—Part II," Int'l J.
Forecasting, vol. 22, no. 4, pp. 637-666, 2006.
6. C.L. Giles, S. Lawrence, and A.C. Tsoi, "Noisy Time Series Prediction Using Recurrent
Neural Networks and Grammatical Inference," Machine Learning, vol. 44, no. 1, pp.
161-183, 2001.
7. T. Dietterich and R. Michalski, "Learning to Predict Sequences," Machine Learning:
An Artificial Intelligence Approach, R. Michalski, J. Carbonell, and T. Mitchell, eds.,
vol. 2, pp. 63-106, Morgan Kaufmann, 1986.
8. L.K. Hansen and P. Salamon, "Neural Network Ensembles," IEEE Trans. Pattern
Analysis and Machine Intelligence, vol. 12, no. 10, pp. 993-1001, Oct. 1990.
9. R.J. Hyndman and B. Billah, "Unmasking the Theta Method," Int'l J. Forecasting, vol.
19, no. 2, pp. 287-290, 2003.
10. C. Igel and M. Hüsken, "Improving the RPROP Learning Algorithm," Proc. Second
Int'l Computer Science Conventions (ICSC) Symp. Neural Computation, pp. 115-121,
2000.