
Alphabetical list
A
AR(1)
The AR(1) process is defined as
Wt = φ1Wt-1 + et = Ft + et
where Wt is a stationary time series, et is a white noise error term, and Ft is called the forecasting function.
(figure V.I.1-1)
We can now easily observe what the theoretical ACF of an AR(1) process should look like.
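A minimal Python sketch of this definition, assuming an arbitrary coefficient φ1 = 0.7 (the function name below is only illustrative); the sample ACF should decay roughly geometrically, like φ1, φ1², φ1³, ...:

```python
import numpy as np

def simulate_ar1(phi=0.7, n=500, seed=0):
    """Simulate W_t = phi * W_{t-1} + e_t with Gaussian white noise e_t."""
    rng = np.random.default_rng(seed)
    e = rng.normal(size=n)
    w = np.zeros(n)
    for t in range(1, n):
        w[t] = phi * w[t - 1] + e[t]
    return w

w = simulate_ar1()
# Sample autocorrelations at lags 1..5: roughly 0.7, 0.49, 0.34, ... (geometric decay)
print([round(np.corrcoef(w[k:], w[:-k])[0, 1], 2) for k in range(1, 6)])
```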
AR(2)
The AR(2) process is defined as
Wt = φ1Wt-1 + φ2Wt-2 + et = Ft + et
where Wt is a stationary time series, et is a white noise error term, and Ft is the forecasting function.
The theoretical ACF and PACF are illustrated below. Figure (V.I.1-2) contains two possible ACF and PACF patterns for real roots, while figure (V.I.1-3) shows the ACF and PACF patterns when the roots are complex.
(figure V.I.1-2)
(figure V.I.1-3)
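For the complex-root case, a small sketch with hypothetical coefficients φ1 = 1.0 and φ2 = -0.5 (values chosen here only because they give complex roots) shows the damped-sinusoid ACF pattern:

```python
import numpy as np

def simulate_ar2(phi1=1.0, phi2=-0.5, n=1000, seed=0):
    """Simulate W_t = phi1*W_{t-1} + phi2*W_{t-2} + e_t; these defaults yield complex roots."""
    rng = np.random.default_rng(seed)
    e = rng.normal(size=n)
    w = np.zeros(n)
    for t in range(2, n):
        w[t] = phi1 * w[t - 1] + phi2 * w[t - 2] + e[t]
    return w

w = simulate_ar2()
# The sample ACF oscillates like a damped sine wave, as in figure (V.I.1-3).
print([round(np.corrcoef(w[k:], w[:-k])[0, 1], 2) for k in range(1, 9)])
```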
ARIMA model
Auto Regressive Integrated Moving Average model. An ARIMA model is the statistical formula that (adequately) describes a non-stationary time series. Hence, it combines lambda transformations, integration processes, and ARMA models.
Auto Correlation Function (ACF) of a time series
is the series of values that correspond to the correlation of the time series and its own past:
first value = correlation between Yt and Yt-1
second value = correlation between Yt and Yt-2
third value = correlation between Yt and Yt-3
etc...
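A short Python sketch of this definition (an illustration only; the helper name acf is hypothetical):

```python
import numpy as np

def acf(y, max_lag=12):
    """Correlation of the series with its own past, at lags 1..max_lag."""
    y = np.asarray(y, dtype=float) - np.mean(y)
    c0 = np.sum(y * y)                             # lag-0 autocovariance (times n)
    return np.array([np.sum(y[k:] * y[:-k]) / c0 for k in range(1, max_lag + 1)])

# Example: a random walk is heavily autocorrelated, so its ACF decays very slowly.
rng = np.random.default_rng(1)
y = np.cumsum(rng.normal(size=200))
print(np.round(acf(y, 5), 2))
```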
B
backforecasts
are forecasts of past (unobserved) values of the stationary time series. Backforecasts are
used in the estimation algorithm and must converge to zero. If convergence is not
achieved, the estimated parameters are biased.
Box-Cox transformation (lambda-transform) of a time series: this is a
mathematical function that - if properly used - may induce stability of the Standard
Deviation (satisfying the second condition of stationarity).
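A minimal sketch of the usual Box-Cox formula, (y^λ − 1)/λ for λ ≠ 0 and log(y) for λ = 0 (the function name and the example series below are only illustrative):

```python
import numpy as np

def box_cox(y, lam):
    """Box-Cox (lambda) transform of a strictly positive series y."""
    y = np.asarray(y, dtype=float)
    if lam == 0:
        return np.log(y)                 # limiting case: the log transform
    return (y ** lam - 1.0) / lam

# A log transform (lambda = 0) often stabilises a Standard Deviation that
# grows in proportion to the level of the series.
sales = np.array([100.0, 120, 150, 200, 260, 340])
print(np.round(box_cox(sales, 0.0), 3))
```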
C
Cumulative periodogram
relates the cumulative intensity of independent cyclic waves that are present in the time
series to their periods (= length of a cyclic wave).
The spectrum chart provides a visual representation of the relationship between:
 the frequency of all sinusoids that are contained in the time series
 the intensity at which every sinusoid is present in the time series
The weighted sum of all sinusoids (whose frequencies appear on the x-axis) is equal to the original time series. The weights of this sum are equal to the intensities (y-axis) of each respective sinusoid.
The decomposition of a time series into its independent sinusoids is called "spectral
analysis".
The cumulative periodogram is the visual representation of the relationship between the
period of all sinusoids (x-axis) versus the cumulative intensity of all sinusoids that have a
period equal to, or larger than the period of any given sinusoid.
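A rough sketch of this construction using the discrete Fourier transform (the intensity measure used here is the raw periodogram ordinate, which is a common but not the only possible choice):

```python
import numpy as np

def cumulative_periodogram(y):
    """Periods and cumulative intensity, accumulated from the longest period downwards."""
    y = np.asarray(y, dtype=float)
    y = y - y.mean()
    n = len(y)
    freqs = np.fft.rfftfreq(n)[1:]                    # Fourier frequencies, excluding 0
    intensity = np.abs(np.fft.rfft(y))[1:] ** 2 / n   # periodogram ordinates
    periods = 1.0 / freqs                             # period = 1 / frequency
    cum = np.cumsum(intensity) / np.sum(intensity)    # share of intensity at periods >= this one
    return periods, cum

# Monthly series with a yearly cycle: most intensity accumulates near period 12.
t = np.arange(240)
y = np.sin(2 * np.pi * t / 12) + 0.5 * np.random.default_rng(2).normal(size=240)
periods, cum = cumulative_periodogram(y)
print(np.round(periods[15:25], 1), np.round(cum[15:25], 2))
```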
Cyclic waves
are regular (periodic) movements that cause the time series to go up and down. Cyclic
waves can be modeled by the use of sinusoids which are weighted sums of a cosine and
sine of a given period. The period of a sinusoid is the length (measured in time) that is
needed to complete a single cycle: the time that passes between two nearest tops of the
sinusoid. The amplitude of a cyclic wave is a measure of the distance between the top and the bottom of the sinusoid.
D
The distribution function
The distribution function of a random variable Z is the function that gives the probability of Z being less than or equal to a real number z:
F(z) = P(Z ≤ z)
Drifted Random-Walk
a time series where each observation is equal to the previous observation plus a fixed
constant number (which may be negative) plus an on-average-zero normal random
number (which may be negative). The formula of the Drifted Random-Walk process is: Yt = Yt-1 + c + et where c is a fixed number, and et is a random value (drawn from a normal distribution with zero mean).
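A minimal simulation sketch (the drift c and noise scale are arbitrary example values; setting c = 0 gives the plain Random-Walk):

```python
import numpy as np

def drifted_random_walk(n=100, c=0.5, sigma=1.0, seed=0):
    """Y_t = Y_{t-1} + c + e_t, starting from Y_0 = 0; c = 0 gives an ordinary Random-Walk."""
    rng = np.random.default_rng(seed)
    e = rng.normal(scale=sigma, size=n)
    return np.cumsum(c + e)              # cumulative sum of drift plus noise

print(np.round(drifted_random_walk()[:5], 2))
```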
Drifted Seasonal Walk
a time series where each observation is equal to last year's corresponding observation
(same month, or same quarter) plus an on-average-zero normal random number (which
may be negative) plus a constant term (which may be negative). The formula of the Drifted Seasonal Walk process is: Yt = Yt-s + c + et where c is a constant, and et is a random value (drawn from a normal distribution with zero mean).
E
estimated residuals
are the computed interpolation forecast errors of an ARMA model. The estimated
residuals should behave like a (Gaussian) white noise variable.
explicit equation in terms of stochastic innovations
is the equation that relates future realizations of the time series under investigation Yt to
past stochastic innovations (et).
explicit forecasting function
is the equation that relates the forecast Ft to past observations of Yt.
F
Frequency of a cyclic wave
is the inverse of the period of a wave-like movement. If, for instance, a short-term business cycle exists with an average cycle period of 8 months, then the frequency of that cycle is equal to 1/8 = 0.125. This means that every single monthly observation of the time series under investigation in fact represents 12.5% of a business cycle that lasts for about 8 months.
I
implicit ARIMA equation
is written in the form of an ARIMA(p,d,q)(P,D,Q) model with multiplicative seasonality (in standard Box-Jenkins notation):
φp(B) ΦP(B^s) (1 - B)^d (1 - B^s)^D Yt = θq(B) ΘQ(B^s) et
where B is the backshift operator (B Yt = Yt-1), φp and ΦP are the non-seasonal and seasonal AR polynomials, θq and ΘQ are the non-seasonal and seasonal MA polynomials, and d and D are the non-seasonal and seasonal differencing orders.
Integration process (Integration model)
if a time series exhibits a 'strong' form of non-seasonal (first-order, second-order, third-order, etc...) autocorrelation, and if the trend-like behaviour of the series can be removed by the use of non-seasonal differencing, the time series is said to be generated by an Integration process (or the time series can be modelled by an Integration model).
invertibility of MA processes
implies that the parameters of the MA process lie inside the acceptable region of
parameter combinations that result in a non-explosive forecast.
L
lambda-transform of a time series
this is a mathematical function that - if properly used - may induce stability of the
Standard Deviation (satisfying the second condition of stationarity).
M
MA(1)
The definition of the MA(1) process is given by
Wt = et - θ1et-1 (V.I.1-139)
where Wt is a stationary time series, et is a white noise error component, and Ft is the forecasting function.
The theoretical ACF and PACF for the MA(1) are illustrated in figure (V.I.1-4).
MA(2)
By definition the MA(2) process is
Wt = et - θ1et-1 - θ2et-2
which can be rewritten as
Wt = (1 - θ1B - θ2B^2) et
where Wt is a stationary time series, et is a white noise error component, and Ft is the forecasting function.
The two possible ACF and PACF patterns are shown in figures (V.I.1-5) and (V.I.1-6).
(figure V.I.1-5)
(figure V.I.1-6)
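A brief simulation sketch (coefficients chosen arbitrarily, and written here with plus signs; the sign convention for MA parameters differs between texts). It illustrates the defining ACF feature of an MA(q) process: sample autocorrelations near zero beyond lag q = 2.

```python
import numpy as np

def simulate_ma2(theta1=0.6, theta2=0.3, n=2000, seed=0):
    """Simulate W_t = e_t + theta1*e_{t-1} + theta2*e_{t-2} with Gaussian white noise."""
    rng = np.random.default_rng(seed)
    e = rng.normal(size=n + 2)
    return e[2:] + theta1 * e[1:-1] + theta2 * e[:-2]

w = simulate_ma2()
# Non-zero sample ACF at lags 1 and 2, approximately zero afterwards.
print([round(np.corrcoef(w[k:], w[:-k])[0, 1], 2) for k in range(1, 6)])
```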
Most Probable Change due to Chance
is equal to the standard error of the 1-step ahead forecast of a statistical model that
adequately describes the time series. In a non-drifted non-seasonal random-walk time
series for instance, the most probable change due to chance is equal to the standard
error of the first order non-seasonal difference (d=1) of the time series.
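A small sketch of the random-walk case described above (the series and its noise scale are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(3)
y = np.cumsum(rng.normal(scale=2.0, size=500))     # a non-drifted random walk

# Most probable change due to chance = standard error of the first
# non-seasonal difference (d = 1) of the series.
most_probable_change = np.std(np.diff(y), ddof=1)
print(round(most_probable_change, 2))              # close to the true noise scale, 2.0
```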
P
Partial Auto Correlation Function (PACF)
is a special kind of Auto Correlation Function. The correlation coefficients are computed such that all correlations are independent of each other. This implies that the correlations in the PACF are not mutually correlated (unlike in the ordinary ACF).
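One common way to obtain the lag-k partial autocorrelation is as the last coefficient of an AR(k) regression fitted by least squares; the sketch below (the helper name pacf is illustrative) uses that approach:

```python
import numpy as np

def pacf(y, max_lag=5):
    """Lag-k partial autocorrelation = last coefficient of a least-squares AR(k) fit."""
    y = np.asarray(y, dtype=float) - np.mean(y)
    out = []
    for k in range(1, max_lag + 1):
        # Regress y_t on y_{t-1}, ..., y_{t-k}; column j holds lag j+1.
        X = np.column_stack([y[k - j - 1: len(y) - j - 1] for j in range(k)])
        coef, *_ = np.linalg.lstsq(X, y[k:], rcond=None)
        out.append(coef[-1])                        # coefficient of y_{t-k}
    return np.array(out)

# For an AR(1) series the PACF cuts off after lag 1.
rng = np.random.default_rng(4)
w = np.zeros(1000)
for t in range(1, 1000):
    w[t] = 0.7 * w[t - 1] + rng.normal()
print(np.round(pacf(w, 4), 2))
```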
Probability function
The probability function of the random variable Z, denoted by f(z), is the function that gives the probability of Z taking the value z, for any real number z:
f(z) = P(Z = z)
R
Random-Walk
a time series where each observation is equal to the previous observation plus an on-average-zero normal random number (which may be negative). The formula of the Random-Walk process is: Yt = Yt-1 + et where et is a random value (drawn from a normal distribution with zero mean).
S
Seasonal Integration process
if a time series exhibits a 'strong' form of seasonal autocorrelation, and if the (strongly)
seasonal behaviour of the series can be removed by the use of seasonal differencing, the
time series is said to be generated by a Seasonal Integration process (or the time series
can be modelled by a Seasonal Integration model).
Seasonal Walk
a time series where each observation is equal to last year's corresponding observation
(same month, or same quarter) plus an on-average-zero normal random number (which
may be negative). The formula of the Seasonal Walk process is: Yt = Yt-s + et where et is a random value (drawn from a normal distribution with zero mean).
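A minimal simulation sketch for monthly data (s = 12; the starting year is drawn arbitrarily):

```python
import numpy as np

def seasonal_walk(n=60, s=12, seed=0):
    """Y_t = Y_{t-s} + e_t: each observation repeats last year's value plus noise."""
    rng = np.random.default_rng(seed)
    y = np.zeros(n)
    y[:s] = rng.normal(size=s)           # an arbitrary first year of observations
    for t in range(s, n):
        y[t] = y[t - s] + rng.normal()
    return y

print(np.round(seasonal_walk()[:13], 2))
```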
Sinusoid
is a regular wave-like movement with a fixed period (or frequency), and a fixed
amplitude. In fact it can be computed as a "weighted sum" of a sine and a cosine
function, at a given frequency.
Spectrum of a time series
relates the intensity (= amplitude in %) of independent cyclic waves that are present in
the time series to their frequencies (= 1 / length of a cyclic wave). The spectrum can be
interpreted as a "fingerprint" of the time series, and it enables the researcher to identify the strongest (i.e., most important) cyclic waves of the time series under investigation.
stability of AR processes
implies that the parameters of the AR process lie inside the acceptable region of
parameter combinations that result in a non-explosive forecast.
Standard Deviation-Mean Plot (SMP)
is the scatter plot of the Standard Deviation versus Mean of sequential sections of the
time series. Each section contains the same number of observations, preferably s periods
(s = 12 for monthly time series). The regression line of the SMP illustrates how the
Standard Deviation is related to the level (=mean) of the time series.
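A sketch of how the SMP points could be computed (the generated example series is hypothetical, with noise proportional to its level):

```python
import numpy as np

def smp_points(y, s=12):
    """Mean and standard deviation of sequential sections of s observations each."""
    y = np.asarray(y, dtype=float)
    sections = [y[i:i + s] for i in range(0, len(y) - len(y) % s, s)]
    means = np.array([sec.mean() for sec in sections])
    sds = np.array([sec.std(ddof=1) for sec in sections])
    return means, sds

# Noise proportional to the level: the SMP regression line has a clearly positive
# slope, which suggests that a lambda transform is needed.
rng = np.random.default_rng(5)
level = np.exp(np.linspace(0.0, 2.0, 120))
y = level * (1 + 0.1 * rng.normal(size=120))
means, sds = smp_points(y)
print(round(np.polyfit(means, sds, 1)[0], 3))      # slope of the SMP regression line
```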
Stationary Time Series
is a time series that satisfies the following two conditions:
 first condition: the time series is NOT Integrated; its variance cannot be reduced
by applying any combination of seasonal and non-seasonal differencing. (Note:
this is only a "rule of thumb", not a formal or scientific description!!!)
 second condition: the most probable change due to chance is constant over
time. This implies a constant standard error over the whole time period of the
time series. This condition is imposed in order to be able to easily differentiate
between changes that are due to chance, and changes that can be attributed to
some important exogenous event.
Stationary Time Series (revised definition): is a time series that satisfies the
following two conditions:
 first condition: the time series is NOT Integrated. This implies that the Auto
Correlation Function MUST NOT contain a series of slowly decreasing:
o non-seasonal autocorrelation coefficients (at time lag 1, 2, 3, etc...)
o seasonal autocorrelation coefficients (at time lag s, 2s, 3s, ... with s =
seasonal period)
Alternatively this condition implies that:
o the Spectrum does not show evidence of strong (important) cyclic waves of
very low frequency (long periods)
o the Spectrum does not show any evidence of strong cyclic waves of
seasonal frequency (or periods s, s/2, s/3, s/4, ... with s = seasonal
period)
 second condition: the most probable change due to chance is constant over time. This implies a constant standard error over the whole time period of the time series. This condition is imposed in order to be able to easily differentiate between changes that are due to chance, and changes that can be attributed to some important exogenous event.
stochastic innovations
are unsystematic (random) resultants of an unlimited number of (independent) causal
impulses. By definition, stochastic innovations are unpredictable; only the probability
density function is (assumed to be) known. Under general (and weak) conditions it can
be assumed that stochastic innovations are normally distributed (cf. Gaussian white
noise).
Stochastic process
A statistical time series can be considered to be the result of some underlying stochastic process. The process is represented by a mathematical model. The
time series is a single realization of the generating process.
T
Time series
A time series is a set of observations ordered in time. In most cases (and in the context
of this course) it is assumed that time series are equi-distant.
An equi-distant time series is discrete, with observations Yt at times t = 1, 2, 3, ..., T,
where T is the length of the time series (= number of observations).
Example 1: Monthly sales data are assumed to be equi-distant (even though not every month has the same number of working days).
Example 2: Quotes on the stock market are not equi-distant because the interval
between two sequential quotes is always different (this may range from a small fraction
of 1 second to 30 seconds or even more). Since equi-distant time series are much easier
to use, most stock market data providers compute aggregated quotes and prices: most
commonly they offer 1, 2, 5, 10, 20, 60-minute data, daily averages, or daily closing
prices.
V
Variance Reduction Matrix
a table containing the variances of the time series under investigation after various combinations of non-seasonal and seasonal differencing (of various degrees). This table is used to identify the differencing combination that yields the lowest variance (i.e., explains the time series best).
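A compact sketch of how such a table could be computed (the function and argument names are illustrative):

```python
import numpy as np

def variance_reduction_matrix(y, s=12, max_d=2, max_D=2):
    """Variance of the series after d non-seasonal and D seasonal differences, all combinations."""
    y = np.asarray(y, dtype=float)
    table = np.full((max_d + 1, max_D + 1), np.nan)
    for d in range(max_d + 1):
        for D in range(max_D + 1):
            z = y.copy()
            for _ in range(D):
                z = z[s:] - z[:-s]       # seasonal difference (lag s)
            for _ in range(d):
                z = np.diff(z)           # non-seasonal difference (lag 1)
            if len(z) > 1:
                table[d, D] = np.var(z, ddof=1)
    return table                         # rows: d = 0..max_d, columns: D = 0..max_D

# The cell with the lowest variance suggests which differencing combination to use.
```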
W
(Gaussian) white noise
is a zero-mean, homoskedastic, uncorrelated, and normally distributed time series.
Wold's decomposition theorem
The most fundamental justification for time series analysis (as described in this course) is
due to Wold's decomposition theorem, which explicitly proves that any (stationary) time series can be decomposed into two different parts. The first (deterministic) part can be exactly described by a linear combination of its own past; the second part is an MA component of (possibly infinite) order.