Autoregression

The autoregression model is used to describe stationary time series. In this model, a value of the analyzed process is expressed as a finite linear combination of the previous process values and a random disturbance.

Thus, an autoregression of order p is a process X_t that satisfies the following relation for some constant µ:

(X_t - µ) + ψ_1(X_{t-1} - µ) + ψ_2(X_{t-2} - µ) + … + ψ_p(X_{t-p} - µ) = ε_t

Where:

X_t – the value of the process at the moment t;
µ – a constant determining the level of the process;
ψ_1, ψ_2, …, ψ_p – the autoregression coefficients;
ε_t – a random disturbance with zero mean and variance δ².

This model has p+2 unknown parameters µ, ψ_1, ψ_2, …, ψ_p, δ² that should be estimated based on the available data on the studied process.
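The following sketch illustrates one common way to estimate these p+2 parameters: ordinary least squares on the lagged, centred values. It is a minimal numpy illustration, not the product's implementation; the simulated series, the chosen order p = 2, and the coefficient values are assumptions made for the example.

```python
# A minimal sketch (not the product's implementation) of estimating the p+2
# parameters mu, psi_1..psi_p, delta^2 of the relation
# (X_t - mu) + psi_1*(X_{t-1} - mu) + ... + psi_p*(X_{t-p} - mu) = e_t
# by ordinary least squares. The simulated series and p = 2 are illustrative.
import numpy as np

rng = np.random.default_rng(0)
mu_true, psi_true, n = 10.0, np.array([-0.6, 0.3]), 500  # psi in the sign convention above

# Simulate an AR(2) series: X_t = mu - psi_1*(X_{t-1}-mu) - psi_2*(X_{t-2}-mu) + e_t
x = np.full(n, mu_true)
for t in range(2, n):
    lagged = x[t-2:t][::-1] - mu_true            # (X_{t-1}-mu, X_{t-2}-mu)
    x[t] = mu_true - psi_true @ lagged + rng.normal()

p = 2
mu_hat = x.mean()                                # estimate of mu
y = x[p:] - mu_hat                               # centred target values
Z = np.column_stack([x[p-k:n-k] - mu_hat for k in range(1, p + 1)])  # centred lags
coef, *_ = np.linalg.lstsq(Z, y, rcond=None)     # y is approx. -psi_1*Z1 - ... - psi_p*Zp
psi_hat = -coef                                  # back to the sign convention above
delta2_hat = np.var(y - Z @ coef, ddof=p)        # estimated disturbance variance
print(mu_hat, psi_hat, delta2_hat)
```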

After the unknown parameters have been estimated, the user can forecast the analyzed series. The forecast is computed recursively using the difference equation:

X_{t+i} = µ - ψ_1(X_{t+i-1} - µ) - ψ_2(X_{t+i-2} - µ) - … - ψ_p(X_{t+i-p} - µ)

successively setting i = 1, 2, …, T and replacing the values X_{t+i-j} that have not yet been observed with their forecasts from the previous steps (where T – the lead period).
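A small sketch of this recursion is shown below. The helper function and its argument values are hypothetical, written against the relation given above; lagged values beyond the end of the observed series are taken from the forecasts already produced.

```python
# A minimal sketch (assumed helper, not the product's API) of the recursive
# forecast: values are forecast one step at a time, and lags that are not yet
# observed are replaced by their own forecasts from the previous steps.
import numpy as np

def ar_forecast(x, mu, psi, T):
    """Forecast T steps ahead for the relation
    (X_t - mu) + psi_1*(X_{t-1} - mu) + ... + psi_p*(X_{t-p} - mu) = e_t."""
    hist = list(x)                                        # observed series
    for i in range(T):
        lagged = np.array(hist[-len(psi):][::-1]) - mu    # X_{t+i-1}, ..., X_{t+i-p}
        hist.append(mu - np.dot(psi, lagged))             # forecast of X_{t+i}
    return np.array(hist[len(x):])                        # the T forecasted values

# Usage with illustrative values (p = 2, lead period T = 5):
print(ar_forecast([9.8, 10.4, 10.1], mu=10.0, psi=[-0.6, 0.3], T=5))
```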

The aforementioned equation can also describe non-stationary processes.

Stationary and Non-stationary Processes

The process X_t is stationary if all the roots of the polynomial ψ(z) = z^p + ψ_1 z^(p-1) + … + ψ_p lie within the unit circle |z| < 1. In this case µ is the mean around which the process varies.

Otherwise, X_t is non-stationary and µ serves only as a reference point for the process level.
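The condition above can be checked numerically. The sketch below (an illustration, not the product's implementation) builds the polynomial ψ(z) from the coefficients and tests whether all of its roots lie inside the unit circle; the example coefficient values are assumptions.

```python
# Stationarity check: all roots of psi(z) = z^p + psi_1*z^(p-1) + ... + psi_p
# must lie strictly inside the unit circle |z| < 1.
import numpy as np

def is_stationary(psi):
    roots = np.roots([1.0, *psi])          # coefficients of z^p, z^(p-1), ..., z^0
    return bool(np.all(np.abs(roots) < 1.0))

print(is_stationary([-0.6, 0.3]))   # True: all roots inside the unit circle
print(is_stationary([-1.5, 0.5]))   # False: a unit root, the process is non-stationary
```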

Using Autoregression for Residuals Modeling

Under weak additional assumptions, a stationary process satisfies an autoregression equation of infinite order with sufficiently rapidly decreasing coefficients. That is why an autoregression model of sufficiently high order can adequately approximate almost any stationary process.

As a result, the autoregression model is often used to model residuals of a parametric model, for example, residuals of a regression model or of a trend model.
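A minimal sketch of this workflow follows, assuming a simple linear trend model and the statsmodels library; the simulated data and the chosen order are illustrative, and statsmodels reports coefficients in the X_t = c + a_1·X_{t-1} + … form, i.e. with the opposite sign convention to ψ above.

```python
# A minimal sketch (assumed workflow, not the product's implementation): fit a
# linear trend, then describe its residuals with a low-order autoregression.
import numpy as np
from statsmodels.tsa.ar_model import AutoReg

rng = np.random.default_rng(1)
t = np.arange(200)
y = 2.0 + 0.05 * t + rng.normal(scale=1.0, size=200)   # illustrative trend + noise

trend_coefs = np.polyfit(t, y, deg=1)           # fit the trend model
residuals = y - np.polyval(trend_coefs, t)      # residuals of the trend model

ar_fit = AutoReg(residuals, lags=2).fit()       # AR(2) model for the residuals
print(ar_fit.params)                            # constant and lag coefficients
```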

For an autoregression process of order p, the theoretical values of the partial autocorrelation function are equal to zero for all lags greater than p. Using this property, the user can select the order of the autoregression model that describes the selected data. As a rule, p ≤ 2 is enough to solve most common practical problems.
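One way to apply this property is sketched below: compute the sample partial autocorrelation function and keep the largest lag whose value exceeds an approximate significance bound. This is an illustration of the idea, not the product's selection procedure; the simulated AR(2) series and the 1.96/√n bound are assumptions.

```python
# A minimal sketch of PACF-based order selection: for an AR(p) process the PACF
# is near zero for lags greater than p, so the order is taken as the largest
# lag with a significant PACF value.
import numpy as np
from statsmodels.tsa.stattools import pacf

rng = np.random.default_rng(2)
x = np.zeros(300)
for t in range(2, 300):                          # simulated AR(2) series (illustrative)
    x[t] = 0.6 * x[t-1] - 0.3 * x[t-2] + rng.normal()

values = pacf(x, nlags=10)                       # sample PACF for lags 0..10
threshold = 1.96 / np.sqrt(len(x))               # approximate 95% significance bound
significant = [k for k in range(1, 11) if abs(values[k]) > threshold]
p_selected = max(significant) if significant else 0
print(p_selected)                                # typically 2 for this simulated series
```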

See also:

Library of Methods and Models