In general, the model looks as follows:
yt = β1·x1t + β2·x2t + … + βn·xnt + C + et,   t = 1 … T
Where:
T. Number of observations.
βk, k = 1 … n. Estimated coefficients of explanatory variables.
xk, k = 1 … n. Explanatory variables.
et. Residuals.
yt. Explained variable.
In matrix form the model can be written as: Y = Xβ + C + ε.
If the constant C is defined (as a zero or non-zero value), the model can be converted to the classic form Y = Xβ + ε by replacing the explained variable Y with Y − C.
If the constant must be estimated, an artificial variable equal to 1 in all observations is added; correspondingly, the extended matrix is generated by appending this unit column to the matrix X, which casts the model to the classic form Y = Xβ + ε.
In this case n is replaced by n + 1. To estimate the coefficients β (or, together with the constant, (β, C)), OLS or the singular value decomposition technique is used.
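As an illustration only, the estimation step might look like the following Python/NumPy sketch (the names X, y and estimate_constant are hypothetical and not part of the model description); numpy.linalg.lstsq solves the least squares problem via a singular value decomposition of X:

    import numpy as np

    def estimate_coefficients(X, y, estimate_constant=True):
        """Estimate the coefficients beta (and the constant C, if requested) by least squares."""
        if estimate_constant:
            # Append the artificial variable equal to 1 in all observations.
            X = np.hstack([X, np.ones((X.shape[0], 1))])
        # lstsq computes the minimum-norm least squares solution using SVD.
        beta, residuals, rank, singular_values = np.linalg.lstsq(X, y, rcond=None)
        return beta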
The case of multicollinearity is considered separately: the matrix X'X (for the original or the extended matrix X) is nearly singular, that is, the absolute value of its determinant is small. In this case the coefficient estimates are ambiguous, because the columns of the matrix X are linearly dependent. To obtain an unambiguous estimate, columns are excluded from the matrix X until it has maximum rank.
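One possible way to carry out this exclusion (a sketch under the same assumptions as above, not necessarily the procedure used by the software) is to keep only the columns that increase the rank of the matrix:

    import numpy as np

    def drop_dependent_columns(X, tol=1e-10):
        """Remove columns of X until the remaining matrix has full column rank."""
        kept = []
        for j in range(X.shape[1]):
            candidate = X[:, kept + [j]]
            # Keep column j only if it raises the rank of the selected submatrix.
            if np.linalg.matrix_rank(candidate, tol=tol) == len(kept) + 1:
                kept.append(j)
        return X[:, kept], kept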
Additional characteristics of the model. Determination coefficient:
R² = 1 − e'e / ((Y − y*)'(Y − y*))
Where:
y* = τ·Ȳ·i, where Ȳ is the mean value of Y.
e = Y − Ŷ. Vector of residuals.
i. Unit column.
τ = 1 if the constant is estimated automatically, τ = 0 if the constant is specified manually.
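For reference, the determination coefficient defined above could be computed as follows (a minimal Python sketch with hypothetical names; tau plays the role of τ):

    import numpy as np

    def determination_coefficient(y, y_hat, tau=1):
        """R² = 1 − e'e / ((Y − y*)'(Y − y*)) with y* = τ·mean(Y)."""
        e = y - y_hat              # residuals e = Y − Ŷ
        y_star = tau * np.mean(y)  # τ = 1 if the constant is estimated, otherwise 0
        d = y - y_star
        return 1.0 - (e @ e) / (d @ d)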
Adjusted determination coefficient (undefined when T = n):
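A standard form of the adjusted coefficient, consistent with it being undefined when T = n (the exact convention used by the model may differ), is:

R²adj = 1 − (1 − R²)·(T − 1) / (T − n)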
Value of the Fisher statistic:
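A common form of this statistic for testing the joint significance of the regression (the exact degrees-of-freedom convention is an assumption) is:

F = (R² / n) / ((1 − R²) / (T − n − τ))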
See also: