Abstract
A computationally efficient branch-and-bound strategy for finding the
subsets of the most statistically significant variables of a vector
autoregressive (VAR) model from a given search subspace is proposed.
Specifically, the candidate submodels are obtained by deleting columns
from the coefficient matrices of the fully specified VAR process. The
strategy is based on a regression tree and derives the best-subset VAR
models without computing the whole tree. The branch-and-bound cutting
test is based on monotone statistical selection criteria which are
functions of the determinant of the estimated residual covariance
matrix. Experimental results confirm the computational efficiency of the
proposed algorithm.
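The cutting test exploits the fact that deleting regressor columns cannot decrease the determinant of the estimated residual covariance matrix, so the criterion value at a node bounds every submodel in its subtree from below. The following is a minimal sketch of this idea only, not the paper's regression-tree implementation; the data `Y` and `X`, the BIC-style penalty, and all function names are illustrative assumptions.

```python
import numpy as np


def resid_cov_logdet(Y, X, cols):
    """OLS fit of every equation in Y on the selected regressor columns;
    returns log|Sigma_hat|, the log-determinant of the residual covariance."""
    Xs = X[:, cols]
    B, *_ = np.linalg.lstsq(Xs, Y, rcond=None)
    E = Y - Xs @ B
    _, logdet = np.linalg.slogdet(E.T @ E / Y.shape[0])
    return logdet


def best_subset_var(Y, X):
    """Branch-and-bound search over nonempty regressor subsets under a
    BIC-type criterion: n*log|Sigma| + log(n) * (number of coefficients)."""
    n, m = X.shape
    k = Y.shape[1]
    penalty = np.log(n)
    best = {"crit": np.inf, "cols": None}

    def recurse(retained, start):
        logdet = resid_cov_logdet(Y, X, retained)
        crit = n * logdet + penalty * len(retained) * k
        if crit < best["crit"]:
            best["crit"], best["cols"] = crit, list(retained)
        # Cutting test: log|Sigma| is monotone non-decreasing as columns are
        # deleted, and every descendant retains at least one column, so this
        # is a lower bound on the criterion of every submodel in the subtree.
        if n * logdet + penalty * k >= best["crit"]:
            return
        # Branch: delete one more column; restricting deletions to positions
        # >= start visits every subset exactly once.
        if len(retained) > 1:
            for i in range(start, len(retained)):
                recurse(retained[:i] + retained[i + 1:], i)

    recurse(list(range(m)), 0)
    return best


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, m, k = 200, 6, 2
    X = rng.standard_normal((n, m))
    B = np.zeros((m, k))
    B[[0, 2], :] = 1.0                       # only regressors 0 and 2 matter
    Y = X @ B + 0.5 * rng.standard_normal((n, k))
    print(best_subset_var(Y, X))             # expected: cols close to [0, 2]
```

The enumeration deletes columns by non-decreasing position so that each subset is visited once; the proposed algorithm instead organizes the submodels in a regression tree and avoids refitting each node from scratch, which this sketch does not attempt.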
| Original language | English |
| --- | --- |
| Pages (from-to) | 1949-1963 |
| Journal | Journal of Economic Dynamics and Control |
| Volume | 32 |
| Issue number | 6 |
| DOIs | |
| Publication status | Published - 2008 |
| MoE publication type | A1 Journal article-refereed |
Keywords
- vector autoregressive model
- model selection
- branch-and-bound algorithms