Regression procedures for ARMA estimation

E. J. Hannan, Andrew McDougall

Research output: Contribution to journal › Article › Research › peer-review

14 Citations (Scopus)

Abstract

Regression procedures for parameter estimation in autoregressive moving average (ARMA) models are discussed, mainly for providing initial estimates for iterative maximization of a Gaussian likelihood. An iterative procedure of Spliid (1983) is compared to a procedure of Hannan and Rissanen (1982), and a global convergence result is established for an iterative modification of Spliid’s procedure. Spliid’s iteration does not always converge; when it does, it has the same asymptotic distribution as the second stage of the Hannan-Rissanen procedure. This second-stage iteration exhibits the same problems as Spliid’s procedure, so it is preferable to go immediately to the third stage, which is asymptotically efficient in the Gaussian case. An example is provided by the first-order ARMA model y(t) + αy(t − 1) = ε(t) + βε(t − 1), under the usual regularity conditions of stationarity and invertibility. Spliid’s procedure fits an autoregression of predetermined order (i.e., 2) to obtain estimates ε̂(t) of the ε(t), and then regresses y(t) on y(t − 1) and ε̂(t − 1) to obtain estimates α̂ and β̂. These estimates are used to filter the data to obtain a further estimate ε̃(t); the procedure is iterated until the sequence of estimates of α and β converges. Convergence will not take place unless αβ < ½. If the order of the autoregression in Spliid’s procedure is chosen by an information criterion and only the first regression on y(t − 1) and ε̂(t − 1) is carried out, the first two stages of the Hannan-Rissanen procedure result. Then α̂ and β̂ converge strongly as T → ∞ to α and β, and they also obey a central limit theorem. The modified Spliid procedure replaces ε̂(t) by the residuals from the first regression, and so on.
In a restricted sense, this iteration always converges in the ARMA case, and will generally do so if the positive real condition for the moving average transfer function is satisfied [see (1.5) in Sec. 1]. Simulations bear out the theory. In practice, there will be no true ARMA system; in any case, the ARMA orders will need to be determined from the data. Thus the results are of suggestive value only. Nevertheless, these suggestions favor the Hannan-Rissanen procedure.
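The staged regressions described in the abstract can be sketched numerically. The following is a minimal illustration, not the authors' code: the ARMA(1,1) coefficients (α = −0.5, β = 0.4), the sample size, and the long-autoregression order h = 20 are all assumptions made here for the sketch. The first pass of the loop is the second-stage regression; later passes implement the modified Spliid iteration, in which the residuals of the regression replace the innovation estimates.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate the first-order ARMA model from the abstract:
#   y(t) + a*y(t-1) = e(t) + b*e(t-1)
# Coefficient values and sample size are illustrative assumptions.
a, b, T = -0.5, 0.4, 2000
e = rng.standard_normal(T + 1)
y = np.zeros(T + 1)
for t in range(1, T + 1):
    y[t] = -a * y[t - 1] + e[t] + b * e[t - 1]
y = y[1:]

# Stage 1: fit a long autoregression by OLS and take its residuals as
# estimates of the innovations e(t). The order h = 20 is an assumption;
# the paper selects the order by an information criterion.
h = 20
X = np.column_stack([y[h - k:T - k] for k in range(1, h + 1)])
phi, *_ = np.linalg.lstsq(X, y[h:], rcond=None)
eps = y[h:] - X @ phi

# Stage 2 and modified Spliid iteration: regress y(t) on y(t-1) and the
# estimated e(t-1); then replace the innovation estimates by the
# residuals of that regression and re-regress until the estimates settle.
yy = y[h:]
for _ in range(20):
    Z = np.column_stack([-yy[:-1], eps[:-1]])
    coef, *_ = np.linalg.lstsq(Z, yy[1:], rcond=None)
    eps = np.concatenate([eps[:1], yy[1:] - Z @ coef])

a_hat, b_hat = coef
print(f"a_hat={a_hat:.3f}, b_hat={b_hat:.3f}")  # close to a=-0.5, b=0.4
```

With these values αβ = −0.2 < ½ and the positive real condition on 1 + βz holds (Re(1 + 0.4e^{iθ}) ≥ 0.6 > 0), so the iteration settles near the true parameters.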

Original language: English
Pages (from-to): 490-498
Number of pages: 9
Journal: Journal of the American Statistical Association
Volume: 83
Issue number: 402
DOI: 10.1080/01621459.1988.10478622
State: Published - 1 Jan 1988

Keywords

  • Autoregression–regression
  • Iterative estimation
  • Liapounoff function
  • Positive real condition
  • Stationary process

Cite this

@article{a58ae546ba12424795473c8e91c01c22,
title = "Regression procedures for ARMA estimation",
abstract = "Regression procedures for parameter estimation in autoregressive moving average (ARMA) models are discussed, mainly for providing initial estimates for iterative maximization of a Gaussian likelihood. An iterative procedure of Spliid (1983) is compared to a procedure of Hannan and Rissanen (1982), and a global convergence result is established for an iterative modification of Spliid’s procedure. Spliid’s iteration does not always converge; when it does, it has the same asymptotic distribution as the second stage of the Hannan-Rissanen procedure. This second-stage iteration exhibits the same problems as Spliid’s procedure, so it is preferable to go immediately to the third stage, which is asymptotically efficient in the Gaussian case. An example is provided by the first-order ARMA model y(t) + αy(t − 1) = ε(t) + βε(t − 1), under the usual regularity conditions of stationarity and invertibility. Spliid’s procedure fits an autoregression of predetermined order (i.e., 2) to obtain estimates ε̂(t) of the ε(t), and then regresses y(t) on y(t − 1) and ε̂(t − 1) to obtain estimates α̂ and β̂. These estimates are used to filter the data to obtain a further estimate ε̃(t); the procedure is iterated until the sequence of estimates of α and β converges. Convergence will not take place unless αβ < ½. If the order of the autoregression in Spliid’s procedure is chosen by an information criterion and only the first regression on y(t − 1) and ε̂(t − 1) is carried out, the first two stages of the Hannan-Rissanen procedure result. Then α̂ and β̂ converge strongly as T → ∞ to α and β, and they also obey a central limit theorem. The modified Spliid procedure replaces ε̂(t) by the residuals from the first regression, and so on.
In a restricted sense, this iteration always converges in the ARMA case, and will generally do so if the positive real condition for the moving average transfer function is satisfied [see (1.5) in Sec. 1]. Simulations bear out the theory. In practice, there will be no true ARMA system; in any case, the ARMA orders will need to be determined from the data. Thus the results are of suggestive value only. Nevertheless, these suggestions favor the Hannan-Rissanen procedure.",
keywords = "Autoregression–regression, Iterative estimation, Liapounoff function, Positive real condition, Stationary process",
author = "Hannan, {E. J.} and Andrew McDougall",
year = "1988",
month = "1",
day = "1",
doi = "10.1080/01621459.1988.10478622",
language = "English",
volume = "83",
pages = "490--498",
journal = "Journal of the American Statistical Association",
issn = "0162-1459",
publisher = "Taylor and Francis Ltd.",
number = "402",
}

Regression procedures for ARMA estimation. / Hannan, E. J.; McDougall, Andrew.

In: Journal of the American Statistical Association, Vol. 83, No. 402, 01.01.1988, p. 490-498.

TY - JOUR

T1 - Regression procedures for ARMA estimation

AU - Hannan, E. J.

AU - McDougall, Andrew

PY - 1988/1/1

Y1 - 1988/1/1

N2 - Regression procedures for parameter estimation in autoregressive moving average (ARMA) models are discussed, mainly for providing initial estimates for iterative maximization of a Gaussian likelihood. An iterative procedure of Spliid (1983) is compared to a procedure of Hannan and Rissanen (1982), and a global convergence result is established for an iterative modification of Spliid’s procedure. Spliid’s iteration does not always converge; when it does, it has the same asymptotic distribution as the second stage of the Hannan-Rissanen procedure. This second-stage iteration exhibits the same problems as Spliid’s procedure, so it is preferable to go immediately to the third stage, which is asymptotically efficient in the Gaussian case. An example is provided by the first-order ARMA model y(t) + αy(t − 1) = ε(t) + βε(t − 1), under the usual regularity conditions of stationarity and invertibility. Spliid’s procedure fits an autoregression of predetermined order (i.e., 2) to obtain estimates ε̂(t) of the ε(t), and then regresses y(t) on y(t − 1) and ε̂(t − 1) to obtain estimates α̂ and β̂. These estimates are used to filter the data to obtain a further estimate ε̃(t); the procedure is iterated until the sequence of estimates of α and β converges. Convergence will not take place unless αβ < ½. If the order of the autoregression in Spliid’s procedure is chosen by an information criterion and only the first regression on y(t − 1) and ε̂(t − 1) is carried out, the first two stages of the Hannan-Rissanen procedure result. Then α̂ and β̂ converge strongly as T → ∞ to α and β, and they also obey a central limit theorem. The modified Spliid procedure replaces ε̂(t) by the residuals from the first regression, and so on.
In a restricted sense, this iteration always converges in the ARMA case, and will generally do so if the positive real condition for the moving average transfer function is satisfied [see (1.5) in Sec. 1]. Simulations bear out the theory. In practice, there will be no true ARMA system; in any case, the ARMA orders will need to be determined from the data. Thus the results are of suggestive value only. Nevertheless, these suggestions favor the Hannan-Rissanen procedure.

AB - Regression procedures for parameter estimation in autoregressive moving average (ARMA) models are discussed, mainly for providing initial estimates for iterative maximization of a Gaussian likelihood. An iterative procedure of Spliid (1983) is compared to a procedure of Hannan and Rissanen (1982), and a global convergence result is established for an iterative modification of Spliid’s procedure. Spliid’s iteration does not always converge; when it does, it has the same asymptotic distribution as the second stage of the Hannan-Rissanen procedure. This second-stage iteration exhibits the same problems as Spliid’s procedure, so it is preferable to go immediately to the third stage, which is asymptotically efficient in the Gaussian case. An example is provided by the first-order ARMA model y(t) + αy(t − 1) = ε(t) + βε(t − 1), under the usual regularity conditions of stationarity and invertibility. Spliid’s procedure fits an autoregression of predetermined order (i.e., 2) to obtain estimates ε̂(t) of the ε(t), and then regresses y(t) on y(t − 1) and ε̂(t − 1) to obtain estimates α̂ and β̂. These estimates are used to filter the data to obtain a further estimate ε̃(t); the procedure is iterated until the sequence of estimates of α and β converges. Convergence will not take place unless αβ < ½. If the order of the autoregression in Spliid’s procedure is chosen by an information criterion and only the first regression on y(t − 1) and ε̂(t − 1) is carried out, the first two stages of the Hannan-Rissanen procedure result. Then α̂ and β̂ converge strongly as T → ∞ to α and β, and they also obey a central limit theorem. The modified Spliid procedure replaces ε̂(t) by the residuals from the first regression, and so on.
In a restricted sense, this iteration always converges in the ARMA case, and will generally do so if the positive real condition for the moving average transfer function is satisfied [see (1.5) in Sec. 1]. Simulations bear out the theory. In practice, there will be no true ARMA system; in any case, the ARMA orders will need to be determined from the data. Thus the results are of suggestive value only. Nevertheless, these suggestions favor the Hannan-Rissanen procedure.

KW - Autoregression–regression

KW - Iterative estimation

KW - Liapounoff function

KW - Positive real condition

KW - Stationary process

UR - http://www.scopus.com/inward/record.url?scp=11244252612&partnerID=8YFLogxK

U2 - 10.1080/01621459.1988.10478622

DO - 10.1080/01621459.1988.10478622

M3 - Article

VL - 83

SP - 490

EP - 498

JO - Journal of the American Statistical Association

JF - Journal of the American Statistical Association

SN - 0162-1459

IS - 402

ER -