### Abstract

Regression procedures for parameter estimation in autoregressive moving average (ARMA) models are discussed, mainly as a source of initial estimates for iterative maximization of a Gaussian likelihood. An iterative procedure of Spliid (1983) is compared to a procedure of Hannan and Rissanen (1982), and a global convergence result is established for an iterative modification of Spliid's procedure. Spliid's iteration does not always converge; when it does, it has the same asymptotic distribution as the second stage of the Hannan-Rissanen procedure. Iterating that second stage presents the same problems as Spliid's procedure, so it is preferable to go immediately to the third stage, which is asymptotically efficient in the Gaussian case. An example is provided by a first-order ARMA model, y(t) + αy(t − 1) = ε(t) + βε(t − 1), under the usual regularity conditions of stationarity and invertibility. Spliid's procedure fits an autoregression of predetermined order (i.e., 2) to obtain estimates ε̂(t) of the ε(t), and then regresses y(t) on y(t − 1) and ε̂(t − 1) to obtain estimates α̂ and β̂. These estimates are used to filter the data to obtain a further estimate ε̃(t); the procedure is iterated until the sequence of estimates of α and β converges. Convergence will not take place unless αβ < ½. If the order of the autoregression in Spliid's procedure is chosen by an information criterion and only the first regression on y(t − 1) and ε̂(t − 1) is carried out, the first two stages of the Hannan-Rissanen procedure result. Then α̂ and β̂ converge strongly as T → ∞ to α and β, and they also obey a central limit theorem. The modified Spliid procedure replaces ε̂(t) by the residuals from the first regression, and so on.
In a restricted sense, this iteration always converges in the ARMA case, and will generally do so if the positive real condition for the moving-average transfer function is satisfied [see (1.5) in Sec. 1]. Simulations bear out the theory. In practice there will be no true ARMA system, and in any case the ARMA orders will need to be determined from the data, so the results are of suggestive value only. Nevertheless, these suggestions favor the Hannan-Rissanen procedure.
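The two procedures the abstract compares can be sketched numerically. The following is a minimal Python illustration (not from the paper) for the ARMA(1,1) example y(t) + αy(t − 1) = ε(t) + βε(t − 1): stage 1 fits a long autoregression to estimate the innovations, stage 2 regresses y(t) on y(t − 1) and the lagged innovation estimate, and Spliid's iteration then alternates filtering and re-regression. For simplicity the long-AR order `h` is fixed at 10 rather than chosen by an information criterion as the Hannan-Rissanen procedure prescribes, and the parameter values satisfy the αβ < ½ convergence condition; all function names are mine.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate ARMA(1,1): y(t) + a*y(t-1) = e(t) + b*e(t-1).
# Stationary (|a| < 1), invertible (|b| < 1), and a*b = -0.24 < 1/2.
a_true, b_true = -0.6, 0.4
T = 2000
e = rng.standard_normal(T)
y = np.zeros(T)
for t in range(1, T):
    y[t] = -a_true * y[t - 1] + e[t] + b_true * e[t - 1]

def long_ar_residuals(y, h):
    """Stage 1: fit an AR(h) by least squares; residuals estimate e(t)."""
    T = len(y)
    X = np.column_stack([y[h - j - 1:T - j - 1] for j in range(h)])
    coef, *_ = np.linalg.lstsq(X, y[h:], rcond=None)
    resid = np.zeros(T)
    resid[h:] = y[h:] - X @ coef
    return resid

def regress_stage(y, eps):
    """Stage 2: regress y(t) on y(t-1) and eps(t-1).

    Since y(t) = -alpha*y(t-1) + beta*eps(t-1) + eps(t), the fitted
    coefficients recover (-alpha, beta).
    """
    X = np.column_stack([y[:-1], eps[:-1]])
    coef, *_ = np.linalg.lstsq(X, y[1:], rcond=None)
    return -coef[0], coef[1]

# Hannan-Rissanen stages 1-2 (h fixed here, not IC-chosen).
eps = long_ar_residuals(y, h=10)
alpha, beta = regress_stage(y, eps)

# Spliid's iteration: filter the data with the current (alpha, beta)
# to re-estimate the innovations, then re-run the regression.
for _ in range(20):
    eps_new = np.zeros(T)
    for t in range(1, T):
        eps_new[t] = y[t] + alpha * y[t - 1] - beta * eps_new[t - 1]
    alpha, beta = regress_stage(y, eps_new)

print(alpha, beta)  # estimates of (alpha, beta), near (-0.6, 0.4)
```

The modified Spliid procedure of the abstract would instead feed the *regression residuals* back in place of the filtered ε̃(t); the filtering recursion above follows the original Spliid scheme.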

| Original language | English |
|---|---|
| Pages (from-to) | 490-498 |
| Number of pages | 9 |
| Journal | Journal of the American Statistical Association |
| Volume | 83 |
| Issue number | 402 |
| DOIs | 10.1080/01621459.1988.10478622 |
| State | Published - 1 Jan 1988 |

### Keywords

- Autoregression–regression
- Iterative estimation
- Liapounoff function
- Positive real condition
- Stationary process

### Cite this

Hannan, E. J., & McDougall, A. (1988). Regression procedures for ARMA estimation. *Journal of the American Statistical Association*, *83*(402), 490-498. https://doi.org/10.1080/01621459.1988.10478622

Research output: Contribution to journal › Article

TY - JOUR

T1 - Regression procedures for ARMA estimation

AU - Hannan, E. J.

AU - McDougall, Andrew

PY - 1988/1/1

Y1 - 1988/1/1


KW - Autoregression–regression

KW - Iterative estimation

KW - Liapounoff function

KW - Positive real condition

KW - Stationary process

UR - http://www.scopus.com/inward/record.url?scp=11244252612&partnerID=8YFLogxK

U2 - 10.1080/01621459.1988.10478622

DO - 10.1080/01621459.1988.10478622

M3 - Article

AN - SCOPUS:11244252612

VL - 83

SP - 490

EP - 498

JO - Journal of the American Statistical Association

JF - Journal of the American Statistical Association

SN - 0162-1459

IS - 402

ER -