Surface of Maximums of AR(2) Process Spectral Densities and its Application in Time Series Statistics

Alexander V. Ivanov, Nataliia M. Karpova

Abstract


Background. In the problem of large deviations of the least squares estimate of the parameter of a nonlinear regression model with discrete time and sub-Gaussian AR(2) noise, a constant is determined that controls the rate of exponential convergence to zero of the indicated probabilities.

Objective. The aim of the paper is to find the surface of maximums of AR(2) process spectral densities in the domain of its stationarity in explicit form.

Methods. The results were obtained using the methodology developed in the works of A. Sieders and K. Dzhaparidze (1987) and A.V. Ivanov (1997, 2016), together with standard calculus methods.

Results. A complex formula is obtained that describes a continuous surface of maximums of AR(2) process spectral densities defined on the stationarity triangle of time series of this type.

Conclusions. The obtained formula for the surface of maximums of the noise spectral densities makes it possible to determine for which values of the AR(2) process characteristic polynomial coefficients a greater rate of convergence to zero of the probabilities of large deviations of the considered estimates can be sought.
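The object studied in the paper can be illustrated numerically. The spectral density of a stationary AR(2) process X(t) = a1 X(t-1) + a2 X(t-2) + e(t) is f(λ) = σ² / (2π |1 − a1 e^{−iλ} − a2 e^{−2iλ}|²), and the surface of maximums assigns to each point (a1, a2) of the stationarity triangle the maximum of f over λ ∈ [−π, π]. The sketch below approximates that maximum by a grid search over frequency (assuming unit innovation variance σ² = 1); it is an illustrative numerical stand-in for the paper's explicit closed-form formula, and the function names are of course our own.

```python
import numpy as np

def ar2_spectral_density(lam, a1, a2, sigma2=1.0):
    """Spectral density f(lam) = sigma2 / (2*pi * |1 - a1 e^{-i lam} - a2 e^{-2i lam}|^2)."""
    denom = np.abs(1.0 - a1 * np.exp(-1j * lam) - a2 * np.exp(-2j * lam)) ** 2
    return sigma2 / (2.0 * np.pi * denom)

def spectral_max(a1, a2, sigma2=1.0, n=20001):
    """Grid-search approximation of the maximum of the AR(2) spectral density.

    The coefficients must lie in the stationarity triangle:
        a1 + a2 < 1,  a2 - a1 < 1,  |a2| < 1.
    """
    if not (a1 + a2 < 1 and a2 - a1 < 1 and abs(a2) < 1):
        raise ValueError("(a1, a2) is outside the stationarity triangle")
    # f is an even function of lam, so searching [0, pi] suffices
    lam = np.linspace(0.0, np.pi, n)
    return ar2_spectral_density(lam, a1, a2, sigma2).max()
```

For a1 = a2 = 0 the process is white noise and the density is constant, so the maximum equals 1/(2π); for coefficients near the triangle's boundary the maximum grows without bound, which is the regime relevant to the convergence-rate constant discussed above.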

Keywords


Nonlinear regression model; Least squares estimate; Sub-Gaussian white noise; AR(2) process; Probabilities of large deviations; Surface of maximums of spectral densities

References


A.V. Ivanov, “An asymptotic expansion for the distribution of the least squares estimator of the non-linear regression parameter”, Theory Probab. Appl., vol. 21, no. 3, pp. 557–570, 1977. doi: 10.1137/1121067

B.L.S. Prakasa Rao, “On the exponential rate of convergence of the least squares estimator in the nonlinear regression model with Gaussian errors”, Statist. Probab. Lett., vol. 2, pp. 139–142, 1984. doi: 10.1016/0167-7152(84)90004-X

A. Sieders and K. Dzhaparidze, “A large deviation result for parameter estimators and its application to nonlinear regression analysis”, Ann. Statist., vol. 15, no. 3, pp. 1031–1049, 1987. doi: 10.1214/aos/1176350491

I.A. Ibragimov and R.Z. Has’minskii, Statistical Estimation: Asymptotic Theory. New York: Springer, 1981.

A.V. Ivanov, Asymptotic Theory of Nonlinear Regression. Dordrecht, Boston, London: Kluwer AP, 1997.

A.V. Ivanov and N.N. Leonenko, Statistical Analysis of Random Fields. Dordrecht, Boston, London: Kluwer AP, 1989.

B.L.S. Prakasa Rao, “The rate of convergence for the least squares estimator in a non-linear regression model with dependent errors”, J. Multivariate Analysis, vol. 14, no. 3, pp. 315–322, 1984. doi: 10.1016/0047-259X(84)90036-8

S.H. Hu, “A large deviation result for the least squares estimators in nonlinear regression”, Stochastic Processes and their Applications, vol. 47, pp. 345–352, 1993. doi: 10.1016/0304-4149(93)90022-V

W.Z. Yang and S.H. Hu, “Large deviation for a least squares estimator in a non-linear regression model”, Statist. Probab. Lett., vol. 91, pp. 135–144, 2014. doi: 10.1016/j.spl.2014.04.022

A.V. Ivanov, “Large deviations of regression parameter estimate in the models with stationary sub-Gaussian noise”, Theor. Probab. Math. Stat., vol. 95, pp. 92–100, 2016.

P.J. Brockwell and R.A. Davis, Introduction to Time Series and Forecasting, 2nd ed. New York: Springer, 2002.

V.V. Buldygin and Yu.V. Kozachenko, Metric Characterization of Random Variables and Random Processes. Providence: AMS, 2000.



DOI: https://doi.org/10.20535/1810-0546.2017.4.106224





Copyright (c) 2017 Igor Sikorsky Kyiv Polytechnic Institute

This work is licensed under a Creative Commons Attribution 4.0 International License.