This paper studies the quasi-maximum likelihood estimator (QMLE) for the generalized autoregressive conditional heteroscedasticity (GARCH) model with Laplace (1,1) residuals. A QMLE is first proposed for the parameter vector of the GARCH model with Laplace (1,1) residuals. Under certain conditions, the strong consistency and asymptotic normality of the QMLE are then established. A real example with Laplace and normal distributions is then analyzed to evaluate the performance of the QMLE, and some comparison results on the performance are given. Finally, the proofs of the theorems are presented.
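The Laplace quasi-likelihood described above can be sketched in a few lines. This is an illustrative sketch, not the paper's code: the parameter names (`omega`, `alpha`, `beta`), the standard-Laplace scaling convention, and the comparison setup are all assumptions made for the example.

```python
# Sketch: Laplace quasi-log-likelihood for a GARCH(1,1) model
#   r_t = sqrt(h_t) * eps_t,  h_t = omega + alpha * r_{t-1}^2 + beta * h_{t-1},
# where eps_t is treated as standard Laplace noise (an assumed convention).
import math
import random

def laplace_qloglik(returns, omega, alpha, beta):
    """Quasi-log-likelihood under a standard Laplace quasi-density:
    sum_t [ -log 2 - 0.5*log h_t - |r_t| / sqrt(h_t) ].
    Assumes omega > 0, alpha, beta >= 0 and alpha + beta < 1."""
    h = omega / (1.0 - alpha - beta)  # start at the unconditional variance
    ll = 0.0
    prev_r = 0.0
    for r in returns:
        h = omega + alpha * prev_r ** 2 + beta * h
        ll += -math.log(2.0) - 0.5 * math.log(h) - abs(r) / math.sqrt(h)
        prev_r = r
    return ll

# Simulate a series with true Laplace innovations, then compare the
# objective at the true parameters versus a mis-specified candidate.
random.seed(0)
omega0, alpha0, beta0 = 0.1, 0.1, 0.8
h, r, data = omega0 / (1 - alpha0 - beta0), 0.0, []
for _ in range(2000):
    h = omega0 + alpha0 * r ** 2 + beta0 * h
    u = random.random() - 0.5                       # uniform on [-0.5, 0.5)
    eps = math.copysign(math.log(1.0 - 2.0 * abs(u)), u)  # Laplace via inverse CDF
    r = math.sqrt(h) * eps
    data.append(r)

good = laplace_qloglik(data, omega0, alpha0, beta0)
bad = laplace_qloglik(data, 1.0, 0.4, 0.1)
```

In a full estimation the QMLE would maximize this objective over the parameter space; here the comparison of `good` and `bad` only illustrates that the quasi-likelihood discriminates between the true and a mis-specified parameter vector.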
We establish rates of convergence in statistical learning for time series forecasting. Using the PAC-Bayesian approach, slow rates of convergence √(d/n) for the Gibbs estimator under the absolute loss were given in a previous work [7], where n is the sample size and d the dimension of the set of predictors. Under the same weak-dependence conditions, we extend this result to any convex Lipschitz loss function. We also identify a condition on the parameter space that ensures similar rates for the classical penalized ERM procedure. We apply this method to quantile forecasting of the French GDP. Under additional conditions on the loss function (satisfied by the quadratic loss) and for uniformly mixing processes, we prove that the Gibbs estimator actually achieves fast rates of convergence d/n. We discuss the optimality of these different rates, pointing out references to lower bounds when they are available. In particular, these results generalize the results of [29] on sparse regression estimation to some autoregression settings.
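The penalized ERM procedure with a convex Lipschitz loss mentioned above can be illustrated with the pinball (quantile) loss, which is the natural loss for quantile forecasting. This is a minimal sketch under assumptions of my own: the AR(1) predictor family, the l1 penalty, the grid search, and all parameter values are illustrative and not taken from the paper.

```python
# Sketch: penalized ERM for one-step quantile forecasting with the
# pinball loss (convex and Lipschitz). Predictor family, penalty, and
# grid are illustrative assumptions, not the paper's procedure.
import random

def pinball(u, tau=0.5):
    """Quantile (pinball) loss: convex and max(tau, 1-tau)-Lipschitz in u."""
    return tau * u if u >= 0 else (tau - 1.0) * u

def penalized_risk(series, theta, tau=0.5, lam=0.01):
    """Empirical pinball risk of the AR(1) predictor x_hat_{t+1} = theta * x_t,
    plus an illustrative l1 penalty lam * |theta|."""
    n = len(series) - 1
    risk = sum(pinball(series[t + 1] - theta * series[t], tau)
               for t in range(n)) / n
    return risk + lam * abs(theta)

# Toy stationary AR(1) series; the ERM step is a simple grid search.
random.seed(1)
x, series = 0.0, []
for _ in range(1000):
    x = 0.6 * x + random.gauss(0.0, 1.0)
    series.append(x)

grid = [i / 100 for i in range(-99, 100)]
theta_hat = min(grid, key=lambda th: penalized_risk(series, th))
```

With tau = 0.5 the pinball loss reduces to half the absolute loss, so `theta_hat` estimates the conditional-median AR coefficient and should land near the true value 0.6, up to the small l1 shrinkage.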