Show simple item record

dc.contributor.author: Wintenberger, Olivier
dc.contributor.author: Li, Xiaoyin
dc.contributor.author: Alquier, Pierre
dc.subject [en]: GDP Forecasting
dc.subject [en]: fast rates
dc.subject [en]: oracle inequalities
dc.subject [en]: PAC-Bayesian bounds
dc.subject [en]: Time series forecasting
dc.subject [en]: Statistical learning theory
dc.title [en]: Prediction of time series by statistical learning: general losses and fast rates
dc.type: Article accepted for publication or published
dc.contributor.editoruniversityother: Laboratoire de Finance Assurance (LFA), Centre de Recherche en Économie et Statistique (CREST); France
dc.contributor.editoruniversityother: Laboratoire d'Analyse, Géométrie et Modélisation (AGM), CNRS : UMR 8088 – Université de Cergy-Pontoise; France
dc.contributor.editoruniversityother: UCD - School of Mathematical Sciences, University College Dublin; Ireland
dc.description.abstract [en]: We establish rates of convergence in time series forecasting using the statistical learning approach based on oracle inequalities. A series of papers extends the oracle inequalities obtained for iid observations to time series under weak dependence conditions. Given a family of predictors and $n$ observations, oracle inequalities state that a predictor forecasts the series as well as the best predictor in the family up to a remainder term $\Delta_n$. Using the PAC-Bayesian approach, we establish oracle inequalities with optimal rates of convergence under weak dependence conditions. We extend previous results for the absolute loss function to any Lipschitz loss function, with rates $\Delta_n \sim \sqrt{c(\Theta)/n}$, where $c(\Theta)$ measures the complexity of the model. We apply the method with quantile loss functions to forecast the French GDP. Under additional conditions on the loss functions (satisfied by the quadratic loss function) and on the time series, we refine the rates of convergence to $\Delta_n \sim c(\Theta)/n$. We achieve these fast rates for uniformly mixing processes for the first time. These rates are known to be optimal in the iid case and for individual sequences. In particular, we generalize the results of Dalalyan and Tsybakov on sparse regression estimation to the case of autoregression.
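To illustrate the kind of procedure the abstract describes — aggregating a finite family of predictors under a quantile (pinball) loss — here is a minimal sketch, not the authors' code: an exponentially weighted (Gibbs-type) aggregate over a few autoregressive predictors on a synthetic AR(1) series. The predictor family, the learning rate `eta`, and the toy data are all illustrative assumptions.

```python
import numpy as np

def pinball_loss(y, yhat, tau=0.5):
    """Quantile (pinball) loss; tau = 0.5 gives half the absolute loss."""
    diff = y - yhat
    return np.maximum(tau * diff, (tau - 1.0) * diff)

def ewa_forecast(series, predictors, eta=1.0, tau=0.5):
    """Exponentially weighted aggregation (Gibbs-type predictor) over a
    finite family of one-step-ahead predictors, assessed with pinball loss.
    Returns the aggregated forecasts and each predictor's cumulative loss."""
    K = len(predictors)
    cum_loss = np.zeros(K)
    forecasts = []
    for t in range(1, len(series)):
        # Weight each predictor by exp(-eta * past cumulative loss),
        # shifted by the minimum for numerical stability.
        weights = np.exp(-eta * (cum_loss - cum_loss.min()))
        weights /= weights.sum()
        preds = np.array([f(series[:t]) for f in predictors])
        forecasts.append(weights @ preds)
        cum_loss += pinball_loss(series[t], preds, tau)
    return np.array(forecasts), cum_loss

# Toy AR(1) data (illustrative assumption, not the GDP series of the paper).
rng = np.random.default_rng(0)
n = 200
x = np.zeros(n)
for t in range(1, n):
    x[t] = 0.6 * x[t - 1] + rng.normal(scale=0.5)

# A small family of linear autoregressive predictors x_t ≈ a * x_{t-1}.
predictors = [lambda past, a=a: a * past[-1] for a in (0.0, 0.3, 0.6, 0.9)]
forecasts, cum_loss = ewa_forecast(x, predictors, eta=2.0)
aggregate_loss = pinball_loss(x[1:], forecasts).sum()
```

An oracle inequality of the type stated in the abstract would bound `aggregate_loss` by the best predictor's cumulative loss, `cum_loss.min()`, plus a remainder of order $\sqrt{c(\Theta)/n}$ (or $c(\Theta)/n$ under the stronger conditions).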
dc.relation.isversionofjnlname: Dependence Modeling
dc.relation.isversionofjnlpublisher: De Gruyter
dc.subject.ddclabel [en]: Probability and applied mathematics

Files in this item


There are no files associated with this item.
