dc.contributor.author Wintenberger, Olivier
dc.contributor.author Li, Xiaoyin
dc.contributor.author Alquier, Pierre
dc.date.accessioned 2012-11-15T09:08:59Z
dc.date.available 2012-11-15T09:08:59Z
dc.date.issued 2013
dc.identifier.uri https://basepub.dauphine.fr/handle/123456789/10570
dc.language.iso en
dc.subject GDP forecasting
dc.subject fast rates
dc.subject oracle inequalities
dc.subject mixing
dc.subject weak dependence
dc.subject PAC-Bayesian bounds
dc.subject time series forecasting
dc.subject statistical learning theory
dc.subject.ddc 519
dc.title Prediction of time series by statistical learning: general losses and fast rates
dc.type Article accepted for publication or published
dc.contributor.editoruniversityother Laboratoire de Finance Assurance (LFA) http://www.crest.fr/content/view/41/100/ Centre de Recherche en Économie et STatistique (CREST); France
dc.contributor.editoruniversityother Laboratoire d'Analyse, Géométrie et Modélisation (AGM) http://www.u-cergy.fr/rech/agm CNRS : UMR8088, Université de Cergy Pontoise; France
dc.contributor.editoruniversityother UCD - School of Mathematical Sciences http://www.ucd.ie/mathsciences/ University College Dublin; Ireland
dc.description.abstracten We establish rates of convergence in time series forecasting using the statistical learning approach based on oracle inequalities. A series of papers extends the oracle inequalities obtained for iid observations to time series under weak-dependence conditions. Given a family of predictors and $n$ observations, oracle inequalities state that a predictor forecasts the series as well as the best predictor in the family, up to a remainder term $\Delta_n$. Using the PAC-Bayesian approach, we establish oracle inequalities with optimal rates of convergence under weak-dependence conditions. We extend previous results for the absolute loss function to any Lipschitz loss function, with rates $\Delta_n \sim \sqrt{c(\Theta)/n}$, where $c(\Theta)$ measures the complexity of the model. We apply the method with quantile loss functions to forecast the French GDP. Under additional conditions on the loss functions (satisfied by the quadratic loss function) and on the time series, we refine the rates of convergence to $\Delta_n \sim c(\Theta)/n$. We achieve these fast rates for uniformly mixing processes for the first time. These rates are known to be optimal in the iid case and for individual sequences. In particular, we generalize the results of Dalalyan and Tsybakov on sparse regression estimation to the case of autoregression.
dc.relation.isversionofjnlname Dependence Modeling
dc.relation.isversionofjnlvol 1
dc.relation.isversionofjnlpages 65-93
dc.relation.isversionofdoi http://dx.doi.org/10.2478/demo-2013-0004
dc.identifier.urlsite http://hal.archives-ouvertes.fr/hal-00749729
dc.relation.isversionofjnlpublisher De Gruyter
dc.subject.ddclabel Probability and applied mathematics
dc.description.submitted no
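The abstract describes a PAC-Bayesian predictor obtained by aggregating a family of predictors under a Gibbs (exponentially weighted) distribution, applied with the quantile loss to forecast the French GDP. A minimal numerical sketch of these two ingredients, assuming a finite predictor family; the function names and the temperature parameter `lam` are illustrative choices, not notation from the paper:

```python
import numpy as np

def pinball_loss(y, y_hat, tau=0.5):
    """Quantile (pinball) loss at level tau: a Lipschitz loss of the
    kind covered by the paper's slow-rate oracle inequalities."""
    diff = y - y_hat
    return np.where(diff >= 0, tau * diff, (tau - 1) * diff)

def gibbs_weights(cum_losses, lam=1.0):
    """Exponential (Gibbs) weights over a finite predictor family:
    w_j proportional to exp(-lam * cumulative loss of predictor j).
    Tuning the inverse temperature lam is what separates the slow
    sqrt(c(Theta)/n) regime from the fast c(Theta)/n regime."""
    shifted = cum_losses - cum_losses.min()  # shift for numerical stability
    w = np.exp(-lam * shifted)
    return w / w.sum()

# Toy usage: two constant predictors for a short series, aggregated
# by their cumulative pinball loss (data here is made up).
y = np.array([1.0, 1.2, 0.9, 1.1])
preds = np.array([[1.0] * 4, [2.0] * 4])  # predictor family, one row each
cum = np.array([pinball_loss(y, p).sum() for p in preds])
w = gibbs_weights(cum, lam=2.0)  # more weight on the lower-loss predictor
```

The aggregated forecast is then the `w`-weighted average of the individual predictions; in the paper the aggregate is over a general parameter set $\Theta$, with a prior in place of the uniform weighting implicit here.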
