Abstract: Serial dependence and predictability are two sides of the same coin. The literature has considered alternative measures of these two fundamental concepts. In this paper we aim to distill the most predictable aspect of a univariate time series, i.e., the aspect for which predictability is optimized. Our target measure is the mutual information between the past and the future of a random process, a broad measure of predictability that takes into account all future forecast horizons, rather than focusing on the mean square error of the one-step-ahead prediction.
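For a stationary Gaussian process, a known expression for this quantity (not stated in the abstract, and included here only as an illustrative assumption) writes the past-future mutual information in terms of the cepstral coefficients $c_k$ of the log spectral density, or equivalently the partial autocorrelations $\pi_k$:
\[
I_{p\text{-}f} \;=\; \tfrac{1}{2}\sum_{k=1}^{\infty} k\, c_k^{2}
\;=\; -\tfrac{1}{2}\sum_{k=1}^{\infty} k \log\!\bigl(1-\pi_k^{2}\bigr),
\qquad
\log f(\omega) \;=\; c_0 + 2\sum_{k=1}^{\infty} c_k \cos(k\omega).
\]
The second representation is consistent with the abstract's description of estimating an infinite sum from the sample partial autocorrelations by means of a regularized finite sum.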
The first most predictable aspect is defined as the measurable transformation of the series for which the mutual information between past and future is maximized. The proposed transformation arises as a linear combination of a set of basis functions localized at the quantiles of the unconditional distribution of the process. The mutual information is estimated as a function of the sample partial autocorrelations, by a semiparametric method that estimates an infinite sum by a regularized finite sum. The second most predictable aspect can also be defined, subject to suitable orthogonality restrictions. We also investigate the use of the most predictable aspect for testing the null hypothesis of no predictability.
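As a rough illustration of this estimation strategy (a minimal sketch under the Gaussian-case formula assumed above, not the paper's estimator; the truncation rule and the Bartlett-type taper are hypothetical choices), one could obtain the sample partial autocorrelations from the Durbin-Levinson recursion and form a tapered finite sum:

```python
# Minimal numpy-only sketch: sample PACF via Durbin-Levinson, then a
# truncated, tapered version of the assumed Gaussian-case sum. The
# truncation lag K and the Bartlett-type taper are illustrative choices,
# not those of the paper.
import numpy as np

def sample_pacf(x, nlags):
    """Sample partial autocorrelations pi_1..pi_nlags via Durbin-Levinson."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    n = len(x)
    # biased sample autocovariances gamma_0..gamma_nlags
    gamma = np.array([x[:n - k] @ x[k:] / n for k in range(nlags + 1)])
    pacf = np.zeros(nlags)
    phi = np.zeros(nlags + 1)   # phi[1:k] holds the AR(k-1) coefficients
    v = gamma[0]                # one-step prediction error variance
    for k in range(1, nlags + 1):
        pi_k = (gamma[k] - phi[1:k] @ gamma[1:k][::-1]) / v
        phi_new = phi.copy()
        phi_new[k] = pi_k
        phi_new[1:k] = phi[1:k] - pi_k * phi[1:k][::-1]
        phi = phi_new
        v *= (1.0 - pi_k ** 2)
        pacf[k - 1] = pi_k
    return pacf

def mi_past_future(x, K=None):
    """Truncated, tapered estimate of -1/2 * sum_k k*log(1 - pi_k^2)."""
    n = len(x)
    if K is None:
        K = int(np.floor(n ** (1 / 3)))        # illustrative truncation rule
    pi = sample_pacf(x, K)
    k = np.arange(1, K + 1)
    w = 1.0 - k / (K + 1)                      # Bartlett-type taper (illustrative)
    return -0.5 * np.sum(w * k * np.log(1.0 - pi ** 2))

# Example: a Gaussian AR(1) with phi = 0.8.
rng = np.random.default_rng(0)
phi_true, n = 0.8, 2000
e = rng.standard_normal(n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = phi_true * y[t - 1] + e[t]
print(mi_past_future(y))
```

On the simulated AR(1) example the estimate comes out close to, but slightly below, the population value implied by the assumed formula, -0.5*log(1 - 0.8^2) ≈ 0.511, the shortfall reflecting the downweighting induced by the taper.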