Thesis Oral Defense: Inference for Time Series in Change Points and Statistical Learning

Speaker: Jiaqi Li, Washington University in St. Louis

Abstract: This study aims to establish a comprehensive framework for inference theory in time series analysis, with a focus on change-point detection and online learning algorithms. First, we introduce an L^2-based inference approach for multiple change-point detection in high-dimensional time series that targets dense or spatially clustered signals. We propose a novel Two-Way MOSUM (moving sum) test statistic that leverages spatial-temporal moving regions to identify breaks, with enhanced testing power when breaks occur in only a few groups. Furthermore, we derive the limiting distribution of this L^2-aggregated statistic by extending high-dimensional Gaussian approximation theory to non-stationary spatial-temporal processes. Simulations show promising performance of our test in detecting non-sparse weak signals, and an application to COVID-19 cases in the United States demonstrates the real-world relevance of our method. Next, we revisit the Stochastic Gradient Descent (SGD) algorithm from a nonlinear time series perspective. By introducing the functional dependence measure into the machine learning field, we investigate the geometric-moment contraction property of SGD, which effectively addresses the challenges posed by non-stationarity in the recursive iterations. We then establish the asymptotic normality of averaged SGD and propose an online estimator for the long-run covariance matrix in the limiting distribution. Numerical experiments demonstrate that our proposed empirical confidence intervals exhibit asymptotically accurate coverage probabilities.

Advisors: Todd Kuffner and Likai Chen