Third Year Candidacy Presentation: Methodological Contributions to Change Point Detection and Synthetic Data Analysis
This presentation highlights two ongoing research projects in modern statistics. The first is a nonparametric framework for online distribution change detection based on neural network estimation of the log-density ratio and the Kullback–Leibler divergence. Sequential observations are monitored to determine whether the underlying distribution remains stable or changes at an unknown time. The proposed method reformulates change detection as a classification problem and uses sparse ReLU neural networks to construct a test statistic for detecting distributional shifts. Theoretical results establish false alarm control, high-probability detection after a change, and finite-sample guarantees under weak temporal dependence conditions.
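The classification reformulation can be illustrated with a minimal sketch. A classifier trained to distinguish a reference window (label 0) from the current window (label 1) yields log-odds that estimate the log-density ratio, and averaging them over the current window estimates the KL divergence, which serves as the test statistic. For simplicity, the sketch below substitutes plain logistic regression for the sparse ReLU networks of the actual method; all function names and windowing choices here are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def logistic_fit(X, y, lr=0.1, steps=500):
    # Gradient-descent logistic regression: a simple stand-in for the
    # sparse ReLU network classifier used in the actual method.
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
        g = p - y
        w -= lr * (X.T @ g) / len(y)
        b -= lr * g.mean()
    return w, b

def kl_statistic(ref, cur):
    # Classify reference (label 0) vs. current (label 1) samples.  With
    # equal window sizes, the log-odds estimate the log-density ratio,
    # and their mean over the current window estimates KL(current || ref).
    X = np.vstack([ref, cur])
    y = np.r_[np.zeros(len(ref)), np.ones(len(cur))]
    w, b = logistic_fit(X, y)
    return float((cur @ w + b).mean())

ref = rng.normal(0.0, 1.0, size=(400, 2))    # pre-change window
same = rng.normal(0.0, 1.0, size=(400, 2))   # same distribution
shift = rng.normal(1.5, 1.0, size=(400, 2))  # mean-shifted distribution

print(kl_statistic(ref, same))   # near zero: no change flagged
print(kl_statistic(ref, shift))  # clearly positive: change flagged
```

In the online setting, this statistic would be recomputed as the current window slides forward, and an alarm raised once it crosses a calibrated threshold.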
The second project studies statistical inference using synthetic data generated by perturbing observed covariates and responses. Focusing on regression settings, this work develops moment-based corrections for the bias introduced by additive noise and establishes asymptotic normality and valid confidence intervals for target parameters of the original data-generating process.
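The bias problem and a moment-based fix can be sketched in the simplest case: additive covariate noise attenuates the naive least-squares slope, and subtracting the known noise variance from the second-moment term recovers a consistent estimate. This is a classical measurement-error correction used here only as an illustration; the noise scales and the exact perturbation scheme below are assumptions, not the project's actual procedure.

```python
import numpy as np

rng = np.random.default_rng(1)

# Original (private) data from a simple linear model y = beta * x + eps.
n, beta = 5000, 2.0
x = rng.normal(size=n)
y = beta * x + rng.normal(scale=0.5, size=n)

# Synthetic release: perturb covariates and responses with independent
# additive noise of known standard deviations (illustrative choice).
s_x, s_y = 0.8, 0.8
xs = x + rng.normal(scale=s_x, size=n)
ys = y + rng.normal(scale=s_y, size=n)

# Naive slope on synthetic data is biased toward zero because the
# denominator E[xs^2] is inflated by the covariate noise variance.
naive = np.sum(xs * ys) / np.sum(xs**2)

# Moment-based correction: subtract the known noise second moment
# n * s_x^2 from the denominator to target the original beta.
corrected = np.sum(xs * ys) / (np.sum(xs**2) - n * s_x**2)

print(naive, corrected)  # corrected should be close to beta = 2.0
```

Asymptotic normality of such corrected estimators is what underlies the confidence intervals for parameters of the original data-generating process.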
Together, these projects address two central challenges in statistics: detecting changes in evolving data streams and drawing reliable conclusions from synthetic data.
Advisors: Carlos Madrid Padilla and Xuming He