Representative Work Series 1: A Series of New Methods for High-Dimensional Covariate Screening and Data Dimensionality Reduction

Chang, Tang & Wu (2013, AoS; 2016, AoS) were the first to cast ultra-high-dimensional covariate screening as marginal hypothesis testing, effectively overcoming the limitations of existing methods and substantially weakening the data and model assumptions they impose. They proposed using the empirical likelihood ratio evaluated at zero as the test statistic for whether each covariate's marginal contribution is zero, which avoids the identification problems that can arise when marginal contributions are estimated directly; moreover, because empirical likelihood is self-normalizing, the statistic is robust to heteroscedasticity. Chang, Guo & Yao (2015, JoE) proposed a fast factor-based dimensionality reduction method built on the spectral decomposition of a positive definite matrix, which avoids solving ultra-high-dimensional optimization problems directly and removes the computational bottleneck of traditional methods: even when the observed data run to thousands of dimensions, the reduction completes in a few seconds on a personal laptop. Chang, Guo & Yao (2018, AoS) proposed a dimensionality reduction method based on linear transformations: the observed data are mapped to a new series whose components can be partitioned into mutually uncorrelated groups, which sidesteps the two main problems of “overparameterization” and “model non-identifiability” that arise when the original data are modeled directly. In practice, even when no such linear transformation exists exactly, applying the transformation and then modeling the resulting groups still improves predictive accuracy substantially. Chang, He, Yang & Yao (2023, JRSSB) proposed a dimensionality reduction method for complex matrix-valued time series via tensor CP decomposition, together with a fast algorithm that requires no iteration, improving on the iterative algorithms that CP decomposition commonly relies on in the literature. Chang, Fang, Qiao & Yao (2024+, JASA) propose a two-step procedure for modeling and forecasting high-dimensional functional time series: an eigenanalysis first transforms and groups the data so that distinct groups are uncorrelated, and finite-dimensional vector time series models are then fitted within each group, effectively reducing the dimension while preserving the dynamic structure.
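To make the screening statistic concrete, below is a minimal numpy/scipy sketch (not the authors' implementation) of the empirical likelihood ratio evaluated at zero as a marginal test statistic. The moment function x_ij * y_i used here is an illustrative choice corresponding to a centered linear model; the function names and the toy data are hypothetical.

import numpy as np
from scipy.optimize import minimize_scalar

def el_ratio_at_zero(g):
    # -2 log empirical likelihood ratio for H0: E[g] = 0, computed through the
    # dual problem  max_lambda  sum_i log(1 + lambda * g_i)  (scalar moment).
    g = np.asarray(g, dtype=float)
    if g.min() >= 0 or g.max() <= 0:   # zero lies outside the convex hull of {g_i},
        return np.inf                  # so the EL ratio degenerates at the boundary
    eps = 1e-8
    lo, hi = -1.0 / g.max() + eps, -1.0 / g.min() - eps
    neg_dual = lambda lam: -np.sum(np.log1p(lam * g))
    res = minimize_scalar(neg_dual, bounds=(lo, hi), method="bounded")
    return -2.0 * res.fun              # = 2 * max_lambda sum_i log(1 + lambda * g_i)

rng = np.random.default_rng(0)
n, p = 200, 1000
X = rng.standard_normal((n, p))
y = X[:, 0] + 0.5 * X[:, 1] + rng.standard_normal(n)   # only two covariates matter
y = y - y.mean()
scores = np.array([el_ratio_at_zero(X[:, j] * y) for j in range(p)])
print(np.argsort(scores)[::-1][:10])   # screening keeps covariates with the largest statistics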
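The dimensionality reduction methods in this series share the idea of replacing a high-dimensional optimization with a single spectral decomposition. The rough numpy sketch below illustrates that idea under simplifying assumptions and is not the published algorithm of any of the cited papers: lagged autocovariances are accumulated into a positive semi-definite matrix, one eigendecomposition is computed, and its eigenvectors supply the linear transformation. All names and the toy factor model are hypothetical.

import numpy as np

def lagged_autocov(x, k):
    # Sample autocovariance matrix at lag k for an (n, p) series x.
    xc = x - x.mean(axis=0)
    n = xc.shape[0]
    return xc[k:].T @ xc[:n - k] / n

def eigen_transform(x, k0=5):
    # Accumulate lagged autocovariances into W = sum_{k=1}^{k0} Sigma_k Sigma_k',
    # take one spectral decomposition of W, and return the transformed series
    # x @ A together with the eigenvalues (sorted in decreasing order).
    p = x.shape[1]
    W = np.zeros((p, p))
    for k in range(1, k0 + 1):
        S = lagged_autocov(x, k)
        W += S @ S.T                   # squaring keeps the signal and makes W >= 0
    eigvals, A = np.linalg.eigh(W)     # a single eigendecomposition, no iteration
    order = np.argsort(eigvals)[::-1]
    return x @ A[:, order], eigvals[order]

# Toy example: 50 observed series driven by 3 latent AR(1) factors.
rng = np.random.default_rng(1)
n, p, r = 400, 50, 3
f = np.zeros((n, r))
for t in range(1, n):
    f[t] = 0.7 * f[t - 1] + rng.standard_normal(r)
x = f @ rng.standard_normal((r, p)) + rng.standard_normal((n, p))
y, ev = eigen_transform(x)
print(ev[:5] / ev.sum())               # the leading eigenvalues dominate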

Representative Work Series 2: Unified Methodological System for Estimation and Inference of Ultra-High-Dimensional Models

Chang, Chen & Chen (2015, JoE) developed empirical likelihood estimation and inference for estimating equations in which both the number of equations 𝑟 and the number of parameters 𝑝 diverge. They showed that, like other estimation methods such as the generalized method of moments (GMM), empirical likelihood works in this regime only when 𝑟 and 𝑝 diverge at a very slow rate. Chang, Tang & Wu (2018, AoS) tackled parameter estimation for estimating equations when 𝑟 and 𝑝 are much larger than 𝑛 by incorporating penalties on both the parameters and the Lagrange multipliers in the empirical likelihood objective, thereby establishing a unified methodology for solving ultra-high-dimensional estimating equations. Chang, Chen, Tang & Wu (2021, BKA) developed a unified, rotation-based inference method that does not rely on bias correction; it systematically addresses inference for ultra-high-dimensional estimating equations and provides, for the first time, an over-identification test for such equations. Chang, Shi & Zhang (2023, JBES) systematically studied estimation and inference in high-dimensional moment constraint models when some moment conditions may be misspecified, proposing a penalized empirical likelihood method and establishing corresponding criteria for identifying the valid moment conditions. Chang, Tang & Zhu (2025+, JRSSB) draw on Bayesian sampling theory to propose two classes of sampling algorithms for solving ultra-high-dimensional estimating equations; the algorithms locate the global optimum to high precision without depending on the choice of initial values, systematically addressing the computational challenges inherent in this problem.
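As a deliberately low-dimensional illustration of the empirical likelihood machinery behind this series, the sketch below profiles out the Lagrange multiplier at a fixed parameter value and then minimizes the resulting statistic over the parameter. The penalties on the parameters and on the multipliers that the cited papers introduce for the regime where 𝑟 and 𝑝 exceed 𝑛 are omitted, and the over-identified moment functions for a mean are a toy example, not taken from the papers.

import numpy as np
from scipy.optimize import minimize

def neg2_log_el_ratio(theta, g_fun, data):
    # -2 log empirical likelihood ratio at theta, i.e.
    # 2 * max_lambda sum_i log(1 + lambda' g_i(theta)).
    G = g_fun(data, theta)                      # (n, r) matrix of moment functions
    def neg_dual(lam):
        arg = 1.0 + G @ lam
        if np.any(arg <= 1e-10):                # keep the implied EL weights positive
            return 1e10
        return -np.sum(np.log(arg))
    res = minimize(neg_dual, np.zeros(G.shape[1]), method="Nelder-Mead")
    return -2.0 * res.fun

# Toy over-identified moments for a mean: g(x; theta) = (x - theta, (x - theta)^2 - 1).
def g_fun(x, theta):
    return np.column_stack([x - theta, (x - theta) ** 2 - 1.0])

rng = np.random.default_rng(2)
x = rng.standard_normal(500)
grid = np.linspace(-0.3, 0.3, 61)
profile = [neg2_log_el_ratio(t, g_fun, x) for t in grid]
print(grid[int(np.argmin(profile))])            # the maximum EL estimate on the grid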

Representative Work Series 3: New Theories of Statistical Inference Based on Gaussian Approximation

Chang, Yao & Zhou (2017, BKA) gave the first solution to the white noise testing problem for ultra-high-dimensional time series. Chang, Jiang & Shao (2023, JoE) extended that method and were the first to solve the more general problem of ultra-high-dimensional martingale difference testing. Chang, Zheng, Zhou & Zhou (2017, Biometrics) and Chang, Zhou, Zhou & Wang (2017, Biometrics) provided ultra-high-dimensional mean and covariance testing methods that remain valid under arbitrary correlation structures among the components of the data. Chang, Qiu, Yao & Zou (2018, JoE) developed a method for constructing confidence regions for ultra-high-dimensional precision matrices and used it to study how the connectivity between different sectors of the U.S. stock market changed before and after the 2008 financial crisis. Chang, He, Kang & Wu (2024, JASA) proposed a fast, parametric-bootstrap-based inference method for the dependence structure of multimodal brain imaging data that requires no assumptions about the correlations between brain regions; applying it to the multi-task fMRI data from the Human Connectome Project led to new findings in brain science. Chang, Chen & Wu (2024, Bernoulli) established ultra-high-dimensional central limit theorems under three distinct dependence frameworks: alpha-mixing, m-dependence, and physical dependence. These results hold uniformly over hyperrectangles, simple convex sets, and sparse convex sets, and the paper also proposes a parametric bootstrap procedure based on kernel-type estimators to address the practical challenge that the long-run covariance is unknown. Chang, Hu, Kolaczyk, Yao & Yi (2024, AoS) proposed a novel method, based on a jittering mechanism and moment estimation, for efficiently estimating the parameters of the β-model while ensuring a specified level of data privacy; robust statistical inference is achieved via an adaptive bootstrap procedure that offers both theoretical guarantees and computational advantages. Chang, Jiang, McElroy & Shao (2025+, JASA) are the first to apply Gaussian approximation and the parametric bootstrap to high-dimensional parameter inference in the frequency domain, introducing a novel global test for high-dimensional spectral density matrices along with a multiple testing procedure that controls the false discovery rate (FDR).
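A common thread in this series is a maximum-type statistic whose null distribution is approximated by a bootstrap justified through Gaussian approximation, without assuming any particular dependence among the components. The toy sketch below shows the recipe in its simplest form, a one-sample high-dimensional mean test calibrated by a Gaussian multiplier bootstrap; the cited papers treat far more involved statistics (autocovariances, precision matrices, spectral densities) and serially dependent data, and all names here are hypothetical.

import numpy as np

def max_mean_test(x, n_boot=2000, seed=0):
    # Test H0: E[x_t] = 0 with T = max_j |sqrt(n) * xbar_j / s_j|; the null
    # distribution of T is approximated by a Gaussian multiplier bootstrap.
    rng = np.random.default_rng(seed)
    n, p = x.shape
    s = x.std(axis=0, ddof=1)
    T = np.max(np.abs(np.sqrt(n) * x.mean(axis=0) / s))
    xc = (x - x.mean(axis=0)) / s          # studentized, centered data
    boot = np.empty(n_boot)
    for b in range(n_boot):
        e = rng.standard_normal(n)         # Gaussian multiplier weights
        boot[b] = np.max(np.abs(e @ xc / np.sqrt(n)))
    return T, np.mean(boot >= T)           # statistic and bootstrap p-value

rng = np.random.default_rng(3)
x = rng.standard_normal((100, 2000))       # p much larger than n; dependence left arbitrary
T, pval = max_mean_test(x)
print(T, pval)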