Research Outputs

张佳, & Chen, X. (2020). Principal envelope model. Journal of Statistical Planning and Inference, 206, 249-262.

Principal component analysis (PCA) is widely used in various fields to reduce high-dimensional data sets to lower dimensions. Traditionally, the first few principal components that capture most of the variance in the data are thought to be important. Tipping and Bishop (1999) introduced probabilistic principal component analysis (PPCA), in which they assumed an isotropic error in a latent variable model. Motivated by a general error structure and incorporating the novel idea of the “envelope” proposed by Cook et al. (2010), we construct principal envelope models (PEM), which demonstrate the possibility that any subset of the principal components could retain most of the sample’s information. The useful principal components can be found through maximum likelihood approaches. We also embed the PEM in a factor model setting to illustrate its reasonableness and validity. Numerical results indicate the potential of the proposed method.
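The variance-capturing role of the leading components can be seen in a minimal numerical sketch. This is ordinary PCA via an eigendecomposition of the sample covariance, not the PEM likelihood itself; all dimensions and variances below are made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: 3 latent directions with very different variances,
# mimicking the setting where a few components dominate the variance.
n, p = 500, 5
latent = rng.normal(size=(n, 3)) * np.array([5.0, 2.0, 0.5])
loadings = rng.normal(size=(3, p))
X = latent @ loadings + 0.1 * rng.normal(size=(n, p))

# Classical PCA: eigendecomposition of the sample covariance matrix.
Xc = X - X.mean(axis=0)
cov = Xc.T @ Xc / (n - 1)
eigvals, eigvecs = np.linalg.eigh(cov)   # eigh returns ascending order
order = np.argsort(eigvals)[::-1]        # re-sort descending
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

explained = eigvals / eigvals.sum()      # proportion of variance per component
print(explained)
```

The point of the PEM is that the informative subset need not be the leading block of this `explained` vector; the envelope likelihood lets the data pick the subset.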

张佳, Shi, H., Tian, L., & Xiao, F. (2019). Penalized generalized empirical likelihood in high-dimensional weakly dependent data. Journal of Multivariate Analysis, 171, 270-283.

In this paper, we propose a penalized generalized empirical likelihood (PGEL) approach, based on the smoothed moment functions of Anatolyev (2005) and Smith (1997, 2004), for parameter estimation and variable selection in the growing (high-)dimensional weakly dependent time series setting. The dimensions of the parameters and the moment restrictions are both allowed to grow with the sample size at some moderate rates. The asymptotic properties of the smoothed generalized empirical likelihood (SGEL) estimator and its penalized version (SPGEL) are then obtained by properly restricting the degree of data dependence. It is shown that the SPGEL estimator maintains the oracle property despite the data dependence and the growing (high) dimensionality. We finally present simulation results and a real data analysis to illustrate the finite-sample performance and applicability of the proposed method.

张佳, & Chen, X. (2019). Robust sufficient dimension reduction via ball covariance. Computational Statistics & Data Analysis, 140, 144-154.

Sufficient dimension reduction is an important branch of dimension reduction, which includes variable selection and projection methods. Most of the sufficient dimension reduction methods are sensitive to outliers and heavy-tailed predictors, and require strict restrictions on the predictors and the response. In order to widen the applicability of sufficient dimension reduction, we propose BCov-SDR, a novel sufficient dimension reduction approach that is based on a recently developed dependence measure: ball covariance. Compared with other popular sufficient dimension reduction methods, our approach requires rather mild conditions on the predictors and the response, and is robust to outliers or heavy-tailed distributions. BCov-SDR does not require the specification of a forward regression model and allows for discrete or categorical predictors and multivariate response. The consistency of the BCov-SDR estimator of the central subspace is obtained without imposing any moment conditions on the predictors. Simulations and real data studies illustrate the applicability and versatility of our proposed method.
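BCov-SDR itself requires the ball-covariance machinery, but the estimation target, the central subspace, can be illustrated with a classical SDR method. Below is a minimal sliced inverse regression (SIR) sketch on synthetic single-index data; unlike BCov-SDR, SIR needs moment conditions and is not robust to heavy tails, which is exactly the gap the paper addresses:

```python
import numpy as np

rng = np.random.default_rng(1)

def sir(X, y, n_slices=10, d=1):
    """Sliced inverse regression (Li, 1991): a classical SDR baseline."""
    n, p = X.shape
    Sigma = np.cov(X, rowvar=False)
    evals, evecs = np.linalg.eigh(Sigma)
    Sigma_inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
    Z = (X - X.mean(axis=0)) @ Sigma_inv_sqrt     # standardized predictors
    slices = np.array_split(np.argsort(y), n_slices)
    M = np.zeros((p, p))
    for idx in slices:                            # weighted cov of slice means
        m = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)
    w, v = np.linalg.eigh(M)
    B = Sigma_inv_sqrt @ v[:, np.argsort(w)[::-1][:d]]  # back to X scale
    return B / np.linalg.norm(B, axis=0)

# Single-index model: y depends on X only through b'X.
n, p = 2000, 6
b = np.zeros(p)
b[:2] = 1.0
X = rng.normal(size=(n, p))
y = X @ b + 0.25 * rng.normal(size=n)
b_hat = sir(X, y)[:, 0]    # estimated central-subspace direction
```

Replacing the slice-mean covariance `M` with a ball-covariance-based objective is, loosely speaking, what gives BCov-SDR its robustness and its tolerance of discrete predictors and multivariate responses.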

鲁万波, 陈骋, & 王建业. (2019). Irregularly spaced intraday Value at Risk of asset portfolios. 数理统计与管理, 38, 1104-1118.

Existing research on portfolio Value at Risk (VaR) has been limited to models built on regularly spaced data. This paper proposes an Irregularly Spaced Intraday Value at Risk (ISIVaR) approach for asset portfolios, which overcomes the irregular spacing and non-synchronicity of tick-by-tick transaction data and exploits the rich market microstructure information contained in such data to estimate VaR. The method first synchronizes the non-synchronous marked point processes of the portfolio assets via the refresh-time scheme; it then builds an irregularly spaced intraday volatility model for the portfolio and captures the cross-sectional dependence among the assets using copula theory; finally, exploiting this cross-sectional dependence, the ISIVaR is estimated by Monte Carlo simulation. An empirical study with real tick-by-tick transaction data demonstrates the effectiveness of the proposed method.
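The final Monte Carlo step can be sketched in a few lines: draw correlated shocks through a Gaussian copula, map them to asset returns, and read the VaR off the simulated portfolio quantile. The correlation, volatilities and weights below are hypothetical, and regular spacing with fixed volatilities is assumed purely for brevity (the paper's point is precisely to avoid that assumption by modelling irregularly spaced intraday volatility):

```python
import numpy as np

rng = np.random.default_rng(2)

rho = 0.6                               # assumed cross-asset correlation
sigma = np.array([0.02, 0.03])          # assumed per-period volatilities
weights = np.array([0.5, 0.5])          # portfolio weights

corr = np.array([[1.0, rho], [rho, 1.0]])
L = np.linalg.cholesky(corr)

n_sim = 100_000
z = rng.normal(size=(n_sim, 2)) @ L.T   # correlated Gaussian copula draws
returns = z * sigma                     # Gaussian marginals for simplicity
port = returns @ weights

alpha = 0.01
var_99 = -np.quantile(port, alpha)      # 99% one-period VaR (loss, so positive)
print(round(var_99, 4))
```

With Gaussian marginals the copula step is trivial; in the paper's setting the marginals would instead come from the fitted irregularly spaced intraday volatility models.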

常晋源, Tang, C. Y., & Wu, T. T. (2018). A new scope of penalized empirical likelihood with high-dimensional estimating equations. Annals of Statistics, 46, 3185-3216.

Statistical methods with empirical likelihood (EL) are appealing and effective, especially in conjunction with estimating equations for flexibly and adaptively incorporating data information. It is known that EL approaches encounter difficulties when dealing with high-dimensional problems. To overcome the challenges, we begin our study by investigating high-dimensional EL from a new scope targeting high-dimensional sparse model parameters. We show that the new scope provides an opportunity for relaxing the stringent requirement on the dimensionality of the model parameters. Motivated by the new scope, we then propose a new penalized EL by applying two penalty functions respectively regularizing the model parameters and the associated Lagrange multiplier in the optimizations of EL. By penalizing the Lagrange multiplier to encourage its sparsity, a drastic dimension reduction in the number of estimating equations can be achieved. Most attractively, such a reduction in dimensionality of estimating equations can be viewed as a selection among those high-dimensional estimating equations, resulting in a highly parsimonious and effective device for estimating high-dimensional sparse model parameters. Allowing both the dimensionalities of model parameters and estimating equations to grow exponentially with the sample size, our theory demonstrates that our new penalized EL estimator is sparse and consistent with asymptotically normally distributed nonzero components. Numerical simulations and a real data analysis show that the proposed penalized EL works promisingly.
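The building block being scaled up here is classical empirical likelihood with a Lagrange multiplier. A minimal one-dimensional sketch (EL for a scalar mean, multiplier found by bisection on its monotone score; this is the textbook estimator, not the paper's penalized version):

```python
import numpy as np

rng = np.random.default_rng(3)

def el_log_ratio(x, mu):
    """-2 log empirical likelihood ratio for the mean; ~ chi^2_1 at the truth."""
    xc = x - mu
    if xc.min() >= 0 or xc.max() <= 0:
        return np.inf                     # mu outside the convex hull of the data
    # Feasible multiplier range keeps all weights 1/(n(1 + lam*xc_i)) positive.
    lo = -1.0 / xc.max() + 1e-10
    hi = -1.0 / xc.min() - 1e-10
    def score(lam):                       # derivative of the EL objective in lam
        return np.sum(xc / (1.0 + lam * xc))
    for _ in range(200):                  # bisection: score is decreasing in lam
        mid = 0.5 * (lo + hi)
        if score(mid) > 0:
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    return 2.0 * np.sum(np.log1p(lam * xc))

x = rng.normal(loc=1.0, scale=2.0, size=200)
stat_true = el_log_ratio(x, 1.0)   # at the true mean: a moderate chi^2_1 draw
stat_far = el_log_ratio(x, 3.0)    # far from the mean: very large
print(stat_true, stat_far)
```

The paper's contribution is what happens when both the parameter and the multiplier above become high-dimensional vectors, each with its own penalty.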

常晋源, Guo, B., & Yao, Q. (2018). Principal component analysis for second-order stationary vector time series. Annals of Statistics, 46, 2094-2124.

We extend principal component analysis (PCA) to second-order stationary vector time series in the sense that we seek a contemporaneous linear transformation for a p-variate time series such that the transformed series is segmented into several lower-dimensional subseries, and those subseries are uncorrelated with each other both contemporaneously and serially. Therefore, those lower-dimensional series can be analyzed separately as far as the linear dynamic structure is concerned. Technically, it boils down to an eigenanalysis for a positive definite matrix. When p is large, an additional step is required to perform a permutation in terms of either maximum cross-correlations or FDR based on multiple tests. The asymptotic theory is established for both fixed p and diverging p when the sample size n tends to infinity. Numerical experiments with both simulated and real data sets indicate that the proposed method is an effective initial step in analyzing multiple time series data, which leads to substantial dimension reduction in modelling and forecasting high-dimensional linear dynamical structures. Unlike PCA for independent data, there is no guarantee that the required linear transformation exists. When it does not, the proposed method provides an approximate segmentation which leads to advantages in, for example, forecasting future values. The method can also be adapted to segment multiple volatility processes.
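The eigenanalysis step can be sketched as follows: whiten the series, accumulate a positive definite matrix W from a few lagged autocovariances of the whitened series, and use the eigenvectors of W as the contemporaneous transformation. The two-dimensional example below is synthetic (two independent AR(1) components mixed by an arbitrary matrix), the number of lags is a made-up choice, and the permutation step needed for large p is omitted:

```python
import numpy as np

rng = np.random.default_rng(4)

def ar1(phi, n, rng, burn=100):
    """Simulate a stationary AR(1) path with unit-variance innovations."""
    e = rng.normal(size=n + burn)
    z = np.zeros(n + burn)
    for t in range(1, n + burn):
        z[t] = phi * z[t - 1] + e[t]
    return z[burn:]

n, k0 = 2000, 3

# Two independent AR(1) components, mixed by an invertible matrix A.
Z = np.column_stack([ar1(0.8, n, rng), ar1(-0.5, n, rng)])
A = np.array([[1.0, 2.0], [0.5, -1.0]])
X = Z @ A.T

# Step 1: whiten so the sample covariance of Y is the identity.
Xc = X - X.mean(axis=0)
S0 = Xc.T @ Xc / n
evals, evecs = np.linalg.eigh(S0)
S0_inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
Y = Xc @ S0_inv_sqrt

# Step 2: eigenanalysis of W = I + sum_k S_k S_k^T over a few lags.
W = np.eye(2)
for k in range(1, k0 + 1):
    Sk = Y[k:].T @ Y[:-k] / n          # lag-k autocovariance of whitened series
    W += Sk @ Sk.T

_, B = np.linalg.eigh(W)               # eigenvectors: the sought transformation
T = Y @ B                              # transformed series, ideally segmentable
```

If the segmentation exists, the columns of `T` should be (approximately) uncorrelated with each other at all lags, so each can then be modelled on its own.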

常晋源, Qiu, Y., Yao, Q., & Zou, T. (2018). Confidence regions for entries of a large precision matrix. Journal of Econometrics, 206, 57-82.

We consider the statistical inference for high-dimensional precision matrices. Specifically, we propose a data-driven procedure for constructing a class of simultaneous confidence regions for a subset of the entries of a large precision matrix. The confidence regions can be applied to test for specific structures of a precision matrix, and to recover its nonzero components. We first construct an estimator for the precision matrix via penalized nodewise regression. We then develop the Gaussian approximation to approximate the distribution of the maximum difference between the estimated and the true precision coefficients. A computationally feasible parametric bootstrap algorithm is developed to implement the proposed procedure. The theoretical justification is established under the setting which allows temporal dependence among observations. Therefore the proposed procedure is applicable to both independent and identically distributed data and time series data. Numerical results with both simulated and real data confirm the good performance of the proposed method.
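The nodewise-regression construction can be illustrated in a low-dimensional setting where plain least squares stands in for the paper's penalized regressions: regress each coordinate on the remaining ones, then assemble the precision matrix from the residual variances and slopes. In this p << n case the construction reproduces the inverse sample covariance exactly:

```python
import numpy as np

rng = np.random.default_rng(5)

n, p = 500, 4
X = rng.normal(size=(n, p))
X[:, 1] += 0.7 * X[:, 0]           # induce some dependence between coordinates
Xc = X - X.mean(axis=0)
S = Xc.T @ Xc / n                  # sample covariance

Theta = np.zeros((p, p))
for j in range(p):
    others = [k for k in range(p) if k != j]
    # Regress coordinate j on all the others (unpenalized; the paper
    # uses a lasso-type penalty here to handle high dimensions).
    gamma, *_ = np.linalg.lstsq(Xc[:, others], Xc[:, j], rcond=None)
    resid = Xc[:, j] - Xc[:, others] @ gamma
    tau2 = resid @ resid / n       # residual variance of nodewise regression
    Theta[j, j] = 1.0 / tau2       # diagonal entry of the precision matrix
    Theta[j, others] = -gamma / tau2   # off-diagonal entries of row j
```

The paper's procedure then adds a Gaussian approximation and a parametric bootstrap on top of the (penalized) version of `Theta` to obtain simultaneous confidence regions for its entries.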

常晋源, Delaigle, A., Hall, P., & Tang, C. Y. (2018). A frequency domain analysis of the error distribution from noisy high-frequency data. Biometrika, 105, 353-369.

Data observed at a high sampling frequency are typically assumed to be an additive composite of a relatively slow-varying continuous-time component, a latent stochastic process or smooth random function, and measurement error. Supposing that the latent component is an Itô diffusion process, we propose to estimate the measurement error density function by applying a deconvolution technique with appropriate localization. Our estimator, which does not require equally-spaced observed times, is consistent and minimax rate-optimal. We also investigate estimators of the moments of the error distribution and their properties, propose a frequency domain estimator for the integrated volatility of the underlying stochastic process, and show that it achieves the optimal convergence rate. Simulations and an application to real data validate our analysis.
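A simple moment-based cousin of the paper's error-moment estimators: at the highest sampling frequency the squared increments of the observed process are dominated by the measurement error, so half the mean squared increment estimates the error variance. The volatility and noise level below are made up, and the full deconvolution estimator of the error *density* is much more involved:

```python
import numpy as np

rng = np.random.default_rng(6)

n = 100_000
dt = 1.0 / n
# Latent Ito process: here just Brownian motion with (hypothetical) volatility 0.2.
latent = np.cumsum(0.2 * np.sqrt(dt) * rng.normal(size=n))
sigma_eps = 0.005
obs = latent + sigma_eps * rng.normal(size=n)    # observed = latent + noise

# At frequency dt -> 0, E[(dY)^2] = sigma^2 * dt + 2 * sigma_eps^2,
# and the diffusion term is negligible, giving the moment estimator:
dY = np.diff(obs)
sigma_eps2_hat = np.mean(dY ** 2) / 2.0
print(sigma_eps2_hat)    # should be close to sigma_eps**2 = 2.5e-05
```

The paper works in the frequency domain instead, which recovers not just the variance but the whole error distribution, without requiring equally spaced observation times.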

常晋源, Guo, J., & Tang, C. Y. (2018). Peter Hall’s contribution to empirical likelihood. Statistica Sinica, 28, 2375-2387.

We deeply mourn the loss of Peter Hall. Peter was the premier mathematical statistician of his era. His work illuminated many aspects of statistical thought. While his body of work on bootstrap and nonparametric smoothing is widely known and appreciated, less well known is his work in many other areas. In this article, we review Peter’s contribution to empirical likelihood (EL). Peter has done fundamental work on studying the coverage accuracy of confidence regions constructed with EL.
