Invited Speaker Series

Fall 2022

A structured covariance ensemble for sufficient dimension reduction

Dr. Yuan Xue, Associate Professor | University of International Business and Economics

Date: November 4, 2022, 8:30-9:30am, virtual via Zoom

Abstract: Sufficient dimension reduction (SDR) is a useful tool for high-dimensional data analysis. SDR aims to reduce the data dimensionality without losing regression information between the response and its high-dimensional predictors. Many existing SDR methods are designed for data with continuous responses. Motivated by recent work on aggregate dimension reduction, we propose a unified SDR framework for both continuous and binary responses through a structured covariance ensemble. The connection with existing approaches is discussed in detail, and an efficient algorithm is proposed. Numerical examples and a real data application demonstrate its satisfactory performance.

On projective resampling for sufficient dimension reduction with random response objects

Dr. Abdul-Nasah Soalse | University of Notre Dame

Date: October 22, 2022, 8:30-9:30am, virtual via Zoom

Abstract: Technological advancement has led to the collection of novel data whose response objects may not lie in Euclidean space. Typical scenarios include cases where the response objects are probability distributions, covariance matrices, graph Laplacians, or points on spheres. A sufficient dimension reduction (SDR) method to solve these types of problems using a projective resampling technique is proposed. The complex response objects are first mapped to a real-valued distance matrix using an appropriate metric, which is then projected onto a unit vector on a hypersphere to obtain a univariate Euclidean-valued response. Based on the projected response, the corresponding dimension reduction subspace in the direction of the unit vector is estimated using classical SDR methods such as ordinary least squares, sliced inverse regression, and sliced average variance estimation. Several vectors on the unit hypersphere are sampled, and their subspaces are averaged to estimate the central subspace. The projection technique avoids the curse of dimensionality associated with generating high-dimensional kernels and relies on fewer tuning parameters while preserving the joint distribution of the response object. An extensive simulation study demonstrates the performance of our proposal on synthetic data. An analysis of the distribution of county-level COVID-19 spread in the United States as a function of socio-economic and demographic characteristics is also provided, along with theoretical justifications.
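A rough sketch of the resampling idea described in the abstract (our own illustrative reading, not the speaker's implementation): map the response objects to a pairwise distance matrix, project its rows onto random unit vectors to obtain univariate surrogate responses, run a classical SDR method such as sliced inverse regression (SIR) on each, and average the resulting directions. The toy data and all variable names below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def sir_direction(X, y, n_slices=5):
    """Leading direction from sliced inverse regression (SIR):
    top eigenvector of Cov(E[X | y]) after whitening X."""
    n, p = X.shape
    Xc = X - X.mean(axis=0)
    cov = Xc.T @ Xc / n + 1e-8 * np.eye(p)
    L = np.linalg.cholesky(np.linalg.inv(cov))   # whitening transform
    Z = Xc @ L
    order = np.argsort(y)
    M = np.zeros((p, p))
    for idx in np.array_split(order, n_slices):  # slice by response rank
        m = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)
    _, V = np.linalg.eigh(M)
    beta = L @ V[:, -1]                          # map back to original scale
    return beta / np.linalg.norm(beta)

# Toy data: responses are points in R^3 (standing in for metric-space
# objects), driven by a single index of the predictors.
n, p = 300, 5
X = rng.normal(size=(n, p))
signal = X[:, 0] + 0.5 * X[:, 1]
Y = signal[:, None] + 0.3 * rng.normal(size=(n, 3))
D = np.linalg.norm(Y[:, None, :] - Y[None, :, :], axis=-1)  # pairwise metric

# Projective resampling: random unit vectors turn the distance matrix into
# univariate responses; average the rank-one projectors of SIR directions.
B = np.zeros((p, p))
K = 50
for _ in range(K):
    u = rng.normal(size=n)
    u /= np.linalg.norm(u)
    b = sir_direction(X, D @ u)
    B += np.outer(b, b)
_, V = np.linalg.eigh(B / K)
est = V[:, -1]   # estimated central-subspace direction
```

In this toy model the estimated direction should align closely (up to sign) with the true index direction proportional to (1, 0.5, 0, 0, 0).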

Spring 2023

Sufficient Dimension Reduction for Metric-Space Valued Responses and High-Dimensional Graphical Predictors

Dr. Jiaying Weng | Bentley University

Date: April 7, 2023

For the Zoom link, time, and schedule information, contact Vickie Sandlin.

Abstract: Fréchet regression has received considerable attention as a way to model metric-space valued responses, that is, complex non-Euclidean data such as probability distributions and vectors on the unit sphere. However, the existing Fréchet regression literature focuses on the classical setting where the predictor dimension is fixed and the sample size goes to infinity. This paper proposes sparse Fréchet sufficient dimension reduction with a graphical structure among high-dimensional Euclidean predictors. Sufficient dimension reduction aims to find linear combinations of predictors without losing regression information on the response. In particular, we propose a convex optimization problem that leverages the graphical information among predictors and avoids inverting the high-dimensional covariance matrix. We also provide an Alternating Direction Method of Multipliers (ADMM) algorithm to solve the optimization problem. Theoretically, the proposed method achieves subspace estimation and variable selection consistency under suitable conditions. Extensive simulations and real data analyses illustrate the finite-sample performance of the proposed method.
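The abstract's exact convex objective is not stated here, but the ADMM machinery it mentions can be illustrated on a standard problem. Below is a textbook ADMM solver for the lasso (a generic sketch of our own, not the paper's algorithm): the splitting x = z lets a smooth quadratic step alternate with a closed-form soft-thresholding step, the same pattern used for sparse estimators more generally.

```python
import numpy as np

def admm_lasso(A, b, lam=0.1, rho=1.0, n_iter=200):
    """ADMM for min_x 0.5*||Ax - b||^2 + lam*||z||_1 subject to x = z."""
    n, p = A.shape
    AtA = A.T @ A
    Atb = A.T @ b
    Q = np.linalg.inv(AtA + rho * np.eye(p))  # factor once, reuse each iteration
    x = np.zeros(p)
    z = np.zeros(p)
    u = np.zeros(p)                           # scaled dual variable
    for _ in range(n_iter):
        x = Q @ (Atb + rho * (z - u))         # smooth quadratic step
        v = x + u
        z = np.sign(v) * np.maximum(np.abs(v) - lam / rho, 0.0)  # soft-threshold
        u = u + x - z                         # dual update
    return z

# Hypothetical sparse-recovery example
rng = np.random.default_rng(0)
A = rng.normal(size=(100, 20))
x_true = np.zeros(20)
x_true[[0, 3, 7]] = [2.0, -3.0, 1.5]
b = A @ x_true
x_hat = admm_lasso(A, b, lam=0.1)
```

With noiseless data and a small penalty, the estimate recovers the three-element support almost exactly.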


A Family of Orthogonal Main Effects Screening Designs for Mixed-Level Factors

Dr. Bradley Jones, Distinguished Research Fellow |  JMP Statistical Discovery LLC

Date: April 14, 2023, 11:40-1:00pm EST (via Zoom)

Abstract: There is little literature on screening experiments in which some factors are at three levels and others are at two levels. Two well-known and well-worn examples are Taguchi's L18 and L36 designs. However, these designs are limited in two ways. First, they only allow for either 18 or 36 runs, which is restrictive. Second, they provide no protection against bias of the main effects due to active two-factor interactions (2FIs). In this talk, I will introduce a family of orthogonal, mixed-level screening designs in multiples of eight runs. The 16-run design can accommodate up to four continuous three-level factors and up to eight two-level factors; the two-level factors can be either continuous or categorical. All of the designs provide substantial protection of the main-effects estimates against bias from active 2FIs. I will show a direct construction of these designs (no optimization algorithm necessary!) using the JSL commands hadamard() and direct product().
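The speaker's construction uses JMP's scripting language; the algebraic engine behind such direct constructions, Hadamard matrices and their Kronecker (direct) products, can be sketched in a few lines. This is a generic illustration, not the talk's exact recipe.

```python
import numpy as np

def sylvester_hadamard(k):
    """Sylvester construction of a 2^k x 2^k Hadamard matrix."""
    H = np.array([[1]])
    for _ in range(k):
        H = np.block([[H, H], [H, -H]])
    return H

H8 = sylvester_hadamard(3)   # 8 x 8, entries +/-1, satisfies H H^T = 8 I

# Kronecker (direct) products of Hadamard matrices are again Hadamard,
# which is why run sizes that are multiples of a base design stay orthogonal.
D16 = np.kron(sylvester_hadamard(1), H8)   # 16-run orthogonal +/-1 array
```

Columns of such arrays serve directly as two-level factors; the three-level columns of a mixed-level design are built from combinations of them, which is where the talk's specific family comes in.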


Nonparametric vs. Parametric Regression

Dr. Seonjin Kim | Miami University

Date: April 28, 2023, 11:40 a.m. - 1 p.m.

Abstract: To understand nonparametric regression, we should first understand what a parametric model is. Simply speaking, a parametric regression model rests on a number of assumptions, and nonparametric regression relaxes them. I will introduce the assumptions underlying the parametric regression model and show how the nonparametric regression model relaxes them. Their respective pros and cons will also be presented.
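A small illustration of the contrast (a hypothetical example of our own, not the speaker's): fit the same nonlinear data with a two-parameter linear model and with a Nadaraya-Watson kernel smoother, which assumes no functional form.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
x = np.sort(rng.uniform(-3, 3, n))
y = np.sin(x) + 0.2 * rng.normal(size=n)   # true regression function is nonlinear

# Parametric: assume E[y|x] = a + b*x and estimate just two numbers.
b, a = np.polyfit(x, y, 1)
lin_fit = a + b * x

# Nonparametric: Nadaraya-Watson kernel smoother; a local weighted average
# with a Gaussian kernel, no assumed functional form.
def nw_smoother(x0, x, y, h=0.4):
    w = np.exp(-0.5 * ((x0[:, None] - x[None, :]) / h) ** 2)
    return (w @ y) / w.sum(axis=1)

nw_fit = nw_smoother(x, x, y)

# Compare fits against the true mean function sin(x)
mse_lin = np.mean((lin_fit - np.sin(x)) ** 2)
mse_nw = np.mean((nw_fit - np.sin(x)) ** 2)
```

The trade-off the talk describes shows up directly: the linear model is simple and interpretable but badly biased here, while the smoother tracks the curve at the cost of choosing a tuning parameter (the bandwidth h) in place of a functional form.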