My research lies at the intersection of geometry, machine learning, and biomedical data analysis. I study symmetric positive definite (SPD) matrices, such as covariance and correlation matrices, which arise in neuroimaging, EEG, and other spatio-temporal systems. Rather than treating these objects as points in standard Euclidean space, I work in non-Euclidean geometric frameworks that model them as points on a manifold equipped with intrinsic distances.
Functional Connectivity in fMRI
Functional connectivity from fMRI encodes interactions between brain regions as covariance or correlation matrices. Using our Bures–Wasserstein-based tangent-space pipeline, these representations reveal key regions of interest (ROIs) associated with the diagnosis of neurological disorders.
Under the Bures–Wasserstein metric, covariance matrices are mapped from the manifold to the tangent space via barycenter projection, where classical machine learning algorithms can operate in a linear space that still preserves intrinsic manifold structure.
The projected matrices are then classified in the tangent space.
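The tangent-space step above can be sketched as follows. This is a minimal illustration, not the pipeline itself: the function names are hypothetical, the Euclidean mean stands in for the true Bures–Wasserstein barycenter, and the log-map formula used is one common closed form for this metric.

```python
import numpy as np
from scipy.linalg import sqrtm

def bw_log(M, A):
    """Bures-Wasserstein log map of SPD matrix A at base point M (one common
    closed form): Log_M(A) = (M A)^{1/2} + (A M)^{1/2} - 2 M, a symmetric
    matrix living in the tangent space at M."""
    MA_half = np.real(sqrtm(M @ A))   # (A M)^{1/2} is its transpose
    return MA_half + MA_half.T - 2.0 * M

# Toy SPD covariance estimates from random data
rng = np.random.default_rng(0)
covs = []
for _ in range(4):
    X = rng.standard_normal((50, 5))
    covs.append(X.T @ X / 50)

# Surrogate barycenter: Euclidean mean (the actual pipeline would use the
# Bures-Wasserstein barycenter as the base point)
M = sum(covs) / len(covs)

# Project each matrix to the tangent space and vectorize the upper triangle,
# so a classical (linear-space) classifier can consume the features.
features = np.stack([bw_log(M, C)[np.triu_indices(5)] for C in covs])
print(features.shape)  # (4, 15)
```

The vectorized tangent features can then be fed to any standard classifier (logistic regression, SVM, etc.) while the projection step preserves the manifold geometry locally around the barycenter.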
In Progress: "Functional Connectome Classification in the Tangent Space of Bures–Wasserstein Geometry"
Michael Zirpoli*, Yuyan Yi*, Shu-Chin Lin, Linquang Ge, Jingyi Zheng, "Towards Classification of Covariance Matrices via Bures-Wasserstein-Based Machine Learning", Proceedings of the 2024 7th Artificial Intelligence and Cloud Computing Conference, 2024.
Matrices are embedded into a reproducing kernel Hilbert space (RKHS) using positive-definite kernels that enable non-linear learning. We established several new positive-definite kernels under the Bures–Wasserstein metric, including linear, polynomial, and generalized Gaussian kernels.
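A Gaussian-type kernel of this kind can be sketched by exponentiating the squared Bures–Wasserstein distance. The function names and the bandwidth parameter `gamma` below are illustrative only; establishing that such kernels are actually positive definite is the technical content of the work, not something this sketch demonstrates.

```python
import numpy as np
from scipy.linalg import sqrtm

def bw_dist2(A, B):
    """Squared Bures-Wasserstein distance:
    d^2(A, B) = tr(A) + tr(B) - 2 tr((A^{1/2} B A^{1/2})^{1/2})."""
    A_half = np.real(sqrtm(A))
    cross = np.real(sqrtm(A_half @ B @ A_half))
    return max(np.trace(A) + np.trace(B) - 2.0 * np.trace(cross), 0.0)

def bw_gaussian_gram(mats, gamma=1.0):
    """Gram matrix of the Gaussian-type kernel k(A, B) = exp(-gamma * d_BW^2(A, B))."""
    n = len(mats)
    K = np.empty((n, n))
    for i in range(n):
        for j in range(i, n):
            K[i, j] = K[j, i] = np.exp(-gamma * bw_dist2(mats[i], mats[j]))
    return K

# Toy SPD matrices
rng = np.random.default_rng(1)
mats = []
for _ in range(3):
    X = rng.standard_normal((30, 4))
    mats.append(X.T @ X / 30)

K = bw_gaussian_gram(mats, gamma=0.5)
```

A precomputed Gram matrix like `K` can be passed directly to kernel methods, e.g. scikit-learn's `SVC(kernel="precomputed")`.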
In Progress: "Positive Definite Kernels for the Bures-Wasserstein Metric on SPD Matrices"
Varying the parameter of the α-Procrustes metric reshapes the embedding geometry, demonstrating how the choice of metric fundamentally influences structure in geometry-aware learning.
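One way to see the effect of the parameter is to compute the distance between the same pair of matrices at several values of α. The sketch below assumes the formulation d_α(A, B) = (1/α)·d_BW(A^{2α}, B^{2α}), under which α = 1/2 reduces to a rescaled Bures–Wasserstein distance; this form and its normalization should be checked against the specific α-Procrustes family used in the work.

```python
import numpy as np
from scipy.linalg import sqrtm, fractional_matrix_power

def bw_dist(A, B):
    """Bures-Wasserstein distance between SPD matrices."""
    A_half = np.real(sqrtm(A))
    cross = np.real(sqrtm(A_half @ B @ A_half))
    d2 = np.trace(A) + np.trace(B) - 2.0 * np.trace(cross)
    return np.sqrt(max(d2, 0.0))

def alpha_procrustes_dist(A, B, alpha):
    """Assumed form: d_alpha(A, B) = (1/alpha) * d_BW(A^{2 alpha}, B^{2 alpha});
    at alpha = 1/2 this is twice the Bures-Wasserstein distance."""
    Aa = np.real(fractional_matrix_power(A, 2.0 * alpha))
    Ba = np.real(fractional_matrix_power(B, 2.0 * alpha))
    return bw_dist(Aa, Ba) / alpha

# The same pair of SPD matrices at different alpha values: the distance
# (and hence the embedding geometry) changes with the metric parameter.
A = np.array([[2.0, 0.3], [0.3, 1.0]])
B = np.array([[1.0, -0.2], [-0.2, 3.0]])
for alpha in (0.1, 0.25, 0.5, 1.0):
    print(alpha, alpha_procrustes_dist(A, B, alpha))
```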