[98th TrustML Young Scientist Seminar] Talk by Prof. Qiong Zhang (Renmin University of China) "Robust and Scalable Federated Learning of Finite Mixture Models: Addressing Label Switching and Byzantine Failures"
Date and Time: August 29, 2025, 10:00 - 11:00 (JST)
Venue: Online and Open Space at the RIKEN AIP Nihonbashi office
*Open Space is available to AIP researchers only
Title: Robust and Scalable Federated Learning of Finite Mixture Models: Addressing Label Switching and Byzantine Failures
Speaker: Prof. Qiong Zhang (Renmin University of China)
Abstract: The rise of large-scale and privacy-sensitive data has led to growing demand for federated learning methods that are both scalable and robust. In this talk, I will present a principled framework for federated learning of finite mixture models, a flexible class of models widely used in clustering, biology, finance, and beyond. Traditional approaches break down in this setting due to the well-known label switching problem. We first introduce a Mixture Reduction approach that resolves this challenge by aggregating local estimates via optimal transport, enabling consistent and statistically efficient estimation under the split-and-conquer paradigm. We then address a critical vulnerability of distributed systems, Byzantine failure, where a fraction of machines may transmit arbitrary, possibly adversarial local estimates. To tackle this, we propose Distance-Filtered Mixture Reduction (DFMR), which identifies and filters corrupted local estimates using pairwise L2 distances between densities. DFMR achieves optimal convergence rates under standard assumptions and is asymptotically equivalent to the oracle estimator in many practical scenarios. I will conclude with empirical results on both simulated and real-world data that demonstrate the accuracy, robustness, and scalability of the proposed methods.
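The distance-based filtering idea behind DFMR can be illustrated with a small toy sketch. This is not the speaker's implementation; all function names and the simple "keep the estimates with the smallest median pairwise distance" rule are illustrative assumptions. Each machine's local estimate is represented as a one-dimensional Gaussian mixture, and the L2 distance between two mixture densities is computed in closed form (the integral of a product of two Gaussian densities is itself a Gaussian density evaluated at the difference of means).

```python
import math
from itertools import combinations

def normal_pdf(x, mu, sigma):
    """Density of N(mu, sigma^2) evaluated at x."""
    z = (x - mu) / sigma
    return math.exp(-0.5 * z * z) / (sigma * math.sqrt(2.0 * math.pi))

def inner(f, g):
    """<f, g> = integral of f(x) g(x) dx for two Gaussian mixtures.
    A mixture is a list of (weight, mean, sd) triples; the product of two
    Gaussian densities integrates to N(m1; m2, s1^2 + s2^2)."""
    return sum(w1 * w2 * normal_pdf(m1, m2, math.sqrt(s1**2 + s2**2))
               for (w1, m1, s1) in f for (w2, m2, s2) in g)

def l2_distance(f, g):
    """L2 distance between two mixture densities."""
    return math.sqrt(max(inner(f, f) - 2.0 * inner(f, g) + inner(g, g), 0.0))

def filter_by_median_distance(estimates, keep_frac=0.5):
    """Keep the estimates whose median pairwise L2 distance to the others
    is smallest -- an illustrative stand-in for DFMR's filtering rule."""
    n = len(estimates)
    dist = [[0.0] * n for _ in range(n)]
    for i, j in combinations(range(n), 2):
        dist[i][j] = dist[j][i] = l2_distance(estimates[i], estimates[j])
    def median_dist(i):
        others = sorted(dist[i][j] for j in range(n) if j != i)
        return others[len(others) // 2]
    keep = sorted(range(n), key=median_dist)[: max(1, int(keep_frac * n))]
    return sorted(keep)

# Six honest machines produce near-identical two-component estimates;
# two Byzantine machines transmit arbitrary densities far from the truth.
honest = [[(0.5, 0.0 + 0.05 * k, 1.0), (0.5, 5.0 - 0.05 * k, 1.0)]
          for k in range(6)]
byzantine = [[(1.0, 50.0, 0.5)], [(1.0, -40.0, 0.2)]]
kept = filter_by_median_distance(honest + byzantine, keep_frac=0.75)
```

Because Byzantine estimates sit far from the honest cluster in L2 distance, their median pairwise distance is large and they are dropped, leaving only the honest indices in `kept`; the surviving estimates would then be aggregated by mixture reduction.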
Bio: Qiong Zhang is an assistant professor at the Institute of Statistics and Big Data, Renmin University of China. She received her Ph.D. in Statistics from the University of British Columbia in 2022 and her B.Sc. from the School of the Gifted Young at the University of Science and Technology of China in 2015. Her research focuses on federated and distributed learning, particularly under mixture models, and the application of advanced deep learning techniques.