[98th TrustML Young Scientist Seminar] Talk by Prof. Qiong Zhang (Renmin University of China) "Robust and Scalable Federated Learning of Finite Mixture Models: Addressing Label Switching and Byzantine Failures"

2025/08/29 (Fri), 10:00–11:00 (JST)
Participants: 3
Organizer: RIKEN AIP Public

Date and Time: August 29, 2025, 10:00 - 11:00 (JST)
Venue: Online and Open Space at the RIKEN AIP Nihonbashi office
*Open Space is available to AIP researchers only

Title: Robust and Scalable Federated Learning of Finite Mixture Models: Addressing Label Switching and Byzantine Failures

Speaker: Prof. Qiong Zhang (Renmin University of China)

Abstract: The rise of large-scale and privacy-sensitive data has led to growing demand for federated learning methods that are both scalable and robust. In this talk, I will present a principled framework for federated learning of finite mixture models, a class of flexible models widely used in clustering, biology, finance, and beyond. Traditional approaches break down in this setting due to the well-known label switching problem. We first introduce a Mixture Reduction approach that resolves this challenge by aggregating local estimates via optimal transport, enabling consistent and statistically efficient estimation under the split-and-conquer paradigm. We then address a critical vulnerability of distributed systems—Byzantine failure—where a fraction of machines may transmit arbitrary, possibly adversarial local estimates. To tackle this, we propose Distance-Filtered Mixture Reduction (DFMR), which identifies and filters corrupted local estimates using pairwise L2 distances between densities. DFMR achieves optimal convergence rates under standard assumptions and is asymptotically equivalent to the oracle estimator in many practical scenarios. I will conclude with empirical results on both simulated and real-world data that demonstrate the accuracy, robustness, and scalability of the proposed methods.
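The abstract sketches DFMR at a high level: each machine fits a local mixture model, pairwise L2 distances between the estimated densities are used to flag corrupted estimates, and only the surviving estimates are aggregated. A minimal illustrative sketch of the filtering step for one-dimensional Gaussian mixtures (where the L2 distance between densities has a closed form) might look like the following. The function names and the median-distance filtering rule here are assumptions for illustration, not the paper's exact procedure, and the mixture-reduction aggregation step via optimal transport is omitted.

```python
import numpy as np

def gauss_pdf(x, mean, var):
    """Density of N(mean, var) evaluated at x."""
    return np.exp(-(x - mean) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)

def cross_term(m1, m2):
    """Integral of f*g for two 1-D Gaussian mixtures, using the identity
    that the integral of N(x; m_i, v_i) * N(x; m_j, v_j) equals N(m_i; m_j, v_i + v_j)."""
    (w1, mu1, v1), (w2, mu2, v2) = m1, m2
    s = 0.0
    for wi, mi, vi in zip(w1, mu1, v1):
        for wj, mj, vj in zip(w2, mu2, v2):
            s += wi * wj * gauss_pdf(mi, mj, vi + vj)
    return s

def l2_sq(m1, m2):
    """Squared L2 distance: integral of (f - g)^2 = int f^2 - 2 int f*g + int g^2."""
    return cross_term(m1, m1) - 2.0 * cross_term(m1, m2) + cross_term(m2, m2)

def filter_local_estimates(estimates, keep_frac=0.5):
    """Keep the machines whose mixture density is L2-closest to the others
    (an illustrative stand-in for DFMR's distance-based filtering)."""
    K = len(estimates)
    D = np.zeros((K, K))
    for a in range(K):
        for b in range(a + 1, K):
            D[a, b] = D[b, a] = l2_sq(estimates[a], estimates[b])
    scores = np.median(D, axis=1)  # typical distance of each machine to the rest
    keep = np.argsort(scores)[: max(1, int(keep_frac * K))]
    return sorted(keep.tolist())
```

With, say, eight honest machines reporting two-component mixtures with means near (0, 5) and two Byzantine machines reporting arbitrary far-away means, the Byzantine rows of the distance matrix are uniformly large, so their median scores exceed those of honest machines and they are filtered out before aggregation.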

Bio: Qiong Zhang is an assistant professor at the Institute of Statistics and Big Data, Renmin University of China. She received her Ph.D. in Statistics from the University of British Columbia in 2022 and her B.Sc. from the School of the Gifted Young at the University of Science and Technology of China in 2015. Her research focuses on federated and distributed learning—particularly under mixture models—and the application of advanced deep learning techniques.
