[ABI Team Seminar] Jia-Jie Zhu: Computational Gradient Flows

2025/05/19 (Mon)
11:30–13:00 JST
Participants: 61

Organizer: RIKEN AIP Public

This talk will be held in a hybrid format: in person at the AIP Open Space of the RIKEN AIP Nihombashi office and online via Zoom. Note that the AIP Open Space is available only to AIP researchers.

DATE, TIME & LOCATION
Monday, May 19th, 2025, 11:30–13:00 (JST), RIKEN AIP Nihombashi Office, Open Space

TITLE
Computational Gradient Flows: the Analysis of Minimizing Relative Entropy for Inference, Sampling, and Optimization

BIO
Jia-Jie Zhu (https://jj-zhu.github.io/) is a machine learner, applied mathematician, and research group leader at the Weierstrass Institute, Berlin. Previously, he worked as a postdoctoral researcher in machine learning at the Max Planck Institute for Intelligent Systems, Tübingen, and received his Ph.D. training in optimization at the University of Florida, USA. He is interested in the intersection of machine learning, analysis, and optimization, working on topics such as (PDE) gradient flows of probability measures, optimal transport, and the robustness of learning and optimization algorithms.

ABSTRACT
Many problems in machine learning can be framed as optimization problems that minimize the Kullback–Leibler (KL) divergence between two probability measures. Such problems appear in sampling, variational inference, generative modeling, and reinforcement learning, among other areas. In this talk, I will focus on the computational aspects of KL minimization, building on recent advances in the mathematical foundations of optimal transport and PDE analysis, specifically the Hellinger-Kantorovich (a.k.a. Wasserstein-Fisher-Rao, WFR) gradient flows and the associated functional inequalities. I will then present concrete computational algorithms derived from WFR gradient flows, with applications to sampling and inference. The analysis also yields a deeper understanding of the connection between Fisher-Rao gradient flows and kernel methods for machine learning.
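For concreteness, here is a standard special case of the objects above (an illustrative sketch, not material from the talk). The Wasserstein gradient flow of the KL divergence

\mathrm{KL}(\rho \,\|\, \pi) = \int \rho(x)\,\log\frac{\rho(x)}{\pi(x)}\,dx, \qquad \partial_t \rho_t = \nabla \cdot \Big( \rho_t \, \nabla \log \frac{\rho_t}{\pi} \Big),

is the Fokker-Planck equation on the right, and its simplest particle discretization is the unadjusted Langevin algorithm (ULA), x_{k+1} = x_k + h\,\nabla \log \pi(x_k) + \sqrt{2h}\,\xi_k with \xi_k \sim \mathcal{N}(0, I). The Python sketch below runs this update against a Gaussian target; the target parameters, step size, and particle count are illustrative assumptions, and the Fisher-Rao (birth-death) component of the full WFR flow discussed in the talk is omitted.

import numpy as np

def grad_log_target(x, mean, cov_inv):
    # Score of a Gaussian target pi = N(mean, cov): grad log pi(x) = -cov_inv (x - mean).
    # cov_inv is symmetric, so row-wise this equals -cov_inv @ (x_i - mean).
    return -(x - mean) @ cov_inv

def ula_sample(n_particles=1000, n_steps=500, step=0.05, seed=0):
    # Unadjusted Langevin algorithm: Euler-Maruyama discretization of the
    # Langevin diffusion, i.e. the Wasserstein gradient flow of KL(rho || pi).
    rng = np.random.default_rng(seed)
    mean = np.array([1.0, -1.0])                # illustrative target mean
    cov = np.array([[1.0, 0.3], [0.3, 0.5]])    # illustrative target covariance
    cov_inv = np.linalg.inv(cov)
    x = rng.standard_normal((n_particles, 2))   # initial particles ~ N(0, I)
    for _ in range(n_steps):
        drift = grad_log_target(x, mean, cov_inv)
        noise = rng.standard_normal(x.shape)
        x = x + step * drift + np.sqrt(2.0 * step) * noise
    return x

if __name__ == "__main__":
    samples = ula_sample()
    print("empirical mean:", samples.mean(axis=0))  # approaches [1, -1] up to O(step) bias
    print("empirical cov:\n", np.cov(samples.T))    # approaches cov up to O(step) bias

As the particle count grows and the step size shrinks, the empirical distribution of the particles tracks the continuous-time flow, which converges to pi exponentially fast whenever pi satisfies a log-Sobolev inequality, one instance of the functional inequalities referenced in the abstract.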
