10–13 Jul 2024
Pao Yue-Kong Library
Asia/Shanghai timezone

Towards Graph Transformers at Scale

12 Jul 2024, 15:55
45m
Pao Yue-Kong Library

500
Invited Talk · Applications of Artificial Intelligence and Machine Learning

Speaker

Qitian Wu (Shanghai Jiao Tong University)

Description

Graphs are a popular mathematical abstraction for systems of relations and interactions, with applications across domains such as physics, biology, and the social sciences. Toward unleashing the power of machine learning models on graphs, one fundamental challenge is obtaining high-quality representations for graph-structured data of diverse scales and properties. This talk covers recent advances in building scalable Transformers as general-purpose encoders for graphs: NodeFormer, SGFormer, and DIFFormer. The first part introduces NodeFormer, which flexibly models all-pair interactions in linear complexity. The second part presents SGFormer, which further simplifies the model and scales to graphs with billions of nodes. The third part introduces DIFFormer, a Transformer derived from a principled diffusion process, which offers an interpretable way to understand the mechanism of scalable Transformers.
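The key complexity claim in the abstract — all-pair interactions in linear rather than quadratic time — is typically achieved by replacing softmax attention with a kernel feature map, so that the pairwise similarity matrix is never materialized. The sketch below illustrates this generic trick with the common `elu(x) + 1` feature map; it is an illustrative stand-in, not NodeFormer's actual kernelized operator (which additionally uses stochastic relational-bias sampling).

```python
import numpy as np

def linear_attention(Q, K, V, eps=1e-6):
    """All-pair attention in O(N * d^2) instead of O(N^2 * d).

    Uses a positive feature map phi so that
        softmax(Q K^T) V  ~  phi(Q) (phi(K)^T V) / normalizer,
    computed without ever forming the N x N attention matrix.
    """
    # phi(x) = elu(x) + 1 keeps entries positive (a common linear-attention choice).
    phi = lambda x: np.where(x > 0, x + 1.0, np.exp(x))
    Qp, Kp = phi(Q), phi(K)           # (N, d) each
    KV = Kp.T @ V                     # (d, d): shared across all queries
    Z = Qp @ Kp.sum(axis=0)           # (N,): per-query normalizers
    return (Qp @ KV) / (Z[:, None] + eps)

rng = np.random.default_rng(0)
N, d = 1000, 16                       # N nodes, d-dimensional features
Q, K, V = rng.normal(size=(3, N, d))
out = linear_attention(Q, K, V)
print(out.shape)                      # (1000, 16)
```

Because the `(d, d)` summary `KV` is reused by every query, doubling the number of nodes only doubles the cost, which is the property that lets such encoders scale to very large graphs.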

Primary author

Qitian Wu (Shanghai Jiao Tong University)

Presentation materials