Speaker
Description
Graphs are a popular mathematical abstraction for systems of relations and interactions, with applications in physics, biology, the social sciences, and beyond. A fundamental challenge in unleashing the power of machine learning on graphs is obtaining high-quality representations for graph-structured data of diverse scales and properties. We will discuss recent advances in building scalable Transformers as general-purpose encoders for graphs: NodeFormer, SGFormer, and DIFFormer. The first part will introduce NodeFormer, which flexibly models all-pair interactions within linear complexity. The second part will present SGFormer, which further simplifies the model and scales to billion-sized graphs. The third part will introduce DIFFormer, a Transformer derived from a principled diffusion process, which lends an interpretable way to understand the mechanism of scalable Transformers.
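To make the "all-pair interactions within linear complexity" claim concrete, here is a minimal sketch of the kernelized linear-attention trick that underlies scalable graph Transformers of this kind. The `elu(.)+1` feature map and the weight shapes are illustrative assumptions, not the exact kernels of NodeFormer or SGFormer (NodeFormer, for instance, additionally uses Gumbel-Softmax sampling); the point it shows is reordering `(QK^T)V` as `Q(K^T V)` so cost grows linearly in the number of nodes:

```python
import torch

def linear_attention(x, w_q, w_k, w_v, eps=1e-6):
    """All-pair attention over N nodes in O(N) time via a kernel feature map.

    x: (N, d) node features; w_q, w_k, w_v: (d, d) projection weights.
    Illustrative sketch only: elu(.)+1 stands in for the papers' kernels.
    """
    q = torch.nn.functional.elu(x @ w_q) + 1.0  # (N, d), positive features
    k = torch.nn.functional.elu(x @ w_k) + 1.0  # (N, d)
    v = x @ w_v                                 # (N, d)

    kv = k.T @ v                                # (d, d): key-value summary, O(N d^2)
    z = q @ k.sum(dim=0, keepdim=True).T        # (N, 1): per-node normalizer
    return (q @ kv) / (z + eps)                 # (N, d): every node attends to all nodes

# Usage: 10k nodes, never materializing the N x N attention matrix.
N, d = 10_000, 64
x = torch.randn(N, d)
w_q, w_k, w_v = (torch.randn(d, d) / d**0.5 for _ in range(3))
out = linear_attention(x, w_q, w_k, w_v)        # shape (N, 64)
```

Because the `(d, d)` summary `kv` is computed once and shared by all queries, memory and time scale with N rather than N², which is what makes scaling such encoders to very large graphs feasible.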