Liangze Jiang

PhD Student

EPFL 🇨🇭

🗿 Short Bio

Hi, my name is Liangze Jiang (🇨🇳: 姜良泽). I'm a first-year PhD student in computer science at the Swiss Federal Institute of Technology Lausanne (EPFL), where I also did my MSc and worked with Prof. Amir Zamir and Prof. Martin Jaggi. During my MSc, I spent half a year as a student researcher at Google Research (Zurich), working with Dr. Claudiu Musat and Dr. Pedro Gonnet. I received my Bachelor's degree from the University of Electronic Science and Technology of China (UESTC) in Chengdu 🐼.

I have had enjoyable research experiences in robustness to distribution shift, graph neural networks, out-of-distribution generalization, and CV & NLP topics. Currently, I am broadly interested in building deep models that are robust and generalize to the real world, and in understanding their underlying dynamics and mechanisms both empirically and theoretically. Recently, I have also become excited about applications of deep learning (e.g., GNNs) in biomedicine.


📃 Publications

(* denotes equal contribution)

Understanding and Manipulating Agreement between Neural Networks
    L. Jiang
    Master's Thesis 2023
    [Poster]

TF-GNN: Graph Neural Networks in TensorFlow
    O. Ferludin, A. Eigenwillig, and 25 others, including L. Jiang, B. Perozzi
    arXiv 2023
    [Paper] [Code]

Test-time Robust Personalization for Federated Learning
    L. Jiang*, T. Lin*
    ICLR 2023 - International Conference on Learning Representations
    [Paper] [Code]


🔬 Research Experiences

10.2022 - 05.2023, Master's Thesis & Project Student @ Visual Intelligence and Learning Lab, EPFL
    ➡️ On agreement between NNs & OOD generalization through multi-hypothesis
02.2022 - 08.2022, Research Intern (Student Researcher) @ Google Research (Zurich)
    ➡️ On efficient and accurate graph attention
09.2021 - 04.2022, Project Student @ Machine Learning and Optimization Lab, EPFL
    ➡️ On robust personalization for federated learning
06.2021 - 09.2021, NLP Engineer Intern @ Institute of Software, Chinese Academy of Sciences


๐Ÿ† Awards

  • EPFL EDIC PhD Fellowship (~$60k), 2023
  • Outstanding Graduate Award, 2020, Ranking: 6/244
  • Undergraduate Academic Excellence Scholarship, 2016-2020, Top 10%
  • China National Scholarship, 2019, Top 1%
  • Meritorious Winner in MCM (International Mathematical Contest in Modeling), 2019, Top 7%
  • Sekorm Scholarship, 2018, Top 3%
  • China National Scholarship, 2017, Top 1%

👨‍🎓 Education

09.2023 - Present, PhD Student, Computer Science, EPFL
09.2020 - 02.2023, MSc, Electrical and Electronic Engineering, EPFL
09.2016 - 06.2020, BEng, Information and Communication Engineering, UESTC


🔓 Open-Source Contribution


Last updated: 08.2023 (in progress)