About Me
I am a second-year Ph.D. student at Stanford CS, advised by Professors James Zou and Stefano Ermon. Before that, I received my B.S. in Mathematics and Computer Science at Yuanpei College, Peking University, advised by Professors Liwei Wang and Di He. I focus on advancing foundational generative AI through research in theory, algorithm design, and high-performance computing. If you’re interested in my work, please feel free to contact me!
News
- (Sept. 2024) Two papers were accepted to NeurIPS 2024: training-free guidance (Spotlight) and geometric trajectory models. See you in Vancouver!
- (Feb. 2024) Our Forward Laplacian paper was accepted by Nature Machine Intelligence! We have released LapJAX, a JAX-based package for accelerating the computation of general second-order operators (e.g., the Laplacian); a sketch of the quantity it targets follows below.
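For context, here is a minimal plain-JAX sketch of the quantity such a package accelerates: the Laplacian of a scalar-valued network output, computed naively as the trace of the Hessian. This is not the LapJAX API; the `mlp` and `laplacian` functions and the parameter shapes are illustrative placeholders.

```python
import jax
import jax.numpy as jnp


def mlp(params, x):
    """Toy scalar-valued network f(x); architecture is an illustrative placeholder."""
    w1, b1, w2, b2 = params
    h = jnp.tanh(x @ w1 + b1)
    return jnp.sum(h @ w2 + b2)


def laplacian(f, x):
    """Naive Laplacian: trace of the full Hessian, O(d^2) in the input dimension.
    Forward-Laplacian-style methods avoid materializing the full Hessian."""
    return jnp.trace(jax.hessian(f)(x))


key = jax.random.PRNGKey(0)
d, hidden = 3, 16
k1, k2 = jax.random.split(key)
params = (
    jax.random.normal(k1, (d, hidden)),
    jnp.zeros(hidden),
    jax.random.normal(k2, (hidden, 1)),
    jnp.zeros(1),
)
x = jnp.ones(d)
print(laplacian(lambda y: mlp(params, y), x))  # scalar Laplacian of f at x
```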
Selected Publications
- (NeurIPS 2024, Spotlight) TFG: Unified Training-Free Guidance for Diffusion Models
Haotian Ye*, Haowei Lin*, Jiaqi Han*, Minkai Xu, Sheng Liu, Yitao Liang, Jianzhu Ma, James Zou, Stefano Ermon
[Paper]
- (Nature Machine Intelligence) A computational framework for neural network-based variational Monte Carlo with Forward Laplacian
Ruichen Li*, Haotian Ye*, Du Jiang, Xuelan Wen, Chuwei Wang, Zhe Li, Xiang Li, Di He, Ji Chen, Weiluo Ren, Liwei Wang
[Paper]
- (NeurIPS 2023, Oral) Towards Revealing the Mystery behind Chain of Thought: A Theoretical Perspective
Guhao Feng*, Bohang Zhang*, Yuntian Gu*, Haotian Ye*, Di He, Liwei Wang
[Paper] [Video] [Slides]
- (ICML 2023, Oral) On the Power of Pre-training for Generalization in RL: Provable Benefits and Hardness
Haotian Ye*, Xiaoyu Chen*, Liwei Wang, Simon Shaolei Du
[Paper]
- (AISTATS 2023) Freeze then Train: Towards Provable Representation Learning under Spurious Correlations and Feature Noise
Haotian Ye, James Zou, Linjun Zhang
[Paper] [Code] [Video] [Slides]
- (ICLR 2023) Discovering Latent Knowledge in Language Models Without Supervision
Collin Burns*, Haotian Ye*, Dan Klein, Jacob Steinhardt
[Paper]
- (J. Chem. Phys., Aug 2023) DeePMD-kit v2: A software package for Deep Potential models
Jinzhe Zeng, Duo Zhang, …, Haotian Ye, …, Weinan E, Roberto Car, Linfeng Zhang, Han Wang
[Paper]
- (NeurIPS 2021) Towards a Theoretical Framework of Out-of-Distribution Generalization
Haotian Ye*, Chuanlong Xie, Tianle Cai, Ruichen Li, Zhenguo Li, Liwei Wang
Haotian Ye*, Chuanlong Xie, Tianle Cai, Ruichen Li, Zhenguo Li, Liwei Wang
[Paper] [Code] [Video] [Slides]
Selected Awards
- Weiming Scholar, Peking University (top 1%), 2023
- Person of the Year, Peking University (10 recipients per year), 2021
- May Fourth Scholarship (top 1%, Rank 1), 2021
- Leo Koguan Scholarship (top 1%), 2020
- National Scholarship (top 1%, Rank 2), 2019
- Merit Student Pacesetter (top 2%), 2019
- Chinese Mathematical Olympiad (First Prize, Rank 7 in China), 2017