About Me
I am a second-year Ph.D. student in Computer Science at Stanford, advised by Professors James Zou and Stefano Ermon. Before that, I received my B.S. in Mathematics and Computer Science from Yuanpei College, Peking University, advised by Professors Liwei Wang and Di He. I focus on advancing foundational generative AI through algorithm design and high-performance computing. If you are interested in my work, please feel free to contact me!
News
- (Apr. 2025) I am currently working part-time at NVIDIA Deep Imagination Research and will join as a full-time intern this summer in Santa Clara.
- (Apr. 2025) My Google internship work on LLM constrained decoding was accepted to AISTATS 2025, and ICV-Hallucination was accepted to ICLR 2025 (Spotlight). See you in Singapore and Phuket!
- (Sept. 2024) Two papers were accepted to NeurIPS 2024: training-free guidance (Spotlight) and geometric trajectory models. See you in Vancouver!
Selected Publications
- (AISTATS 2025) Efficient and Asymptotically Unbiased Constrained Decoding for Large Language Models
Haotian Ye, Himanshu Jain, Chong You, Ananda Theertha Suresh, Haowei Lin, James Zou, Felix Yu
[Paper]
- (NeurIPS 2024, Spotlight) TFG: Unified Training-Free Guidance for Diffusion Models
Haotian Ye*, Haowei Lin*, Jiaqi Han*, Minkai Xu, Sheng Liu, Yitao Liang, Jianzhu Ma, James Zou, Stefano Ermon
[Paper]
- (Nature Machine Intelligence) A computational framework for neural network-based variational Monte Carlo with Forward Laplacian
Ruichen Li*, Haotian Ye*, Du Jiang, Xuelan Wen, Chuwei Wang, Zhe Li, Xiang Li, Di He, Ji Chen, Weiluo Ren, Liwei Wang
[Paper]
- (NeurIPS 2023, Oral) Towards Revealing the Mystery behind Chain of Thought: a Theoretical Perspective
Guhao Feng*, Bohang Zhang*, Yuntian Gu*, Haotian Ye*, Di He, Liwei Wang
[Paper] [Video] [Slides]
- (ICML 2023, Oral) On the Power of Pre-training for Generalization in RL: Provable Benefits and Hardness
Haotian Ye*, Xiaoyu Chen*, Liwei Wang, Simon Shaolei Du
[Paper]
- (AISTATS 2023) Freeze then Train: Towards Provable Representation Learning under Spurious Correlations and Feature Noise
Haotian Ye, James Zou, Linjun Zhang
[Paper] [Code] [Video] [Slides]
- (ICLR 2023) Discovering Latent Knowledge in Language Models Without Supervision
Collin Burns*, Haotian Ye*, Dan Klein, Jacob Steinhardt
[Paper]
- (J. Chem. Phys., Aug. 2023) DeePMD-kit v2: A software package for Deep Potential models
Jinzhe Zeng, Duo Zhang, …, Haotian Ye, …, Weinan E, Roberto Car, Linfeng Zhang, Han Wang
[Paper]
- (NeurIPS 2021) Towards a Theoretical Framework of Out-of-Distribution Generalization
Haotian Ye*, Chuanlong Xie, Tianle Cai, Ruichen Li, Zhenguo Li, Liwei Wang
[Paper] [Code] [Video] [Slides]
Selected Awards
- Weiming Scholar of Peking University (top 1%), 2023
- Person of the Year of Peking University (10 people/year), 2021
- May Fourth Scholarship (top 1%, Rank 1), 2021
- Leo Koguan Scholarship (top 1%), 2020
- National Scholarship (top 1%, Rank 2), 2019
- Merit Student Pacesetter (top 2%), 2019
- Chinese Mathematical Olympiad (First Prize, Rank 7 in China), 2017