About Me

I am a first-year CS Ph.D. student at Stanford University, currently working with Professor James Zou. Before that, I received my B.S. from Peking University, where I was advised by Professors Liwei Wang and Di He.

I am interested in making machine learning tools more powerful for scientific problems through theory, algorithm design, and high-performance computing. If you are interested in my research, please feel free to contact me!

News

  • (Oct. 2023) We released LapJAX, a JAX-based package for efficiently computing second-order operators (e.g., the Laplacian). Using it, we accelerated the solution of many-body Schrödinger equations by an order of magnitude. Give it a try!
  • (Sept. 2023) Our chain-of-thought theory paper was accepted for an oral presentation at NeurIPS 2023. See you in New Orleans!
  • (Feb. 2023) I was selected as a "top reviewer" for AISTATS 2023.
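For context on what LapJAX speeds up: a minimal sketch of the baseline Laplacian computation in plain JAX (this is not the LapJAX API; `psi` is a toy stand-in for a network output):

```python
import jax
import jax.numpy as jnp

# Toy scalar function standing in for a wavefunction psi(x).
def psi(x):
    return jnp.sum(jnp.sin(x))

# Laplacian = trace of the Hessian. Straightforward, but it costs
# O(d^2) in the input dimension d -- the overhead that motivates
# specialized frameworks like LapJAX's forward Laplacian.
def laplacian(f):
    return lambda x: jnp.trace(jax.hessian(f)(x))

x = jnp.array([0.0, 1.0, 2.0])
print(laplacian(psi)(x))  # ≈ -1.7508, i.e. -(sin 0 + sin 1 + sin 2)
```

For `psi = sum(sin(x_i))` the Hessian is diagonal with entries `-sin(x_i)`, so the trace recovers the analytic Laplacian exactly.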

Publications

  • (Under Review) Forward Laplacian: A New Computational Framework for Neural Network-based Variational Monte Carlo
    Ruichen Li*, Haotian Ye*, Du Jiang, Xuelan Wen, Chuwei Wang, Zhe Li, Xiang Li, Di He, Ji Chen, Weiluo Ren, Liwei Wang
    [arXiv]
  • (NeurIPS 2023, Oral) Towards Revealing the Mystery behind Chain of Thought: a Theoretical Perspective
    Guhao Feng*, Bohang Zhang*, Yuntian Gu*, Haotian Ye*, Di He, Liwei Wang
    [arXiv] [Video] [Slides]
  • (ICML 2023, Oral) On the Power of Pre-training for Generalization in RL: Provable Benefits and Hardness
    Haotian Ye*, Xiaoyu Chen*, Liwei Wang, Simon Shaolei Du
    [arXiv]
  • (AISTATS 2023) Freeze then Train: Towards Provable Representation Learning under Spurious Correlations and Feature Noise
    Haotian Ye, James Zou, Linjun Zhang
    [arXiv] [Code] [Video] [Slides]
  • (ICLR 2023) Discovering Latent Knowledge in Language Models Without Supervision
    Collin Burns*, Haotian Ye*, Dan Klein, Jacob Steinhardt
    [arXiv]
  • (J. Chem. Phys. Aug 2023) DeePMD-kit v2: A software package for Deep Potential models
    Jinzhe Zeng, Duo Zhang, …, Haotian Ye, …, Weinan E, Roberto Car, Linfeng Zhang, Han Wang
    [Paper]
  • (NeurIPS 2021) Towards a Theoretical Framework of Out-of-Distribution Generalization
    Haotian Ye*, Chuanlong Xie, Tianle Cai, Ruichen Li, Zhenguo Li, Liwei Wang
    [arXiv] [Code] [Video] [Slides]
  • (Preprint) Risk Variance Penalization
    Chuanlong Xie, Haotian Ye, Fei Chen, Yue Liu, Rui Sun, Zhenguo Li
    [arXiv]
  • (Preprint) Out-of-Distribution Generalization Analysis via Influence Function
    Haotian Ye*, Chuanlong Xie, Yue Liu, Zhenguo Li
    [arXiv]

Experiences

  • Peking University (Key Laboratory of Machine Perception)
    Undergraduate research (advisor: Prof. Liwei Wang, Department of Machine Intelligence)
    Sept. 2019 – Present
  • Stanford University
    Remote summer research (advisors: Prof. James Zou, Biomedical Data Science; Prof. Linjun Zhang, Department of Statistics)
    June 2022 – Oct. 2022
  • University of California, Berkeley
    Invited visiting research (advisor: Prof. Jacob Steinhardt, Department of Statistics)
    Sponsored by the Department of Statistics (Berkeley) and Yuanpei College (Peking University)
    Aug. 2021 – Dec. 2021
  • University of Washington
    Remote research (advisor: Prof. Simon Shaolei Du, Paul G. Allen School of Computer Science & Engineering)
    June 2021 – Sept. 2022
  • DP Technology
    Intern (advisor: Linfeng Zhang, Co-founder & Chief Scientist of DP Technology)
    Feb. 2021 – May 2021
  • Huawei Noah’s Ark Lab
    Research intern (advisor: Zhenguo Li, Noah’s Ark Lab)
    July 2019 – Oct. 2019

Selected Awards

  • Weiming Scholar of Peking University (top 1%), 2023
  • Person of the Year of Peking University (10 people/year), 2021
  • May 4th Scholarship (top 1%, Rank 1), 2021
  • Leo Koguan Scholarship (top 1%), 2020
  • National Scholarship (top 1%, Rank 2), 2019
  • Merit Student Pacesetter (top 2%), 2019
  • Chinese Mathematical Olympiad (First Prize, Rank 7 nationally), 2017