Gefei (Frank) Gu 顾格非

Carnegie Mellon University


CMU LTI, Pittsburgh, PA

Email LinkedIn Twitter

I am a master’s student in the Language Technologies Institute at Carnegie Mellon University, working with Prof. Daniel Fried and Prof. Sean Welleck on efficient reasoning for language models and agents. I completed my undergraduate studies in Statistics at Zhejiang University. I have also worked as a research intern at Alibaba Group, focusing on large-scale reinforcement learning to enhance the agentic search capabilities of LLMs.

Previously, I worked with Prof. Arman Cohan and the YaleNLP group on LLM evaluation and information retrieval, and with Prof. Yang Yang on AI4Science.

Honors and Awards

  • Outstanding Student, Zhejiang University
  • First Prize Scholarship, Zhejiang University (Top 3%)

News

Aug 01, 2025 Joining CMU as a master’s student!
Jul 15, 2025 Serving as a reviewer for EMNLP Demo Track.
Jun 20, 2025 One paper (Product-Searcher) submitted to the EMNLP Industry Track!
May 10, 2025 One paper (Ref-Long) accepted to ACL Main!
Mar 01, 2025 Joining Alibaba Group as a Research Intern!

Selected publications

  1. Industry Track
    Product-Searcher: Real-World E-commerce Product DeepSearch via Reinforcement Learning
    Gefei Gu*, Wenhui Chen, Qiankun Shi, and 5 more authors
    2025
    Submitted to EMNLP Industry Track
  2. ACL 2025
    Ref-Long: Benchmarking the Long-context Referencing Capability of Long-context Language Models
    Junjie Wu*, Gefei Gu*, Yanan Zheng, and 2 more authors
    arXiv preprint arXiv:2507.09506, 2025
  3. EMNLP 2024
    TAIL: A Toolkit for Automatic and Realistic Long-Context Large Language Model Evaluation
    Gefei Gu, Yilun Zhao, Ruoxi Ning, and 2 more authors
    EMNLP 2024 (Demo Track)
  4. Preprint
    BrainWave: A Brain Signal Foundation Model for Clinical Applications
    Zhizhang Yuan, Fanqi Shen, Meng Li, and 5 more authors
    arXiv preprint arXiv:2402.10251, 2024