Jiaqi Leng (冷家祺)
I am an incoming PhD student at NYU and am currently completing my undergraduate degree in Computer Science and Technology at Fudan University.
Under the supervision of Prof. Yucheng Lu, my current research focuses on efficient language modeling, with particular emphasis on byte-level architectures. I previously worked as a research intern at Ant Group, where I studied sparse attention mechanisms for large language models.
My research interests include:
- Efficient deep learning and model architectures
- Long-context modeling and length extrapolation
- Sparse attention mechanisms
In Fall 2024, I was an exchange student at The University of Texas at Austin.
I welcome opportunities for academic collaboration and discussion.
News
| Date | News |
|---|---|
| Mar 2026 | I will be attending ICLR and presenting our work on length-generalizable sparse attention. See you in Brazil! |
| Mar 2026 | I will join NYU as a PhD student in Fall 2026. |