Jiaqi Leng (冷家祺)
I am a final-year undergraduate student in Computer Science and Technology at Fudan University.
I am currently working with Prof. Yucheng Lu at NYU Shanghai on efficient language modeling, with a focus on byte-level architectures. Previously, I was a research intern at Ant Group, where I worked on sparse attention mechanisms for large language models.
My research interests mainly lie in:
- Efficient deep learning and model architectures
- Long-context modeling and length extrapolation
- Sparse attention mechanisms
During Fall 2024, I was an exchange student at The University of Texas at Austin.
Please feel free to reach out! 👋
News
| Date | News |
|---|---|
| Mar 2026 | I will be attending ICLR and presenting our work on length-generalizable sparse attention. See you in Brazil! |
| Mar 2026 | I will join NYU in Fall 2026. |