Jingwei Zuo 左敬威

Pronounced /dʒɪŋweɪ dzʊɔ/ (jing-way zwaw)

Principal Researcher · Technology Innovation Institute (TII)

I lead the Falcon Foundational Models team at TII, Abu Dhabi. My team and I drive the development of the Falcon family of large language models — including Falcon-H1, Falcon-Edge, Falcon 3, and Falcon-Mamba. Our work spans the full LLM lifecycle: novel architecture design, data curation, large-scale training, and deployment.

I received my Ph.D. (2022) from Université Paris-Saclay, for which I was awarded the Plateau de Saclay Doctoral Prize; an M.Sc. in Data Science (2018) from Université Paris-Saclay; an M.Sc. in Computer Science (2017) from Sorbonne Université; and a B.Sc. in Electrical Engineering (2015) from Huazhong University of Science and Technology (HUST).

News

Jan 2026: Falcon-H1-Tiny released (90M to 0.6B parameters)
Jan 2026: Learnable Multipliers paper released
Jul 2025: Falcon-H1 technical report released
Jun 2025: E2LM competition accepted at NeurIPS'25
May 2025: Falcon-H1 released

Selected Publications

2026
M. Velikanov, I. Chahed, J. Zuo, D. E. Rhaiem, Y. Belkada, H. Hacid.
“Learnable Multipliers: Freeing the Scale of Language Model Matrix Layers”
arXiv, 2026.
2025
J. Zuo, M. Velikanov, I. Chahed, Y. Belkada, D. E. Rhaiem, G. Kunsch, et al.
“Falcon-H1: A Family of Hybrid-Head Language Models Redefining Efficiency and Performance”
arXiv, 2025.
2024
J. Zuo, M. Velikanov, D. E. Rhaiem, I. Chahed, Y. Belkada, G. Kunsch, H. Hacid.
“Falcon Mamba: The First Competitive Attention-free 7B Language Model”
arXiv, 2024.

Background

Research Interests

My current research centers on making large language models simultaneously more capable and more efficient. On the architecture side, I explore hybrid designs that combine Transformer attention with State Space Models, achieving better quality–throughput trade-offs than either paradigm alone. On the data side, I focus on principled data curation, synthetic data generation, and (anti-) curriculum strategies that improve training efficiency at every scale — from 90M to 34B parameters. This dual focus on architecture and data is the thread connecting Falcon-H1, Falcon-Mamba, and our ongoing work.
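To make the attention-plus-SSM idea concrete, below is a minimal PyTorch sketch (illustrative only, not the Falcon-H1 implementation): it runs a standard multi-head attention branch and a heavily simplified diagonal state-space recurrence over the same normalized input and sums the two outputs inside one residual block. The layer sizes, the parallel combination, and the ToySSM recurrence are assumptions chosen for brevity.

```python
# Illustrative toy hybrid block: attention branch + simplified diagonal SSM
# branch applied in parallel, outputs summed. Not the Falcon-H1 architecture.
import torch
import torch.nn as nn


class ToySSM(nn.Module):
    """Simplified diagonal SSM: h_t = a * h_{t-1} + b * x_t, y_t = c * h_t."""

    def __init__(self, d_model: int):
        super().__init__()
        self.log_a = nn.Parameter(torch.zeros(d_model))  # per-channel decay
        self.b = nn.Parameter(torch.ones(d_model))
        self.c = nn.Parameter(torch.ones(d_model))

    def forward(self, x: torch.Tensor) -> torch.Tensor:  # x: (B, T, D)
        a = torch.sigmoid(self.log_a)       # keep the recurrence stable in (0, 1)
        h = torch.zeros_like(x[:, 0])
        outputs = []
        for t in range(x.size(1)):          # sequential scan over time steps
            h = a * h + self.b * x[:, t]
            outputs.append(self.c * h)
        return torch.stack(outputs, dim=1)


class HybridBlock(nn.Module):
    """Attention and SSM branches over the same input, outputs summed."""

    def __init__(self, d_model: int = 64, n_heads: int = 4):
        super().__init__()
        self.norm = nn.LayerNorm(d_model)
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.ssm = ToySSM(d_model)
        self.proj = nn.Linear(d_model, d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h = self.norm(x)
        attn_out, _ = self.attn(h, h, h, need_weights=False)
        ssm_out = self.ssm(h)
        return x + self.proj(attn_out + ssm_out)  # residual connection


if __name__ == "__main__":
    block = HybridBlock()
    tokens = torch.randn(2, 16, 64)   # (batch, sequence length, d_model)
    print(block(tokens).shape)        # torch.Size([2, 16, 64])
```

The sketch only shows the structural idea: the attention branch gives content-based mixing across positions, while the recurrent branch carries a compressed running state, and a real hybrid model would replace the toy scan with an efficient selective SSM kernel.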

Education
Ph.D. in Machine Learning, Université Paris-Saclay, 2022 (Plateau de Saclay Doctoral Prize)
M.Sc. in Data Science, Université Paris-Saclay, 2018
M.Sc. in Computer Science, Sorbonne Université, 2017
B.Sc. in Electrical Engineering, Huazhong University of Science & Technology (HUST), 2015

Contact

Address
PO Box 9639, Masdar City, Abu Dhabi, UAE