Jingwei Zuo 左敬威
Pronounced /dʒɪŋweɪ dzʊɔ/ — jing-way zwaw
Principal Researcher · Technology Innovation Institute (TII)
I am a Principal Researcher at the Technology Innovation Institute (TII), Abu Dhabi, where I lead the Falcon foundation models team. My work focuses on making large language models simultaneously more capable and more efficient — through novel architecture design, principled training strategies, and large-scale pretraining infrastructure. Recent model releases include Falcon-H1, Falcon-Edge, Falcon 3, and Falcon-Mamba.
I received my PhD (2022) from Université Paris-Saclay, where my thesis was awarded the Plateau de Saclay Doctoral Prize; an M.Sc. in Data Science (2018) from Paris-Saclay; an M.Sc. in Computer Science (2017) from Sorbonne Université; and a B.Sc. in Electrical Engineering from HUST.
News
Selected Publications
Background
My current research centers on improving both the capability and the efficiency of large language models. On the architecture side, I explore hybrid designs that combine Transformer attention with State Space Models, achieving better quality–throughput trade-offs than either paradigm alone. On the data side, I focus on principled data curation, synthetic data generation, and (anti-)curriculum strategies that improve training efficiency at every scale — from 90M to 34B parameters. This dual focus on architecture and data is the thread connecting Falcon-H1, Falcon-Mamba, and our ongoing work.