Jingwei Zuo 左敬威
Pronounced /dʒɪŋweɪ dzʊɔ/ — jing-way zwaw
Principal Researcher · Technology Innovation Institute (TII)
I am a Principal Researcher at the Technology Innovation Institute (TII), Abu Dhabi, where I lead the Falcon foundation models team. My research focuses on making large language models simultaneously more capable and more efficient, with a particular emphasis on principled data curation, large-scale pretraining, and novel architecture design. Recent model releases include Falcon-H1, Falcon-Edge, Falcon 3, and Falcon-Mamba.
I received my PhD (2022) from Université Paris-Saclay, for which I was awarded the Plateau de Saclay Doctoral Prize; an M.Sc. in Data Science (2018) from Université Paris-Saclay; an M.Sc. in Computer Science (2017) from Sorbonne Université; and a B.Sc. in Electrical Engineering from HUST.
News
Selected Publications
Background
My current research centers on making large language models simultaneously more capable and more efficient. On the data side, I focus on large-scale pretraining data preparation: principled data curation, synthetic data generation, data mixture strategies, and (anti-)curriculum schedules that improve training efficiency at every scale, from 90M to 34B parameters. On the architecture side, I explore hybrid designs that combine Transformer attention with State Space Models, achieving better quality–throughput trade-offs than either paradigm alone. This dual focus on data and architecture is the thread connecting Falcon-H1, Falcon-Mamba, and our ongoing work.