Jingwei Zuo 左敬威
Pronounced /dʒɪŋweɪ dzʊɔ/ — jing-way zwaw
Principal Researcher · Technology Innovation Institute (TII)
I lead the Falcon Foundational Models team at TII, Abu Dhabi. My team and I drive the development of the Falcon family of large language models — including Falcon-H1, Falcon-Edge, Falcon 3, and Falcon-Mamba. Our work spans the full LLM lifecycle: novel architecture design, data curation, large-scale training, and deployment.
I received my PhD (2022) from Université Paris-Saclay, for which I was awarded the Plateau de Saclay Doctoral Prize. I also hold an M.Sc. in Data Science (2018) from Paris-Saclay, an M.Sc. in Computer Science (2017) from Sorbonne Université, and a B.Sc. in Electrical Engineering from HUST.
News
Selected Publications
Background
My current research centers on making large language models simultaneously more capable and more efficient. On the architecture side, I explore hybrid designs that combine Transformer attention with State Space Models, achieving better quality–throughput trade-offs than either paradigm alone. On the data side, I focus on principled data curation, synthetic data generation, and (anti-)curriculum strategies that improve training efficiency at every scale, from 90M to 34B parameters. This dual focus on architecture and data is the thread connecting Falcon-H1, Falcon-Mamba, and our ongoing work.
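To make the hybrid idea concrete, here is a minimal toy sketch of a block that runs a self-attention path and a state-space (linear recurrence) path in parallel and sums them. This is purely illustrative under simplifying assumptions (single head, no learned projections, a scalar decay for the SSM); it is not the Falcon-H1 architecture, and all function names here are hypothetical.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attn(x):
    # Toy single-head self-attention without learned Q/K/V projections.
    # x: (seq_len, d_model)
    scores = x @ x.T / np.sqrt(x.shape[-1])
    return softmax(scores) @ x

def ssm_scan(x, a=0.9):
    # Toy diagonal SSM: linear recurrence h_t = a * h_{t-1} + x_t,
    # computed sequentially (real SSMs use parallel scans and learned A, B, C).
    h = np.zeros_like(x[0])
    out = []
    for xt in x:
        h = a * h + xt
        out.append(h)
    return np.stack(out)

def hybrid_block(x):
    # Parallel hybrid: attention output and SSM output are simply summed.
    return attn(x) + ssm_scan(x)

rng = np.random.default_rng(0)
x = rng.standard_normal((8, 16))   # 8 tokens, 16-dim embeddings
y = hybrid_block(x)                # shape preserved: (8, 16)
```

The intuition the sketch captures: the attention path gives precise token-to-token retrieval at quadratic cost in sequence length, while the SSM path carries a compressed running state at linear cost, so combining them can trade quality against throughput.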