My research blends linguistics, artificial intelligence, and complex systems science to understand how humans create and share meaning through language and narrative.
Languages are living ecosystems that emerge through cultural interaction, enabling us to construct shared worlds. I examine how they function across scales — from local conversations to population dynamics — and build computational models and agent-based simulations to uncover the principles behind their evolution.
Constructions are the fundamental units of language — patterns where form and meaning intertwine. I explore how these constructions interact to create larger expressions that let us cooperate, communicate, and build common ground.
Learn more
Languages are complex adaptive systems shaped by interaction and cultural evolution. I investigate how local interactions scale up to population-level conventions, and how these conventions feed back to shape individual linguistic behavior.
Read more
To study languages as living systems, we must act like gardeners — cultivating environments where linguistic structures can grow and evolve. I build computational environments that allow these dynamics to be modelled.
Read more
For much of modern linguistics, form and meaning have lived in separate worlds: one of syntactic rules, the other of semantic interpretation, touching only at the interfaces. Syntax first generates well-formed structures, which are then handed off to a separate system for interpretation.
But what if grammar isn't an autonomous system at all, but already steeped in meaning—down to its smallest patterns? This was Charles Fillmore's dangerous idea: that all of language can be described as mappings between form and function called constructions.
Dangerous not because it is rejected by scientists who hold rival views, but because it meets resistance even from the scholars it inspires—including some of Fillmore's closest collaborators.
In my work, I explore what makes Fillmore's constructional idea so powerful, and how it could forever change the study of language—if only we are willing to take it to its logical conclusion. Because constructions aren't just patterns: they come with conditions for use and interpretation. Like local experts, they autonomously "know" when a pattern applies, how it functions in context, and how it connects to the rest of the system.
Once we take that seriously, nothing about language study looks quite the same.
The constructional approach isn't just another way of doing linguistics—it changes how we understand meaning-making itself. Every grammatical pattern, from the simplest word to complex clausal structures, is a tool for framing experience. When we say "the cat chased the mouse," we're not just reporting facts—we're directing attention, establishing perspective, creating a narrative.
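To make the idea concrete, here is a minimal sketch of a construction as a form–meaning pairing that "knows" when it applies. All names (`Construction`, the toy lexicon, the `transitive` pattern) are illustrative assumptions, not the feature-structure formalism that Fluid Construction Grammar actually uses.

```python
from dataclasses import dataclass

@dataclass
class Construction:
    """A hypothetical form-meaning pairing with its own condition of use."""
    name: str
    form: list     # sequence of category or word slots, e.g. ["DET", "N", "V", ...]
    meaning: str   # the schematic frame the construction evokes

    def matches(self, tokens, lexicon):
        """Check whether a token sequence instantiates this construction's form."""
        if len(tokens) != len(self.form):
            return False
        return all(slot == tok or slot in lexicon.get(tok, ())
                   for slot, tok in zip(self.form, tokens))

# Toy lexicon mapping words to their categories (illustrative only).
lexicon = {"the": ("DET",), "cat": ("N",), "mouse": ("N",), "chased": ("V",)}

transitive = Construction(
    name="transitive",
    form=["DET", "N", "V", "DET", "N"],
    meaning="X acts on Y (chase frame: pursuer, pursued)",
)

print(transitive.matches("the cat chased the mouse".split(), lexicon))
```

The point of the sketch is that the pattern carries its meaning and its applicability condition together, rather than handing form over to a separate interpretive module.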
This matters profoundly for artificial intelligence. Current language models excel at pattern recognition but struggle with meaning construction. To build AI systems that truly understand and produce language, we need to model not just what forms occur, but what those forms do—how they frame, emphasize, and invite interpretation.
Constructions and Frames, Vol. 16(2), 2024, pp. 311-345
This invited article, part of a special issue on the future of Construction Grammar, argues that linguistics is evolving from the traditional view of languages as static, idealized entities (the “aggregate” perspective) to the view of language as a complex adaptive system (the “population” perspective). It shows why constructions are crucial for making this new kind of linguistics possible.
Self-Organization, Linguistic Selection, and Shared Worlds
Language is both structured and fluid, conventional yet variable, stable but always evolving. Explaining this dynamic equilibrium between stability and flexibility remains one of the great puzzles of the cognitive and behavioural sciences; and its resolution has deep consequences for how we build intelligent technologies.
A common assumption in both linguistics and language engineering is that the language of the community simply mirrors the language of the individual mind. In reality, every language user brings their own attentional habits, memory traces, processing constraints, and interactional histories to each communicative encounter. By glossing over these partially overlapping but never identical trajectories of language use, we reduce language to an "aggregate" object — a static system that conveniently ignores local variation.
My work explores language as a living ecology of form and meaning instead — an evolving system whose conventions emerge, spread and adapt through local interactions.
Languages are not centrally designed systems; their structure arises through self-organization. Patterns emerge spontaneously as the side-effect of how language users repeatedly coordinate on forms and meanings in countless local interactions. These micro-level adjustments accumulate, giving rise to stable conventions without any central authority overseeing the process.
Not all linguistic variants thrive equally. Some are easier to process, more memorable, or better suited to recurrent communicative needs. Through linguistic selection — a process similar to natural selection but operating at the cultural level — conventions that offer these advantages are more likely to be adopted and propagated, while less efficient ones fade away. Over time, this differential uptake shapes the population's linguistic repertoire.
The result of these intertwined processes is the emergence of shared linguistic worlds: dynamic but stable systems of conventions that allow communities to coordinate meaning. These shared worlds are neither imposed from above nor reducible to any one language user's knowledge — they are collectively forged and continuously negotiated.
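The dynamic described above can be illustrated with a classic naming-game simulation: agents repeatedly pair up to name an object, failed games spread words, and successful games prune competitors. The setup below is a minimal sketch under assumed parameters, not the Babel Toolkit or any specific published experiment.

```python
import random

random.seed(1)
N_AGENTS, N_GAMES = 20, 4000
agents = [set() for _ in range(N_AGENTS)]  # each agent's inventory of names

for _ in range(N_GAMES):
    speaker, hearer = random.sample(range(N_AGENTS), 2)
    if not agents[speaker]:
        # Invention: a speaker with no name coins a new one.
        agents[speaker].add(f"w{random.randrange(10**6)}")
    word = random.choice(sorted(agents[speaker]))
    if word in agents[hearer]:
        # Success: both agents align on the word, discarding competitors
        # (this differential uptake is the selection step).
        agents[speaker] = {word}
        agents[hearer] = {word}
    else:
        # Failure: the hearer adopts the speaker's word.
        agents[hearer].add(word)

# After enough interactions the population typically converges on a
# single shared convention, with no central authority involved.
conventions = set().union(*agents)
print(len(conventions))
```

Nobody designs the winning word; a shared convention emerges from local coordination and selection alone, which is the self-organizing dynamic the text describes.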
Seeing language as a self-organizing population system reframes how we study both communication and cognition. It shows that linguistic structure is not simply learned or designed, but emerges from countless acts of coordination across time. This perspective bridges cognitive science and AI: rather than training models to imitate linguistic output, we can design systems that participate in the same adaptive processes—learning, varying, and stabilizing meanings through interaction. Understanding how languages evolve may thus hold the key to creating adaptive technologies that evolve with us.
Language Science Press, Berlin. 2016
Open-access book that explores the emergence of argument structure constructions and case marking through agent-based experiments in cultural language evolution.
Language exists in the wild. It emerges from the interactions of language users negotiating shared worlds—unplanned, unfinished, radically adaptive. But to understand that system scientifically, we need spaces where ideas can grow under conditions we can observe and share.
A formalism doesn't replace the wild. It creates a cultivated ecosystem: a kind of community garden where constructional dynamics can be planted, tested, pruned, and evolved.
Fluid Construction Grammar (FCG) is one such ecosystem. Developed over two decades of collaborative research, FCG offers a computational platform for exploring the constructional idea. As one of its core developers, I have made my share of contributions, but it has always been shared soil: tended by many, and open to new growth. By making our technologies open-source, we are inviting you into the garden.
The source code of FCG is embedded in the Babel Toolkit, a multi-agent framework for studying language evolution. Together, they enable researchers to model both individual linguistic processing and population-level dynamics, bridging the gap between cognitive linguistics and complex systems science.
Here's a short video explaining why FCG matters for deep modeling. Language is extraordinarily complex — not just in form, but in what we express and understand through it. Most technologies skim the surface; this video explains why we need formal tools like FCG if we want to truly engage with the richness of meaning. The video was recorded before the explosion of large language models, but its core message remains: if we care about interpretation, transparency, and cognitive insight, we need to dig deeper than observable form.
PLOS ONE, Vol. 17(6), 2022, e0269708
Co-authored with Katrien Beuls and Paul Van Eecke. Introduces the FCG Editor, providing researchers with an interactive environment for developing and testing construction grammars computationally.
FCG is free, open-source, and available for download. Whether you're a linguist interested in computational modeling or an AI researcher exploring neurosymbolic approaches, FCG provides a robust platform for your work.
"Your stylistics is better than your linguistics." — A professor at the University of Antwerp
He meant it as a compliment, but it felt like a warning that I should major in literature instead. In hindsight, it's probably the most accurate review I've ever received. I was more interested in how language evokes meaning and emotion than in dissecting it into tree diagrams.
And yet, a year later I found myself in computational linguistics. I loved literature, but I had always wanted to do computer science as well. One seduced me with metaphors and narratives, the other with systems, structure, and the thrill of building things.
The turning point: In Walter Daelemans' AI class, I encountered Luc Steels' Talking Heads experiment. His robotic agents weren't just parsing text — they were truly communicating about the world. They invented words, negotiated meaning, self-organized their community language. It felt alive.
That moment opened a new vision of language to me. Not as a formal system of rules, but as a living ecology of form and meaning. More than twenty years later, I'm still exploring that ecology at the Sony Computer Science Laboratories in Paris.
Selected works on the constructional idea, language evolution, and computational modelling.
Constructions, Vol. 17, 2025
An exploration of how Charles Fillmore's constructional idea fits into historical linguistic thought.
Constructions and Frames, Vol. 16(2), 2024, pp. 311-345
This invited article, part of a special issue on the future of Construction Grammar, argues that linguistics is evolving from the traditional view of languages as static, idealized entities (the “aggregate” perspective) to the view of language as a complex adaptive system (the “population” perspective). It shows why constructions are crucial for making this new kind of linguistics possible.
PLOS ONE, Vol. 17(6), 2022, e0269708
Co-authored with Katrien Beuls and Paul Van Eecke. Introduces the FCG Editor, providing researchers with an interactive environment for developing and testing construction grammars computationally.
Language Dynamics and Change, Vol. 3(1), 2013, pp. 105-132
Article that illustrates linguistic selection in action through a case study on German agreement.
Advances in Complex Systems, Vol. 15(3-4), 2012, 1250039
Co-authored with Luc Steels. Explores how systematicity can be maintained in an emergent compositional language.
I welcome thoughtful collaborations, questions, or reflections on language, meaning, and artificial intelligence.