Humans Learn Language from Situated Communicative Interactions. What about Machines?

Katrien Beuls, Paul Van Eecke

Research output: Contribution to journal › Article › peer-review


Abstract

Humans acquire their native languages by taking part in communicative interactions with their caregivers. These interactions are meaningful, intentional, and situated in their everyday environment. The situated and communicative nature of the interactions is essential to the language acquisition process, as language learners depend on clues provided by the communicative environment to make sense of the utterances they perceive. As such, the linguistic knowledge they build up is rooted in linguistic forms, their meaning, and their communicative function. When it comes to machines, the situated, communicative, and interactional aspects of language learning are often passed over. This applies in particular to today’s large language models (LLMs), where the input is predominantly text-based, and where the distribution of character groups or words serves as a basis for modeling the meaning of linguistic expressions. In this article, we argue that this design choice lies at the root of a number of important limitations, in particular regarding the data hungriness of the models, their limited ability to perform human-like logical and pragmatic reasoning, and their susceptibility to biases. At the same time, we make a case for an alternative approach that models how artificial agents can acquire linguistic structures by participating in situated communicative interactions. Through a selection of experiments, we show how the linguistic knowledge that is captured in the resulting models is of a fundamentally different nature than the knowledge captured by LLMs and argue that this change of perspective provides a promising path towards more human-like language processing in machines.
Original language: English
Pages (from-to): 1277–1311
Number of pages: 35
Journal: Computational Linguistics
Volume: 50
Issue number: 4
Early online date: Aug 2024
Publication status: Published - Dec 2024

Funding

We would like to thank Jérôme Botoko Ekila and Jens Nevens for their assistance in creating the graphs shown in and . We are grateful to Marie-Catherine de Marneffe and Remi van Trijp for their role in the discussions that led up to the writing of this article, and to Lara Verheyen, Remi van Trijp, and three anonymous reviewers for their insightful feedback on earlier versions of the manuscript. The research reported on in this article was funded by the European Union's Horizon 2020 research and innovation programme under grant agreement no. 951846, the Flemish Government under the Onderzoeksprogramma Artificiële Intelligentie (AI) Vlaanderen programme, the AI Flagship project ARIAC by DigitalWallonia4.ai, and the F.R.S.-FNRS-FWO WEAVE project HERMES I under grant numbers T002724F (F.R.S.-FNRS) and G0AGU24N (FWO).

Funders (funder number):
Vlaamse regering
Fonds Wetenschappelijk Onderzoek
Onderzoeksprogramma Artificiële Intelligentie
Horizon 2020 Framework Programme (951846)
Fonds De La Recherche Scientifique - FNRS (T002724F, G0AGU24N)

Keywords

• computational linguistics
• construction grammar
• LLMs
