Michaud, Jérôme, Dr (ORCID iD: orcid.org/0000-0001-6194-1355)
Publications (4 of 4)
Enquist, M., Jansson, F., Ghirlanda, S. & Michaud, J. (2024). Cultural traits operating in senders are driving forces of cultural evolution. Proceedings of the Royal Society of London. Biological Sciences, 291(2018), Article ID 20232110.
Cultural traits operating in senders are driving forces of cultural evolution
2024 (English). In: Proceedings of the Royal Society of London. Biological Sciences, ISSN 0962-8452, E-ISSN 1471-2954, Vol. 291, no. 2018, article id 20232110. Article in journal (Refereed). Published.
Abstract [en]

We introduce a mathematical model of cultural evolution to study cultural traits that shape how individuals exchange information. Current theory focuses on traits that influence the reception of information (receiver traits), such as evaluating whether information represents the majority or stems from a trusted source. Our model shifts the focus from the receiver to the sender of cultural information and emphasizes the role of sender traits, such as communicability or persuasiveness. Here, we show that sender traits are probably a stronger driving force in cultural evolution than receiver traits. While receiver traits evolve to curb cultural transmission, sender traits can amplify it and fuel the self-organization of systems of mutually supporting cultural traits, including traits that cannot be maintained on their own. Such systems can reach arbitrary complexity, potentially explaining uniquely human practical and mental skills, goals, knowledge and creativity, independent of innate factors. Our model incorporates social and individual learning throughout the lifespan, thus connecting cultural evolutionary theory with developmental psychology. This approach provides fresh insights into the trait-individual duality, that is, how cultural transmission of single traits is influenced by individuals, who are each represented as an acquired system of cultural traits.
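The sender/receiver contrast can be illustrated with a toy frequency-dynamics sketch. This is an illustration only, not the paper's actual model: the functions, parameter names, and rates below are invented for the example.

```python
# Toy sketch only, NOT the published model. A single cultural trait with
# frequency x in [0, 1]: a sender-side factor s amplifies how often carriers
# transmit it, a receiver-side factor r filters what is accepted, and
# carriers lose the trait at a constant rate.
def step(x, s, r, loss=0.3):
    """One generation of the trait's frequency dynamics."""
    gain = s * r * x * (1.0 - x)   # carriers transmit to non-carriers
    x_next = x + gain - loss * x   # some carriers forget the trait
    return max(0.0, min(1.0, x_next))

def equilibrium(s, r, x0=0.1, iters=2000):
    """Iterate the map until (approximate) equilibrium."""
    x = x0
    for _ in range(iters):
        x = step(x, s, r)
    return x

# With a strong sender trait (effective transmission rate s*r above the
# loss rate) the cultural trait persists despite receiver-side filtering;
# with a weak sender trait it dies out.
persists = equilibrium(s=1.5, r=0.5)   # s*r = 0.75 > loss
dies_out = equilibrium(s=0.5, r=0.5)   # s*r = 0.25 < loss
```

In this caricature, raising the sender factor s can sustain a trait that receiver-side filtering alone would eliminate, which is the qualitative contrast the abstract draws.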

Keywords
cultural evolution, cultural transmission, cumulative culture, dynamical systems, trait-individual duality, developmental psychology
National Category
Peace and Conflict Studies Other Social Sciences not elsewhere specified Psychology (excluding Applied Psychology) Evolutionary Biology
Research subject
Psychology
Identifiers
urn:nbn:se:su:diva-227521 (URN); 10.1098/rspb.2023.2110 (DOI); 001183512400006 (); 38471552 (PubMedID); 2-s2.0-85187799771 (Scopus ID)
Funder
Marianne and Marcus Wallenberg Foundation, 2021.0039
Available from: 2024-03-18. Created: 2024-03-18. Last updated: 2025-02-20. Bibliographically approved.
Jon-And, A. & Michaud, J. (2024). Emergent grammar from a minimal cognitive architecture. In: Nölle, J.; Raviv, L.; Graham, K. E.; Hartmann, S.; Jadoul, Y.; Josserand, M.; Matzinger, T.; Mudd, K.; Pleyer, M.; Slonimska, A.; Wacewicz, S.; Watson, S. (Ed.), The Evolution of Language: Proceedings of the 15th International Conference (Evolang XV). Paper presented at 15th International Conference on the Evolution of Language (EVOLANG XV), Madison, WI, USA, May 18–21, 2024.
Emergent grammar from a minimal cognitive architecture
2024 (English). In: The Evolution of Language: Proceedings of the 15th International Conference (Evolang XV) / [ed] Nölle, J.; Raviv, L.; Graham, K. E.; Hartmann, S.; Jadoul, Y.; Josserand, M.; Matzinger, T.; Mudd, K.; Pleyer, M.; Slonimska, A.; Wacewicz, S.; Watson, S., 2024. Conference paper, Published paper (Refereed).
Abstract [en]

In this paper, we introduce a minimal cognitive architecture designed to explore the mechanisms underlying human language learning abilities. Our model, inspired by research in artificial intelligence, incorporates sequence memory, chunking and schematizing as key domain-general cognitive mechanisms. It combines an emergentist approach with the generativist theory of type systems. By modifying the type system to operationalize theories on usage-based learning and emergent grammar, we build a bridge between theoretical paradigms that are usually considered incompatible. Using a minimal error-correction reinforcement learning approach, we show that our model is able to extract functional grammatical systems from limited exposure to small artificial languages. Our results challenge the need for complex predispositions for language and offer a promising path for further development in understanding cognitive prerequisites for language and the emergence of grammar during learning.

Keywords
cognitive architecture, sequence memory, language learning, language evolution
National Category
General Language Studies and Linguistics
Research subject
Linguistics
Identifiers
urn:nbn:se:su:diva-232813 (URN); 10.17617/2.3587960 (DOI)
Conference
15th International Conference on the Evolution of Language (EVOLANG XV), Madison, WI, USA, May 18–21, 2024
Funder
Swedish Research Council, 2022-02737
Available from: 2024-08-26. Created: 2024-08-26. Last updated: 2025-01-03. Bibliographically approved.
Jon-And, A. & Michaud, J. (2024). Usage-based Grammar Induction from Minimal Cognitive Principles. Computational Linguistics, 50(4), 1375–1414.
Usage-based Grammar Induction from Minimal Cognitive Principles
2024 (English). In: Computational Linguistics, ISSN 0891-2017, E-ISSN 1530-9312, Vol. 50, no. 4, p. 1375–1414. Article in journal (Refereed). Published.
Abstract [en]

This study explores the cognitive mechanisms underlying human language acquisition through grammar induction by a minimal cognitive architecture, with a short and flexible sequence memory as its most central feature. We use reinforcement learning for the task of identifying sentences in a stream of words from artificial languages. Results demonstrate the model’s ability to identify frequent and informative multi-word chunks, reproducing characteristics of natural language acquisition. The model successfully navigates varying degrees of linguistic complexity, revealing efficient adaptation to combinatorial challenges through the reuse of sequential patterns. The emergence of parsimonious tree structures suggests an optimization for the sentence identification task, balancing economy and information. The cognitive architecture reflects aspects of human memory systems and decision-making processes, enhancing its cognitive plausibility. While the model exhibits limitations in generalization and semantic representation, its minimalist nature offers insights into some fundamental mechanisms of language learning. Our study demonstrates the power of this simple architecture and stresses the importance of sequence memory in language learning. Since other animals do not seem to have faithful sequence memory, this may be a key to understanding why only humans have developed complex languages.
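As a rough illustration of frequency-based multi-word chunk extraction, here is a minimal sketch. It is a simplified stand-in, not the published model: the tiny artificial language, the bigram-counting approach, and the threshold are invented for the example (the actual model uses reinforcement learning over a sequence memory).

```python
# Hypothetical minimal sketch, NOT the published model: treat word bigrams
# that recur often in a stream as candidate chunks, a crude stand-in for
# chunking over a sequence memory.
from collections import Counter
from itertools import chain

def find_chunks(stream, min_count=3):
    """Return word bigrams frequent enough to treat as a single chunk."""
    bigrams = Counter(zip(stream, stream[1:]))
    return {bg for bg, n in bigrams.items() if n >= min_count}

# Tiny invented artificial language: "the dog" and "a cat" recur as units,
# while every other adjacent word pair occurs only once.
sentences = [["the", "dog", "runs"], ["a", "cat", "sleeps"],
             ["the", "dog", "sleeps"], ["a", "cat", "runs"],
             ["the", "dog", "barks"], ["a", "cat", "purrs"]]
stream = list(chain.from_iterable(sentences))
chunks = find_chunks(stream)   # only the recurring pairs survive the threshold
```

The point of the sketch is only that frequent, informative adjacent pairs separate cleanly from incidental ones even in a very small corpus, which is the kind of signal the abstract's chunking mechanism exploits.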

Keywords
Grammar induction, cognitive architecture, usage-based learning, sequence representation
National Category
Comparative Language Studies and Linguistics
Research subject
Linguistics; Computational Linguistics
Identifiers
urn:nbn:se:su:diva-241705 (URN); 10.1162/coli_a_00528 (DOI); 001381505900006 ()
Available from: 2025-04-03. Created: 2025-04-03. Last updated: 2025-04-04. Bibliographically approved.
Jon-And, A. & Michaud, J. (2020). Minimal Prerequisites for Processing Language Structure: A Model Based on Chunking and Sequence Memory. In: Andrea Ravignani; Chiara Barbieri; Molly Flaherty; Yannick Jadoul; Ella Lattenkamp (Ed.). Paper presented at The 13th International Conference on the Evolution of Language (EvoLang13), Brussels, Belgium, April 14–17, 2020.
Minimal Prerequisites for Processing Language Structure: A Model Based on Chunking and Sequence Memory
2020 (English). In: / [ed] Andrea Ravignani; Chiara Barbieri; Molly Flaherty; Yannick Jadoul; Ella Lattenkamp, 2020. Conference paper, Poster (with or without abstract) (Refereed).
Abstract [en]

In this paper, we address the question of what minimal cognitive features are necessary for learning to process and extract grammatical structure from language. We build a minimalistic computational model containing only the two core features, chunking and sequence memory, and test its capacity to identify sentence borders and parse sentences in two artificial languages. The model has no prior linguistic knowledge and learns only by reinforcement of the identification of meaningful units. In simulations, the model turns out to be successful at its tasks, indicating that it is a good starting point for an extended model with the ability to process and extract grammatical structure from larger corpora of natural language. We conclude that a model with the features chunking and sequence memory, which should in the future be complemented with the ability to establish hierarchical schemas, has the potential to describe the emergence of grammatical categories through language learning.

National Category
General Language Studies and Linguistics
Research subject
General Linguistics
Identifiers
urn:nbn:se:su:diva-227645 (URN)
Conference
The 13th International Conference on the Evolution of Language (EvoLang13), Brussels, Belgium, April 14-17, 2020
Available from: 2024-03-24. Created: 2024-03-24. Last updated: 2024-04-02. Bibliographically approved.