Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/106697
Title: Neural generative models and the parallel architecture of language: a critical review and outlook
Authors: Rambelli, G
Chersoni, E 
Testa, D
Blache, P
Lenci, A
Issue Date: 2024
Source: Topics in Cognitive Science, First published: 18 April 2024, Early View, https://doi.org/10.1111/tops.12733
Abstract: According to the parallel architecture, syntactic and semantic information processing are two separate streams that interact selectively during language comprehension. While considerable effort in psycho- and neurolinguistics has gone into understanding how these processing mechanisms interact in human comprehension, the nature of this interaction in recent neural Large Language Models remains elusive. In this article, we revisit influential linguistic and behavioral experiments and evaluate the ability of a large language model, GPT-3, to perform these tasks. The model can solve semantic tasks independently of their syntactic realization, in a manner that resembles human behavior. However, the outcomes present a complex and variegated picture, leaving open the question of how Language Models could learn structured conceptual representations.
Keywords: Enriched composition
GPT-3 prompting
Neural large language models
Parallel architecture
Semantic composition
Statistical learning
Syntax-semantics interface
Publisher: Wiley-Blackwell Publishing, Inc.
Journal: Topics in Cognitive Science
ISSN: 1756-8757
EISSN: 1756-8765
DOI: 10.1111/tops.12733
Research Data: https://osf.io/c7f4u/?view_only=201206d3d8574e9b84429419be4587e7
Rights: © 2024 The Authors. Topics in Cognitive Science published by Wiley Periodicals LLC on behalf of Cognitive Science Society
This is an open access article under the terms of the Creative Commons Attribution-NonCommercial-NoDerivs License (http://creativecommons.org/licenses/by-nc-nd/4.0/), which permits use and distribution in any medium, provided the original work is properly cited, the use is non-commercial and no modifications or adaptations are made.
The following publication Rambelli, G., Chersoni, E., Testa, D., Blache, P. and Lenci, A. (2024), Neural Generative Models and the Parallel Architecture of Language: A Critical Review and Outlook. Top. Cogn. Sci. is available at https://doi.org/10.1111/tops.12733.
Appears in Collections: Journal/Magazine Article

Files in This Item:
File: Rambelli_Neural_Generative_Models.pdf
Size: 459.28 kB
Format: Adobe PDF
Open Access Information
Status: Open access
File Version: Version of Record