Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/72412
DC Field | Value | Language
dc.contributor | Department of Chinese and Bilingual Studies | en_US
dc.creator | Chersoni, E | en_US
dc.creator | Santus, E | en_US
dc.creator | Lenci, A | en_US
dc.creator | Blache, P | en_US
dc.creator | Huang, CR | en_US
dc.date.accessioned | 2018-01-31T07:28:17Z | -
dc.date.available | 2018-01-31T07:28:17Z | -
dc.identifier.uri | http://hdl.handle.net/10397/72412 | -
dc.language.iso | en | en_US
dc.publisher | Association for Computational Linguistics | en_US
dc.rights | ACL materials are Copyright © 1963–2021 ACL; other materials are copyrighted by their respective copyright holders. Materials prior to 2016 here are licensed under the Creative Commons Attribution-NonCommercial-ShareAlike 3.0 International License (https://creativecommons.org/licenses/by-nc-sa/3.0/). Permission is granted to make copies for the purposes of teaching and research. Materials published in or after 2016 are licensed on a Creative Commons Attribution 4.0 International License (https://creativecommons.org/licenses/by/4.0/). | en_US
dc.rights | The following publication Chersoni, E., Santus, E., Lenci, A., Blache, P., & Huang, C. R. (2016). Representing verbs with rich contexts: An evaluation on verb similarity. In Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing (pp. 1967-1972) is available at https://aclanthology.org/D16-1205 | en_US
dc.title | Representing verbs with rich contexts: an evaluation on verb similarity | en_US
dc.type | Conference Paper | en_US
dc.identifier.spage | 1967 | en_US
dc.identifier.epage | 1972 | en_US
dcterms.abstract | Several studies on sentence processing suggest that the mental lexicon keeps track of the mutual expectations between words. Current DSMs, however, represent context words as separate features, thereby losing important information for word expectations, such as word interrelations. In this paper, we present a DSM that addresses this issue by defining verb contexts as joint syntactic dependencies. We test our representation in a verb similarity task on two datasets, showing that joint contexts achieve performance comparable to, or even better than, single dependencies. Moreover, they are able to overcome the data sparsity problem of joint feature spaces, in spite of the limited size of our training corpus. | en_US
dcterms.accessRights | open access | en_US
dcterms.bibliographicCitation | Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, Austin, Texas, November 1-5, 2016, p. 1967-1972 | en_US
dcterms.issued | 2016 | -
dc.identifier.ros | 2016003192 | -
dc.relation.ispartofbook | Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing | en_US
dc.relation.conference | Conference on Empirical Methods in Natural Language Processing [EMNLP] | en_US
dc.identifier.rosgroupid | 2016003127 | -
dc.description.ros | 2016-2017 > Academic research: refereed > Refereed conference paper | en_US
dc.description.validate | bcwh | en_US
dc.description.oa | Version of Record | en_US
dc.identifier.FolderNumber | a0670-n03, CBS-0397 | en_US
dc.description.fundingSource | Self-funded | en_US
dc.description.pubStatus | Published | en_US
dc.identifier.OPUS | 21045315 | en_US
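The abstract contrasts standard DSMs, which treat each syntactic dependency of a verb as a separate feature, with the paper's joint contexts, which combine a verb's co-occurring dependencies (e.g. its subject and object together) into a single feature. A minimal toy sketch of that distinction follows; it is illustrative only, not the authors' implementation — the toy parses, the feature naming (`nsubj`, `dobj`), and the cosine helper are all assumptions for the example.

```python
from collections import Counter

# Toy dependency-parsed clauses: (verb, subject, direct object).
# These are invented examples, not data from the paper's corpus.
parses = [
    ("eat", "dog", "bone"),
    ("eat", "child", "apple"),
    ("devour", "dog", "bone"),
]

def single_dep_vectors(parses):
    """Standard DSM: every dependency is an independent feature,
    so the subject-object interrelation is lost."""
    vecs = {}
    for verb, subj, obj in parses:
        v = vecs.setdefault(verb, Counter())
        v[("nsubj", subj)] += 1
        v[("dobj", obj)] += 1
    return vecs

def joint_dep_vectors(parses):
    """Joint contexts in the spirit of the abstract: the subject and
    object form one combined feature, preserving which arguments
    co-occurred in the same clause."""
    vecs = {}
    for verb, subj, obj in parses:
        v = vecs.setdefault(verb, Counter())
        v[("nsubj+dobj", subj, obj)] += 1
    return vecs

def cosine(a, b):
    """Cosine similarity between two sparse count vectors."""
    keys = set(a) | set(b)
    dot = sum(a[k] * b[k] for k in keys)
    na = sum(x * x for x in a.values()) ** 0.5
    nb = sum(x * x for x in b.values()) ** 0.5
    return dot / (na * nb) if na and nb else 0.0

single = single_dep_vectors(parses)
joint = joint_dep_vectors(parses)

# The joint space is sparser: "eat" has 2 joint features vs. 4 single ones.
print(len(single["eat"]), len(joint["eat"]))
print(round(cosine(joint["eat"], joint["devour"]), 3))  # ≈ 0.707
```

The sketch also hints at the data-sparsity issue the abstract mentions: joint features are rarer than single dependencies, which is why overcoming sparsity on a limited corpus is a result worth reporting.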
Appears in Collections: Conference Paper
Files in This Item:
File: D16-1205.pdf | Size: 114.55 kB | Format: Adobe PDF
Open Access Information
Status: open access
File Version: Version of Record
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.