Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/107875
Title: BeLLM: backward dependency enhanced large language model for sentence embeddings
Authors: Li, X 
Li, J 
Issue Date: 2024
Source: In Proceedings of the 2024 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (Volume 1: Long Papers), p. 792–804, Mexico City, Mexico. Association for Computational Linguistics
Abstract: Sentence embeddings are crucial in measuring semantic similarity. Most recent studies employed large language models (LLMs) to learn sentence embeddings. Existing LLMs mainly adopt an autoregressive architecture without explicit backward dependency modeling. Therefore, we examined the effects of backward dependencies in LLMs for semantic similarity measurements. Concretely, we propose a novel model: the backward dependency enhanced large language model (BeLLM). It learns sentence embeddings by transforming specific attention layers from uni- to bi-directional. We conduct extensive experiments across various semantic textual similarity (STS) tasks and downstream applications. BeLLM achieves state-of-the-art performance in varying scenarios, showing that autoregressive LLMs benefit from backward dependencies for sentence embeddings.
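The core idea in the abstract, turning a uni-directional (causal) attention layer into a bi-directional one, amounts to replacing the lower-triangular attention mask with a full mask in the chosen layers. The following is a minimal NumPy sketch of that change, not the authors' actual implementation; the function and variable names are illustrative assumptions.

```python
import numpy as np

def attention(q, k, v, mask):
    """Scaled dot-product attention; mask[i, j] = True means i may attend to j."""
    scores = q @ k.T / np.sqrt(q.shape[-1])
    scores = np.where(mask, scores, -1e9)  # block disallowed positions
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v

seq_len, dim = 4, 8
rng = np.random.default_rng(0)
q = rng.standard_normal((seq_len, dim))
k = rng.standard_normal((seq_len, dim))
v = rng.standard_normal((seq_len, dim))

# Uni-directional: each token sees only itself and earlier tokens.
causal_mask = np.tril(np.ones((seq_len, seq_len), dtype=bool))
# Bi-directional: every token sees the whole sequence (backward dependency).
bidir_mask = np.ones((seq_len, seq_len), dtype=bool)

# In a layer stack, most layers would keep causal_mask; the layers
# selected for backward-dependency modeling switch to bidir_mask.
out_causal = attention(q, k, v, causal_mask)
out_bidir = attention(q, k, v, bidir_mask)
```

With the causal mask, the first position attends only to itself (its output is exactly `v[0]`); with the full mask it also aggregates information from later tokens, which is the backward dependency the paper adds.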
Publisher: Association for Computational Linguistics (ACL)
ISBN: 979-8-89176-114-8
Description: The 2024 Conference of the North American Chapter of the Association for Computational Linguistics, June 16-21, 2024, Mexico City
Rights: © 2024 Association for Computational Linguistics
Materials published in or after 2016 are licensed on a Creative Commons Attribution 4.0 International License (https://creativecommons.org/licenses/by/4.0/).
The following publication Xianming Li and Jing Li. 2024. BeLLM: Backward Dependency Enhanced Large Language Model for Sentence Embeddings. In Proceedings of the 2024 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (Volume 1: Long Papers), pages 792–804, Mexico City, Mexico. Association for Computational Linguistics is available at https://aclanthology.org/2024.naacl-long.45/.
Appears in Collections:Conference Paper

Files in This Item:
File: 2024.naacl-long.45.pdf (789.83 kB, Adobe PDF)
Open Access Information
Status: open access
File Version: Version of Record

Page views: 6 (as of Jul 21, 2024)

Downloads: 1 (as of Jul 21, 2024)


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.