Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/62391
Title: A context-dependent relevance model
Authors: Dang, E
Luk, RWP 
Allan, J
Keywords: Information retrieval
Issue Date: 2016
Publisher: John Wiley & Sons
Source: Journal of the Association for Information Science and Technology, 2016, v. 67, no. 3, p. 582-593
Journal: Journal of the Association for Information Science and Technology 
Abstract: Numerous past studies have demonstrated the effectiveness of the relevance model (RM) for information retrieval (IR). This approach enables relevance or pseudo-relevance feedback to be incorporated within the language modeling framework of IR. In the traditional RM, the feedback information is used to improve the estimate of the query language model. In this article, we introduce an extension of RM in the setting of relevance feedback. Our method provides an additional way to incorporate feedback via the improvement of the document language models. Specifically, we make use of the context information of known relevant and nonrelevant documents to obtain weighted counts of query terms for estimating the document language models. The context information is based on the words (unigrams or bigrams) appearing within a text window centered on query terms. Experiments on several Text REtrieval Conference (TREC) collections show that our context-dependent relevance model can improve retrieval performance over the baseline RM. Together with previous studies within the BM25 framework, our current study demonstrates that the effectiveness of our method for using context information in IR is quite general and not limited to any specific retrieval model.
URI: http://hdl.handle.net/10397/62391
ISSN: 2330-1635
EISSN: 2330-1643
DOI: 10.1002/asi.23419
Appears in Collections: Journal/Magazine Article
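
As a rough illustration of the mechanism the abstract describes, the sketch below shows one way context windows around query terms in relevant and nonrelevant feedback documents could yield weighted query-term counts for a Dirichlet-smoothed document language model. It uses unigram contexts only for brevity, and the window size, the weighting scheme (ALPHA, BETA), the helper names, and the smoothing parameter MU are all assumptions made for illustration; the paper's actual formulation may differ.

    from collections import Counter

    # Illustrative parameters -- the paper's actual values and weighting
    # scheme are not given in this record, so these are assumptions.
    WINDOW = 5     # half-width of the text window centered on a query term
    ALPHA = 1.0    # reward for context words also seen in relevant documents
    BETA = 1.0     # penalty for context words also seen in nonrelevant documents
    MU = 2000      # Dirichlet smoothing parameter, a common default in LM-based IR

    def context_counts(docs, query_terms, window=WINDOW):
        """Collect words appearing within a window centered on query-term occurrences."""
        counts = Counter()
        for doc in docs:
            for i, term in enumerate(doc):
                if term in query_terms:
                    lo, hi = max(0, i - window), min(len(doc), i + window + 1)
                    counts.update(w for j, w in enumerate(doc[lo:hi], start=lo) if j != i)
        return counts

    def weighted_query_term_counts(doc, query_terms, rel_ctx, nonrel_ctx, window=WINDOW):
        """Weight each query-term occurrence by how much its local window overlaps
        with context seen in relevant vs. nonrelevant feedback documents."""
        weighted = Counter()
        for i, term in enumerate(doc):
            if term not in query_terms:
                continue
            lo, hi = max(0, i - window), min(len(doc), i + window + 1)
            ctx = [w for j, w in enumerate(doc[lo:hi], start=lo) if j != i]
            score = sum(ALPHA * (w in rel_ctx) - BETA * (w in nonrel_ctx) for w in ctx)
            weighted[term] += max(0.0, 1.0 + score / (len(ctx) or 1))
        return weighted

    def document_lm(doc, query_terms, rel_ctx, nonrel_ctx, p_coll, mu=MU):
        """Dirichlet-smoothed document language model in which raw query-term
        counts are replaced by the context-weighted counts above."""
        counts = dict(Counter(doc))
        counts.update(weighted_query_term_counts(doc, query_terms, rel_ctx, nonrel_ctx))
        total = sum(counts.values())
        return {w: (c + mu * p_coll.get(w, 1e-9)) / (total + mu) for w, c in counts.items()}

    # Toy usage: feedback documents supply the context statistics.
    query = {"relevance", "model"}
    rel_ctx = context_counts([["the", "relevance", "model", "uses", "feedback"]], query)
    nonrel_ctx = context_counts([["the", "model", "railway", "club", "meets"]], query)
    doc = ["a", "relevance", "model", "for", "retrieval", "with", "feedback"]
    p_coll = {w: 1e-4 for w in doc}   # stand-in for the collection language model
    print(document_lm(doc, query, rel_ctx, nonrel_ctx, p_coll))

Under this toy weighting, a query-term occurrence whose surrounding words were also seen near query terms in relevant documents contributes more than one count, while one whose context resembles the nonrelevant documents contributes less, which is the intuition behind estimating the document language models from context-weighted counts.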

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.