Novelty and Diversity in Information Retrieval Evaluation

Resource type
Conference Paper
Authors/contributors
Clarke, C. L. A.; Kolla, M.; Cormack, G. V.; Vechtomova, O.; Ashkan, A.; Büttcher, S.; MacKinnon, I.
Title
Novelty and Diversity in Information Retrieval Evaluation
Abstract
Evaluation measures act as objective functions to be optimized by information retrieval systems. Such objective functions must accurately reflect user requirements, particularly when tuning IR systems and learning ranking functions. Ambiguity in queries and redundancy in retrieved documents are poorly reflected by current evaluation measures. In this paper, we present a framework for evaluation that systematically rewards novelty and diversity. We develop this framework into a specific evaluation measure, based on cumulative gain. We demonstrate the feasibility of our approach using a test collection based on the TREC question answering track.
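Note: the cumulative-gain measure the abstract refers to rewards a document less for an information nugget the more often that nugget has already appeared earlier in the ranking. Below is a minimal Python sketch of that idea, assuming a geometric penalty of (1 − alpha) per repetition and a log2 rank discount; the function and parameter names are illustrative, and the paper's full measure additionally normalizes by the score of an ideal reordering.

```python
import math
from typing import Dict, List, Set

def novelty_dcg(ranked_nuggets: List[Set[str]], alpha: float = 0.5, depth: int = 10) -> float:
    """Discounted cumulative gain where repeated nuggets earn geometrically less.

    ranked_nuggets[k] is the set of information nuggets judged relevant for the
    document at rank k (0-based). A nugget already seen r times contributes
    (1 - alpha) ** r, so redundancy is penalized and diversity rewarded.
    """
    seen: Dict[str, int] = {}  # nugget -> number of earlier documents containing it
    score = 0.0
    for k, nuggets in enumerate(ranked_nuggets[:depth]):
        gain = sum((1.0 - alpha) ** seen.get(n, 0) for n in nuggets)
        for n in nuggets:
            seen[n] = seen.get(n, 0) + 1
        score += gain / math.log2(k + 2)  # rank discount: log2(1 + rank), 1-based
    return score

# Example: the second document repeats nugget "a", so it earns only (1 - alpha)
# of a full gain at that rank, while the novel nugget "c" at rank 3 earns full gain.
ranking = [{"a", "b"}, {"a"}, {"c"}]
print(novelty_dcg(ranking))
```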
Date
2008
Proceedings Title
Proceedings of the 31st Annual International ACM SIGIR Conference on Research and Development in Information Retrieval
Place
New York, NY, USA
Publisher
ACM
Pages
659–666
Series
SIGIR '08
Language
en
ISBN
978-1-60558-164-4
Accessed
1/27/19, 7:15 PM
Library Catalog
ACM Digital Library
Citation
Clarke, C. L. A., Kolla, M., Cormack, G. V., Vechtomova, O., Ashkan, A., Büttcher, S., & MacKinnon, I. (2008). Novelty and Diversity in Information Retrieval Evaluation. Proceedings of the 31st Annual International ACM SIGIR Conference on Research and Development in Information Retrieval, 659–666. https://doi.org/10.1145/1390334.1390446
Field of study
Information Retrieval (evaluation measures)
Contribution
Proposes an evaluation framework that systematically rewards novelty and diversity, instantiates it as a cumulative-gain-based measure, and demonstrates its feasibility on a test collection derived from the TREC question answering track.