
Using the Correct MAP to Find Success: A Case Study of One Company's Enterprise Search Evaluation Metric.

by McClarnon, Maureen T.

Abstract (Summary)
Information retrieval services can be difficult to evaluate because relevance is difficult to define, much less quantify. Mean Average Precision (MAP), a measure popularized by the Text Retrieval Conferences (TREC) of the past dozen-plus years, relies on relevance measures defined by recall and precision, but it omits valuable information about system performance when it averages individual topic precision. A better measure is needed, especially in the corporate context, where pressure to optimize enterprise search system performance is heavy; the real challenge is finding the best one. This paper analyzes the use of a variation of Mean Average Precision and evaluates its appropriateness as a metric for measuring the success rate of the enterprise (intranet) search service at SomaPharm, Inc., a pharmaceutical company. The results indicate that the current metric worked well for some topics, but that the averaging properties and the threshold component were less effective for other topics and did not adequately represent users' needs.
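As a point of reference, average precision for a single topic averages the precision values observed at the ranks of that topic's relevant documents, and MAP then averages those per-topic scores across all topics, which is how per-topic differences get hidden. The following is a minimal sketch of that calculation, assuming binary relevance judgments and ranked result lists; the function and variable names are illustrative, not the thesis's own code.

    # Minimal sketch of (Mean) Average Precision with binary relevance judgments.
    # Names here are illustrative, not taken from the thesis.

    def average_precision(ranked_relevance, total_relevant):
        """ranked_relevance: 0/1 flags in rank order for one topic.
        total_relevant: number of relevant documents known for that topic."""
        if total_relevant == 0:
            return 0.0
        hits = 0
        precision_sum = 0.0
        for rank, is_relevant in enumerate(ranked_relevance, start=1):
            if is_relevant:
                hits += 1
                precision_sum += hits / rank  # precision at this relevant hit
        return precision_sum / total_relevant

    def mean_average_precision(per_topic):
        """per_topic: list of (ranked_relevance, total_relevant) pairs.
        The averaging across topics is what masks per-topic variation."""
        return sum(average_precision(r, n) for r, n in per_topic) / len(per_topic)

    # Example: a strong topic (AP = 1.0) and a weak one (AP ~= 0.42)
    # collapse into a single MAP of ~0.71.
    print(mean_average_precision([([1, 1, 0, 0], 2), ([0, 0, 1, 1], 2)]))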
Bibliographical Information:

Advisor: Jane Greenberg

School: University of North Carolina at Chapel Hill

School Location: USA - North Carolina

Source Type: Master's Thesis

Keywords: information retrieval, evaluation, case studies, mean average precision, relevance judgments, text retrieval conference


Date of Publication: 11/22/2005
