Crowdsourcing For Relevance Evaluation

Omar Alonso, Daniel E. Rose, Benjamin Stewart

Relevance evaluation is an essential part of the development and maintenance of information retrieval systems. Yet traditional evaluation approaches have several limitations; in particular, conducting new editorial evaluations of a search system can be very expensive. We describe a new approach to evaluation called TERC, based on the crowdsourcing paradigm, in which many online users, drawn from a large community, each perform a small evaluation task.
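To illustrate the crowdsourcing paradigm the abstract describes, the sketch below shows how per-worker relevance judgments on (query, document) pairs might be collected and combined by simple majority vote. This is a minimal illustration only; the field names, labels, and majority-vote aggregation rule are assumptions for the example and are not details taken from the paper itself.

```python
from collections import Counter, defaultdict

# Hypothetical crowd judgments: each worker labels one (query, document) pair.
# Field names and the majority-vote rule are illustrative assumptions.
judgments = [
    {"query": "jaguar speed", "doc_id": "d1", "worker": "w1", "label": "relevant"},
    {"query": "jaguar speed", "doc_id": "d1", "worker": "w2", "label": "relevant"},
    {"query": "jaguar speed", "doc_id": "d1", "worker": "w3", "label": "not_relevant"},
    {"query": "jaguar speed", "doc_id": "d2", "worker": "w1", "label": "not_relevant"},
    {"query": "jaguar speed", "doc_id": "d2", "worker": "w2", "label": "not_relevant"},
]

def aggregate_by_majority(judgments):
    """Group labels per (query, doc_id) and keep the most frequent label."""
    grouped = defaultdict(list)
    for j in judgments:
        grouped[(j["query"], j["doc_id"])].append(j["label"])
    return {
        key: Counter(labels).most_common(1)[0][0]
        for key, labels in grouped.items()
    }

if __name__ == "__main__":
    for (query, doc_id), label in aggregate_by_majority(judgments).items():
        print(f"{query} / {doc_id}: {label}")
```

Each worker thus contributes only a handful of small judgments, and the aggregated labels stand in for the judgments a single trained editor would otherwise produce.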