Crowdsourcing for IR: experimentation guidelines, challenges & opportunities

Staff - Faculty of Informatics

Start date: 28 September 2009

End date: 29 September 2009

The Faculty of Informatics is pleased to announce a seminar given by Dr. Omar Alonso.

DATE: Monday, September 28th, 2009
PLACE: USI Università della Svizzera italiana, room SI-006, Informatics building (Via G. Buffi 13)
TIME: 15.30

ABSTRACT:
In Information Retrieval (IR) and Web search, relevance is a central notion, and relevance evaluation is an essential part of developing and maintaining IR systems. Traditional evaluation approaches have several limitations and can be very expensive. Crowdsourcing has emerged as a new trend in which many online users, drawn from a large community, each perform a small evaluation task. I will discuss how different relevance-related tasks can be outsourced to the crowd, and I will show the findings of three projects that cover different aspects of relevance: criteria, document snippets, and TREC topics. Common to all three is the use of Amazon Mechanical Turk (AMT) as a crowdsourcing platform for conducting the experiments. I will present challenges and open problems in designing such experiments, give an overview of some of the crowdsourcing efforts at MPI, and finally present a database perspective on the crowdsourcing paradigm.

HOST: Prof. Fabio Crestani