NMEIR 2008

ECIR 2008 Workshop on novel methodologies for evaluation in information retrieval
Dates: Mar 30, 2008 - Apr 3, 2008
Homepage: ecir2008.dcs.gla.ac.uk/i
Location: Glasgow, UK

Important dates
Submissions: Feb 4, 2008
Notification: Feb 20, 2008


Event


CALL FOR PAPERS

Workshop on novel methodologies for evaluation in information retrieval
At ECIR 2008


OBJECTIVES
Information retrieval is an empirical science; the field cannot move forward unless there are means of evaluating the innovations devised by researchers. However, the methodologies conceived in the early years of IR, and still used in today's evaluation campaigns, are starting to show their age, and new research is emerging on how to overcome the twin challenges of scale and diversity.

Scale
The methodologies used to build test collections in the modern evaluation campaigns were originally conceived for collections of tens of thousands of documents. They were found to scale well, but potential flaws are starting to emerge as test collections grow beyond tens of millions of documents. Continued research in this area is crucial if IR research is to keep evaluating large-scale search.

Diversity
With the rise of the large Web search engines, some believed that all search problems could be solved with a single engine retrieving from one vast data store. However, it is increasingly clear that retrieval is evolving not towards a monolithic solution, but towards a wide range of solutions tailored for different classes of information and different groups of users or organizations. Each tailored system requires a different mixture of component technologies combined in distinct ways, and each solution requires evaluation.

This workshop calls for research papers (max 8 pages) to be submitted on topics that address evaluation in Information Retrieval. Topics will include but are not limited to:
* test collection building for diverse needs
* new metrics and methodologies
* evaluation of multilingual IR and/or multimedia IR systems
* novel evaluation of related areas, such as QA or summarization
* evaluation of commercial systems
* novel forms of user-centered evaluation

Papers will be peer reviewed by members of the workshop Programme Committee. A preliminary list of the PC members is:

Paul Clough              University of Sheffield
Franciska de Jong        University of Twente
Thomas Deselaers         RWTH Aachen University
Norbert Fuhr             University of Duisburg
Gareth Jones             Dublin City University
Jussi Karlgren           Swedish Institute of Computer Science
Bernardo Magnini         ITC-irst
Paul McNamee             Johns Hopkins University
Henning Müller           University & University Hospitals of Geneva
Stephen Robertson        Microsoft Research
Tetsuya Sakai            National Institute of Informatics


SUBMISSION
Papers should be submitted as PDFs in ACM SIG Proceedings format:

         http://www.acm.org/sigs/publications/proceedings-templates

Submit final versions of papers to m.sanderson@shef.ac.uk



IMPORTANT DATES
Submission date: Monday 4 February
Notifications: by 20 February
Final copy: by 3 March


WORKSHOP ORGANISERS AND CONTACT DETAILS:
The Workshop Chair is Mark Sanderson. Co-organisers are Martin Braschler, Nicola Ferro and Julio Gonzalo.

The workshop will be sponsored by Treble-CLEF, a Coordination Action under FP7 that promotes R&D, evaluation and technology transfer in the multilingual information access domain.

This CfP was obtained from WikiCFP

Facts about "NMEIR 2008"
Acronym: NMEIR 2008
Title: ECIR 2008 Workshop on novel methodologies for evaluation in information retrieval
IsA: Event
Start date: March 30, 2008
End date: April 3, 2008
Submission deadline: February 4, 2008
Notification: February 20, 2008
Location city: Glasgow
Location country: UK
Coordinates: 55° 51' 40" N, 4° 14' 56" W (55.860983, -4.248878)
Homepage: http://ecir2008.dcs.gla.ac.uk/i