SemEval 2020

From Openresearch
International Workshop on Semantic Evaluation 2020
Event in series SemEval
Dates: 2020/12/12 - 2020/12/13
Homepage: http://alt.qcri.org/semeval2020/
Location: Barcelona, Spain

Committees
Organizers: Aurelie Herbelot, Xiaodan Zhu, Nathan Schneider, Alexis Palmer, Jonathan May, Ekaterina Shutova

International Workshop on Semantic Evaluation 2020

  • NEW: Task proposal submission deadline extended to April 3, 2020
  • NEW: Paper submission deadlines extended to May 15 (system description) and May 22 (task description)

SemEval has evolved from the SensEval word sense disambiguation evaluation series. The SemEval Wikipedia entry and the ACL SemEval Wiki provide a more detailed historical overview. SemEval-2020 will be the 14th workshop on semantic evaluation.

SemEval-2020 will be held December 12-13, 2020 in Barcelona, Spain, co-located with the 28th International Conference on Computational Linguistics (COLING-2020).

Important Dates

  • Task proposals due: April 3, 2020 (UTC-12) -- updated March 19, 2020
  • Task selection notification: May 25, 2020

Setup for the Competition:

  • 31 July 2019: CodaLab competition website ready and made public. Should include basic task description and mailing group information for the task. Trial data ready. Evaluation script ready for participants to download and run on the trial data.
  • 4 September 2019: Training data ready. Development data ready. CodaLab competition website updated to include an evaluation script uploaded as part of the competition. Participants should be able to upload submissions on the development set, and the script will immediately check the submission for format and compute results on the development set. This is also the date by which a benchmark system should be made available to participants. Also, the organizers should run the submission created with the benchmark system on CodaLab, so that participants can see its results on the LeaderBoard.

Competition and Beyond:

  • 19 February 2020: Evaluation start*
  • 11 March 2020: Evaluation end*
  • 18 March 2020: Results posted
  • 15 May 2020: System description paper submissions due (11:59pm, UTC-12)
  • 22 May 2020: Task description paper submissions due (11:59pm, UTC-12)
  • 24 Jun 2020: Author notifications
  • 8 Jul 2020: Camera ready submissions due (11:59pm, UTC-12)
  • 12-13 December 2020: SemEval 2020
  * 19 February to 11 March 2020 is the period during which the task organizers must schedule the evaluation windows for their individual tasks. Evaluation periods for individual tasks are usually 7 to 14 days, but there is no hard and fast rule. Contact the organizers of the tasks you are interested in for the exact time frame of their evaluations; they should tell you the date by which they will release the test data and the date by which participant submissions are to be uploaded. Note that some tasks may involve more than one sub-task, each with a separate evaluation time frame.

Organizers

  • Aurelie Herbelot, University of Trento
  • Xiaodan Zhu, Queen's University
  • Nathan Schneider, Georgetown University
  • Alexis Palmer, University of North Texas
  • Jonathan May, ISI, University of Southern California
  • Ekaterina Shutova, University of Amsterdam
Facts about "SemEval 2020"
Acronym: SemEval 2020
End date: December 13, 2020
Event in series: SemEval
Event type: Conference
Has coordinates: 41° 22' 58" N, 2° 10' 39" E (latitude 41.382894, longitude 2.177433)
Has coordinator: Aurelie Herbelot, Xiaodan Zhu, Nathan Schneider, Alexis Palmer, Jonathan May and Ekaterina Shutova
Has location city: Barcelona
Has location country: Spain
Homepage: http://alt.qcri.org/semeval2020/
IsA: Event
Start date: December 12, 2020
Title: International Workshop on Semantic Evaluation 2020