SemEval 2019

From Openresearch
International Workshop on Semantic Evaluation 2019
Event in series SemEval
Dates: 2019/06/06 - 2019/06/07
Homepage: http://alt.qcri.org/semeval2019/
Location: Minneapolis, USA

SemEval evolved from the SensEval word sense disambiguation evaluation series. The SemEval Wikipedia entry and the ACL SemEval wiki provide a more detailed historical overview. SemEval-2019 will be the 13th workshop on semantic evaluation.

SemEval-2019 will be held June 6-7, 2019 in Minneapolis, USA, co-located with the Annual Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (NAACL-HLT 2019).

Important Dates

Task Proposals:

  • 26 Mar 2018: Task proposals due
  • 04 May 2018: Task proposal notifications

Setup for the Competition:

  • 20 Aug 2018: CodaLab competition website ready and made public. It should include a basic task description and mailing group information for the task. Trial data ready. Evaluation script ready for participants to download and run on the trial data.
  • 17 Sep 2018: Training data ready. Development data ready. CodaLab competition website updated to include an evaluation script uploaded as part of the competition, so that participants can upload submissions on the development set and the script immediately checks each submission for format and computes its results on the development set. This is also the date by which a benchmark system should be made available to participants. The organizers should also run the submission created with the benchmark system on CodaLab, so that participants can see its results on the leaderboard.

Competition and Beyond:

  • 10 Jan 2019: Evaluation start*
  • 31 Jan 2019: Evaluation end*
  • 05 Feb 2019: Results posted
  • 28 Feb 2019: System and Task description paper submissions due by 23:59 GMT -12:00
  • 14 Mar 2019: Paper reviews due (for both systems and tasks)
  • 06 Apr 2019: Author notifications
  • 20 Apr 2019: Camera ready submissions due
  • Summer 2019: SemEval 2019
*10 Jan to 31 Jan 2019 is the window within which task organizers must schedule the evaluation periods for their individual tasks. Evaluation periods for individual tasks are usually 7 to 14 days, but there is no hard and fast rule. Contact the organizers of the tasks you are interested in for the exact time frame of their evaluations; they should tell you the date by which they will release the test data and the date by which participant submissions must be uploaded. Note that some tasks may involve more than one sub-task, each with a separate evaluation time frame.

Organizers

  • Jonathan May, ISI, University of Southern California
  • Ekaterina Shutova, University of Cambridge
  • Aurelie Herbelot, University of Trento
  • Xiaodan Zhu, Queen's University
  • Marianna Apidianaki, LIMSI, CNRS, Université Paris-Saclay & University of Pennsylvania
  • Saif M. Mohammad, National Research Council Canada
Facts about "SemEval 2019"

  • Acronym: SemEval 2019
  • End date: June 7, 2019
  • Event in series: SemEval
  • Event type: Conference
  • Has coordinates: 44° 58' 38", -93° 15' 56" (Latitude: 44.9773, Longitude: -93.265469444444)
  • Has location city: Minneapolis
  • Has location country: USA
  • Homepage: http://alt.qcri.org/semeval2019/
  • IsA: Event
  • Start date: June 6, 2019
  • Title: International Workshop on Semantic Evaluation 2019