
dc.contributor.author: Schulz, Claudia
dc.contributor.author: Meyer, Christian M.
dc.contributor.author: Kiesewetter, Jan
dc.contributor.author: Sailer, Michael
dc.contributor.author: Bauer, Elisabeth
dc.contributor.author: Fischer, Martin R.
dc.contributor.author: Fischer, Frank
dc.contributor.author: Gurevych, Iryna
dc.date.accessioned: 2019-05-28T12:49:45Z
dc.date.available: 2019-05-28T12:49:45Z
dc.date.issued: 2019-07
dc.identifier.uri: https://tudatalib.ulb.tu-darmstadt.de/handle/tudatalib/2001
dc.description: Many complex discourse-level tasks can aid domain experts in their work but require costly expert annotations for data creation. To speed up and ease annotations, we investigate the viability of automatically generated annotation suggestions for such tasks. As an example, we choose a task that is particularly hard for both humans and machines: the segmentation and classification of epistemic activities in diagnostic reasoning texts. We create and publish a new dataset covering two domains and carefully analyse the suggested annotations. We find that suggestions have positive effects on annotation speed and performance, while not introducing noteworthy biases. Envisioning suggestion models that improve with newly annotated texts, we contrast methods for continuous model adjustment and suggest the most effective setup for suggestions in future expert tasks. [en_US]
dc.language.iso: de [en_US]
dc.rights: in Copyright
dc.rights.uri: https://rightsstatements.org/vocab/InC/1.0/
dc.subject: Diagnostic Reasoning [en_US]
dc.subject: Annotation Suggestion [en_US]
dc.subject: Interactive Machine Learning [en_US]
dc.subject: FAMULUS [en_US]
dc.subject.ddc: 000 Informatik, Informationswissenschaft, allgemeine Werke [en_US]
dc.title: Analysis of Automatic Annotation Suggestions for Hard Discourse-Level Tasks in Expert Domains [en_US]
dc.type: Text [en_US]
tud.tubiblio: 113668 [en_US]
tud.tubiblio: 109926



