Experts ou (foule de) non-experts ? la question de l’expertise des annotateurs vue de la myriadisation (crowdsourcing)
Experts or (crowd of) non-experts? The question of the annotators' expertise viewed from crowdsourcing.

Manual corpus annotation is increasingly performed through crowdsourcing: produced by a crowd of people, over the Web, for free or for very low pay. Our experiments challenge the commonly accepted view: crowdsourcing a task does not mean having a crowd of non-experts perform it, but rather identifying, within the crowd, experts in the (annotation) task. These experiments therefore contribute to the reflection on corpus annotators' expertise.
| Main Author: | Karën Fort |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | Cercle linguistique du Centre et de l'Ouest - CerLICO, 2017-02-01 |
| Series: | Corela |
| ISSN: | 1638-573X |
| DOI: | 10.4000/corela.4835 |
| Subjects: | manual corpus annotation; experts; crowdsourcing |
| Online Access: | https://journals.openedition.org/corela/4835 |