Measuring the Correctness of Double-Keying: Error Classification and Quality Control in a Large Corpus of TEI-Annotated Historical Text
Among mass digitization methods, double-keying is considered the one with the lowest error rate. The method requires two independent transcriptions of a text by two different operators. It is particularly well suited to historical texts, which often exhibit deficiencies such as poor master copies …
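The abstract describes comparing two independently keyed transcriptions to catch errors. As a rough illustration only (not drawn from the article, and with hypothetical function and sample names), the sketch below uses Python's difflib to flag lines where two keyings of the same page disagree, so they can be sent for adjudication.

```python
import difflib


def flag_discrepancies(keying_a: str, keying_b: str) -> list[str]:
    """Return a unified diff of two independent keyings of the same page.

    Lines that appear in the diff are the places where the two operators
    disagree and would need manual adjudication.
    """
    return list(difflib.unified_diff(
        keying_a.splitlines(),
        keying_b.splitlines(),
        fromfile="operator_a",
        tofile="operator_b",
        lineterm="",
    ))


if __name__ == "__main__":
    # Hypothetical sample: the two operators disagree on one historical spelling.
    keying_a = "Vorrede.\nVon der Richtigkeit der Abschrift."
    keying_b = "Vorrede.\nVon der Richtigkeit der Abschrifft."
    for line in flag_discrepancies(keying_a, keying_b):
        print(line)
```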
| Main Authors | Susanne Haaf, Frank Wiegand, Alexander Geyken |
|---|---|
| Format | Article |
| Language | German (deu) |
| Published | Text Encoding Initiative Consortium, 2015-03-01 |
| Series | Journal of the Text Encoding Initiative |
| Online Access | https://journals.openedition.org/jtei/739 |
Similar Items
- The DTA “Base Format”: A TEI Subset for the Compilation of a Large Reference Corpus of Printed Text from Multiple Sources, by Susanne Haaf, et al. (2015-04-01)
- Effect of synchronization system errors on the reception noise immunity of amplitude-phase shift keyed signals, by G. V. Kulikov, et al. (2023-06-01)
- Syntactical Error Analysis on Report Text, by Soraya Grabiella Dinamika, et al. (2019-08-01)
- Texts and Documents: New Challenges for TEI Interchange and Lessons from the Shelley-Godwin Archive, by Trevor Muñoz, et al. (2015-09-01)
- The efficiency of tracking gold exchange-traded funds in the Tehran Stock Exchange: analysis of tracking error and double beta model, by Meysam Kaviani, et al. (2024-12-01)