ByT5: Towards a Token-Free Future with Pre-trained Byte-to-Byte Models

Bibliographic Details
Main Authors: Linting Xue, Aditya Barua, Noah Constant, Rami Al-Rfou, Sharan Narang, Mihir Kale, Adam Roberts, Colin Raffel
Format: Article
Language: English
Published: The MIT Press, 2022-03-01
Series: Transactions of the Association for Computational Linguistics
ISSN: 2307-387X
Online Access: http://dx.doi.org/10.1162/tacl_a_00461