Tokenizers

Fast, state-of-the-art tokenizers, optimized for both research and production

🤗 Tokenizers provides an implementation of today’s most used tokenizers, with a focus on performance and versatility. These tokenizers are also used in 🤗 Transformers.
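Among the tokenizers the library implements (in Rust) is Byte-Pair Encoding (BPE). As a rough illustration of what BPE training does, here is a minimal pure-Python sketch; the `train_bpe` helper is hypothetical and is not the library's API:

```python
from collections import Counter

def train_bpe(words, num_merges):
    # Hypothetical sketch of BPE training: repeatedly merge the most
    # frequent adjacent symbol pair. Not the 🤗 Tokenizers API.
    vocab = Counter()
    for w in words:
        vocab[tuple(w)] += 1  # start from individual characters
    merges = []
    for _ in range(num_merges):
        # Count adjacent symbol pairs across the corpus, weighted by word frequency.
        pairs = Counter()
        for symbols, freq in vocab.items():
            for a, b in zip(symbols, symbols[1:]):
                pairs[(a, b)] += freq
        if not pairs:
            break
        best = max(pairs, key=pairs.get)
        merges.append(best)
        # Rewrite every word, fusing occurrences of the chosen pair.
        new_vocab = Counter()
        for symbols, freq in vocab.items():
            merged, i = [], 0
            while i < len(symbols):
                if i + 1 < len(symbols) and (symbols[i], symbols[i + 1]) == best:
                    merged.append(symbols[i] + symbols[i + 1])
                    i += 2
                else:
                    merged.append(symbols[i])
                    i += 1
            new_vocab[tuple(merged)] += freq
        vocab = new_vocab
    return merges

merges = train_bpe(["low", "lower", "lowest", "low"], num_merges=2)
```

On this tiny corpus the first two merges fuse `l`+`o` and then `lo`+`w`, so frequent substrings like `low` become single tokens. The real library performs this training loop in Rust, which is what makes it fast enough for production corpora.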

Main features:

- Train new vocabularies and tokenize, using today's most used tokenizers.
- Extremely fast (both training and tokenization), thanks to the Rust implementation: it takes less than 20 seconds to tokenize a GB of text on a server's CPU.
- Easy to use, but also extremely versatile.
- Designed for both research and production.
- Normalization comes with alignment tracking: it is always possible to get the part of the original sentence that corresponds to a given token.
- Does all the pre-processing: truncating, padding, and adding the special tokens your model needs.
