Tokenization may refer to:
- Tokenization (lexical analysis) in language processing
- Tokenization in search engine indexing
- Tokenization (data security), the substitution of sensitive data with non-sensitive tokens
- Word segmentation
- Transformer (deep learning architecture), which operates on tokenized input
See also
- Tokenism of minorities