Token

Smallest meaningful text unit for AI processing

Models · Technical
Updated 2 May 2025 · Reviewed

Definition

The smallest meaningful unit of text processed by an AI language model, such as a word, sub-word, or punctuation mark.
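To make the idea concrete, here is a minimal sketch of splitting text into word and punctuation tokens. This is purely illustrative: real language models use learned sub-word vocabularies (such as byte-pair encoding), so their token boundaries differ from this naive split.

```python
import re

def simple_tokenize(text):
    # Naive illustration: treat each word and each punctuation
    # mark as a separate token. Real model tokenizers use learned
    # sub-word vocabularies, so their boundaries differ.
    return re.findall(r"\w+|[^\w\s]", text)

tokens = simple_tokenize("Tokens aren't always whole words.")
# → ['Tokens', 'aren', "'", 't', 'always', 'whole', 'words', '.']
```

Note how the apostrophe splits "aren't" into three tokens — a small example of why token counts rarely match word counts.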
