Definition

Token

The basic unit of text that AI models process (roughly 4 characters).

Full Definition

A token is the basic unit of text that language models process. Tokens are typically word pieces: common words are often single tokens, while longer or uncommon words are split into several. On average, one token corresponds to roughly 4 characters or 0.75 words of English text. Understanding tokens helps explain AI processing limits and costs, since context windows and pricing are measured in tokens rather than words. For brand monitoring, knowing that AI processes text as tokens helps explain why consistent brand names and terminology improve AI recognition: a name that is always spelled the same way maps to the same token sequence every time.
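The rules of thumb above (one token per ~4 characters, or per ~0.75 words) can be turned into a quick estimator. This is a minimal sketch, not a real tokenizer: exact counts depend on the model's vocabulary, and the function names here are illustrative, not part of any library.

```python
def estimate_tokens_by_chars(text: str) -> int:
    """Rough token estimate using the ~4 characters per token heuristic."""
    return max(1, round(len(text) / 4))


def estimate_tokens_by_words(text: str) -> int:
    """Alternative estimate using the ~0.75 words per token heuristic,
    i.e. tokens ~= words / 0.75."""
    words = len(text.split())
    return max(1, round(words / 0.75))


sample = "Understanding tokens helps explain AI processing limits."
print(estimate_tokens_by_chars(sample))   # character-based estimate
print(estimate_tokens_by_words(sample))   # word-based estimate
```

Both heuristics are approximations for English prose; for exact counts you would use the tokenizer shipped with the specific model.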

