We Compare AI

Token

Core Concepts
Simple Definition

The basic unit of text that AI language models process — roughly equivalent to 3/4 of a word in English.

Full Explanation

Tokenization breaks text into smaller pieces (tokens) before an LLM processes it. One token is approximately 4 characters or 0.75 words in English, so 1 million tokens is roughly 750,000 words, or about 1,500 pages. Because API pricing is measured per token, knowing token counts helps estimate costs.

Example

The sentence 'Hello, how are you?' is approximately 6 tokens.
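The 4-characters-per-token rule of thumb above can be sketched as a rough Python estimator. This is a heuristic only: exact counts require the model's actual tokenizer (e.g. tiktoken for OpenAI models), and the function names here are illustrative.

```python
def estimate_tokens(text: str) -> int:
    """Approximate token count using the ~4 characters-per-token
    heuristic for English text."""
    return max(1, round(len(text) / 4))

def estimate_cost(text: str, price_per_million_tokens: float) -> float:
    """Estimate API cost for a piece of text at a given
    per-million-token price."""
    return estimate_tokens(text) / 1_000_000 * price_per_million_tokens

sentence = "Hello, how are you?"
# 19 characters -> heuristic estimates ~5 tokens;
# a real tokenizer counts about 6 for this sentence.
print(estimate_tokens(sentence))
```

The estimate will drift for code, non-English text, or unusual punctuation, where tokenizers split text quite differently; for budgeting purposes it is usually close enough.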

Last verified: 2026-03-30