
AI Glossary

AI conversations are full of terms that sound intimidating but are simpler than they seem. This glossary breaks down every concept in plain English.

Each entry includes a one-line definition, an “explain like I’m 12” analogy, real-world context, and links to related terms and deeper guides.

41 terms


T

Temperature

aka: Sampling Temperature, Randomness, Creativity Setting

A setting that controls how creative or random an AI's responses are. Low temperature produces predictable, focused answers. High temperature produces varied, more creative outputs.
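A minimal sketch of how temperature works under the hood, using toy logits (the raw scores a model assigns to each possible next token). Dividing the logits by the temperature before converting them to probabilities is the standard mechanism; the specific numbers here are illustrative only:

```python
import math
import random

def sample_with_temperature(logits, temperature):
    """Scale logits by 1/temperature, convert to probabilities
    (softmax), then randomly sample one index."""
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    return random.choices(range(len(probs)), weights=probs, k=1)[0]
```

With a low temperature (say 0.1), the highest-scoring option dominates and the same token is picked almost every time; with a high temperature (say 2.0), the probabilities flatten out and lower-scoring options get picked far more often.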

Token

aka: Tokens, Tokenization

A chunk of text — usually a word or part of a word — that AI models process as a single unit. Most English words are one token, but longer or uncommon words get split into pieces.

Tokenizer

aka: Tokenisation, Token Encoding

A tool that breaks text into smaller pieces (tokens) that an AI model can process. Different models use different tokenizers, affecting how they count and understand text.
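A toy illustration of the idea, assuming a tiny hand-picked vocabulary and a greedy longest-match rule. Real tokenizers (such as BPE-based ones) build their vocabularies from data and are more sophisticated, but the splitting behaviour is similar in spirit:

```python
def tokenize(text, vocab):
    """Greedy longest-match tokenization against a fixed vocabulary.
    Characters with no vocabulary match become single-character tokens."""
    tokens = []
    i = 0
    while i < len(text):
        # Try the longest possible substring first, shrinking until a match.
        for j in range(len(text), i, -1):
            if text[i:j] in vocab:
                tokens.append(text[i:j])
                i = j
                break
        else:
            tokens.append(text[i])  # unknown character: emit it as-is
            i += 1
    return tokens
```

For example, with the vocabulary `{"token", "iz", "er", "ization"}`, the word "tokenizer" splits into three tokens (`token`, `iz`, `er`), while "tokenization" splits into two (`token`, `ization`) — which is why token counts differ from word counts.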

Tool (Function Calling)

aka: Function, Function Calling, Tool Use

A capability that allows an AI model to call external functions or APIs — like searching the web, querying databases, or running calculations — instead of relying solely on its training data.
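A simplified sketch of the application side of tool use: the model emits a structured request (JSON here), and the host application looks up and runs the matching function. The tool names and JSON shape below are hypothetical; each AI provider defines its own format:

```python
import json

# Toy registry of functions the model is allowed to call
# (hypothetical tools, for illustration only).
TOOLS = {
    "add": lambda args: args["a"] + args["b"],
    "uppercase": lambda args: args["text"].upper(),
}

def handle_tool_call(message):
    """Execute a model-issued tool call expressed as JSON, e.g.
    {"tool": "add", "arguments": {"a": 2, "b": 3}}, and return the result."""
    call = json.loads(message)
    tool = TOOLS[call["tool"]]
    return tool(call["arguments"])
```

The result is then passed back to the model as part of the conversation, letting it incorporate live information it could never get from training data alone.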

Top-p (Nucleus Sampling)

aka: Nucleus Sampling, Top-p Sampling

A parameter that controls randomness in AI text generation by choosing from the smallest set of words whose combined probability reaches a threshold p. Lower values make output more focused; higher values make it more creative.
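The filtering step can be sketched in a few lines: sort the options by probability, keep adding them until their cumulative probability reaches p, then renormalise and sample only from that "nucleus". The probabilities below are made up for illustration:

```python
def top_p_filter(probs, p):
    """Keep the smallest set of highest-probability choices whose
    cumulative probability reaches p, then renormalise them to sum to 1."""
    order = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)
    kept, total = [], 0.0
    for i in order:
        kept.append(i)
        total += probs[i]
        if total >= p:
            break  # the nucleus is complete
    return {i: probs[i] / total for i in kept}
```

With probabilities `[0.5, 0.3, 0.15, 0.05]` and p = 0.7, only the first two options survive; the unlikely long tail is cut off entirely, which is how top-p avoids the occasional bizarre word choice while still allowing variety.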

Training

aka: Model Training, AI Training

The process of feeding large amounts of data to an AI system so it learns patterns, relationships, and rules, enabling it to make predictions or generate output.
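A miniature version of the idea, fitting a one-parameter model (y = w·x) to example data by gradient descent. Real AI training uses billions of parameters and far more data, but the loop — guess, measure the error, nudge the parameters, repeat — is the same:

```python
def train_line(data, steps=1000, lr=0.01):
    """Fit y = w * x via gradient descent on mean squared error:
    a toy 'training run' over (x, y) example pairs."""
    w = 0.0
    for _ in range(steps):
        # Gradient of the mean squared error with respect to w.
        grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
        w -= lr * grad  # nudge w to reduce the error
    return w
```

Given the examples `[(1, 2), (2, 4), (3, 6)]`, the learned weight converges to 2 — the model has "learned" the pattern y = 2x from the data rather than being told it.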

Training Data

aka: Training Set, Training Dataset, Training Corpus

The collection of examples an AI system learns from. The quality, quantity, and diversity of training data directly determines what the AI can and cannot do.

Transformer

aka: Transformer Architecture, Transformer Model

A neural network architecture that revolutionised AI by using attention mechanisms to understand relationships between all words in a text simultaneously, enabling modern LLMs like GPT and Claude.
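The attention mechanism at the heart of a Transformer can be sketched in plain Python: each query vector is compared against every key vector, the scores are turned into weights, and the output is a weighted average of the value vectors. This is a bare-bones single-head version with toy vectors; real models run many such heads in parallel over learned projections:

```python
import math

def attention(queries, keys, values):
    """Scaled dot-product attention: each query attends to every key
    at once, producing a weighted average of the values."""
    d = len(keys[0])  # key dimension, used for score scaling
    out = []
    for q in queries:
        # Similarity of this query to every key, scaled by sqrt(d).
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d)
                  for k in keys]
        # Softmax turns scores into weights that sum to 1.
        m = max(scores)
        exps = [math.exp(s - m) for s in scores]
        total = sum(exps)
        weights = [e / total for e in exps]
        # Output: weighted average of the value vectors.
        out.append([sum(w * v[j] for w, v in zip(weights, values))
                    for j in range(len(values[0]))])
    return out
```

Because every query looks at every key in a single pass, the model relates all words in a text simultaneously instead of reading strictly left to right, which is what made the architecture such a leap.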
