
AI Glossary

Quick, friendly definitions of AI terms from A to Z.


T

Temperature

aka: Sampling Temperature, Randomness

A setting that controls how creative or random AI outputs are. Low = predictable and focused. High = creative and varied.
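In practice, temperature rescales the model's output scores (logits) before sampling. A minimal sketch in plain Python, using toy logits rather than a real model:

```python
import math
import random

def sample_with_temperature(logits, temperature=1.0):
    """Scale logits by 1/temperature, softmax, then sample an index.

    Low temperature sharpens the distribution (predictable output);
    high temperature flattens it (more varied output).
    """
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    return random.choices(range(len(logits)), weights=probs)[0]

# With a very low temperature, the highest-scoring option wins almost always.
sample_with_temperature([10.0, 0.0, 0.0], temperature=0.01)
```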

Token

aka: Tokens, Tokenization

A chunk of text (usually a word or part of a word) that AI processes. 'Chatbot' might be one token or split into 'chat' and 'bot'.

Tokenizer

aka: Tokenization, Token Encoding

A tool that breaks text into smaller pieces (tokens) that an AI model can process. Different models use different tokenizers, affecting how they count and understand text.
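Real tokenizers typically use learned schemes like byte-pair encoding (BPE); as a simplified sketch, here is a greedy longest-match tokenizer over a tiny hand-made vocabulary:

```python
def tokenize(text, vocab):
    """Toy greedy longest-match tokenizer (real models use BPE or similar)."""
    tokens = []
    i = 0
    while i < len(text):
        for j in range(len(text), i, -1):  # try the longest piece first
            piece = text[i:j]
            if piece in vocab:
                tokens.append(piece)
                i = j
                break
        else:
            tokens.append(text[i])  # unknown character becomes its own token
            i += 1
    return tokens

print(tokenize("chatbot", {"chat", "bot", "chatbot"}))  # ['chatbot']
print(tokenize("chatbots", {"chat", "bot"}))            # ['chat', 'bot', 's']
```

Note how the same string tokenizes differently depending on the vocabulary, which is why different models count tokens differently.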

Tool (Function Calling)

aka: Function, Function Calling, Tool Use

A capability that allows an AI to call external functions or APIs—like searching the web, querying databases, or running calculations.
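A typical flow: the model emits a structured call (often JSON) naming a tool and its arguments, and the application runs the matching function. A minimal dispatch sketch, where the tool name and arguments are hypothetical examples:

```python
import json

def get_weather(city: str) -> str:
    """Illustrative stub — a real tool would call a weather API."""
    return f"Sunny in {city}"

# Registry mapping tool names (as the model would emit them) to functions.
TOOLS = {"get_weather": get_weather}

def handle_tool_call(call_json: str) -> str:
    """Dispatch a model-emitted call like {"name": ..., "arguments": {...}}."""
    call = json.loads(call_json)
    fn = TOOLS[call["name"]]
    return fn(**call["arguments"])

print(handle_tool_call('{"name": "get_weather", "arguments": {"city": "Paris"}}'))
# → Sunny in Paris
```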

Top-p (Nucleus Sampling)

aka: Nucleus Sampling, Sampling Strategy

A parameter that controls randomness in AI text generation by sampling only from the smallest set of words whose probabilities add up to p. Lower values (0.1-0.5) make output more focused; higher values (0.9-1.0) make it more creative.
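The selection step can be sketched in a few lines: rank tokens by probability, keep the smallest set whose cumulative probability reaches p, and renormalize before sampling. This assumes `probs` already sums to 1:

```python
def top_p_filter(probs, p=0.9):
    """Nucleus sampling sketch: keep the smallest high-probability set
    whose cumulative probability reaches p, then renormalize it."""
    ranked = sorted(enumerate(probs), key=lambda kv: kv[1], reverse=True)
    kept, total = [], 0.0
    for idx, prob in ranked:
        kept.append((idx, prob))
        total += prob
        if total >= p:
            break  # nucleus found; everything less likely is discarded
    return {idx: prob / total for idx, prob in kept}

# With p=0.8, only the top two tokens (0.5 + 0.3 = 0.8) survive.
print(top_p_filter([0.5, 0.3, 0.15, 0.05], p=0.8))
```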

Training

aka: Model Training, AI Training

The process of feeding data to an AI system so it learns patterns and improves its predictions over time.

Training Data

aka: Training Set, Training Dataset, Training Corpus

The collection of examples an AI system learns from. The quality, quantity, and diversity of training data directly determines what the AI can and cannot do.

Transformer

aka: Transformer Architecture, Transformer Model

A neural network architecture that revolutionized AI by using attention mechanisms to understand relationships between words, enabling modern LLMs.
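The core of that attention mechanism is scaled dot-product attention: each position scores every other position by query-key similarity and takes a weighted mix of values. A minimal NumPy sketch with toy random embeddings:

```python
import numpy as np

def attention(Q, K, V):
    """Scaled dot-product attention, the building block of the Transformer."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # similarity between positions
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over each row
    return weights @ V  # each position = weighted mix of all values

# Toy example: a "sentence" of 3 tokens with 4-dimensional embeddings.
rng = np.random.default_rng(0)
Q = K = V = rng.normal(size=(3, 4))
print(attention(Q, K, V).shape)  # (3, 4)
```

Real Transformers run many such attention "heads" in parallel over learned projections of Q, K, and V; this sketch shows a single head.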
