
Top-p (Nucleus Sampling)

Also known as: Nucleus Sampling, Top-p Sampling

In one sentence

A parameter that controls randomness in AI text generation by choosing from the smallest set of words whose combined probability reaches a threshold p. Lower values make output more focused; higher values make it more creative.

Explain like I'm 12

Imagine the AI has a ranked list of possible next words. Top-p says 'only pick from the most likely words that together add up to a 90% chance' (if p is 0.9). This stops the AI from choosing really weird or random words.
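The idea above can be sketched in a few lines of Python. This is a toy illustration, not a real model: the candidate words and their probabilities are made up, and the function names are our own.

```python
import random

def top_p_filter(probs, p=0.9):
    """Keep the smallest set of words whose combined probability reaches p,
    then renormalise so the survivors sum to 1."""
    ranked = sorted(probs.items(), key=lambda kv: kv[1], reverse=True)
    kept, total = [], 0.0
    for word, prob in ranked:
        kept.append((word, prob))
        total += prob
        if total >= p:
            break  # threshold reached; discard everything less likely
    return {w: pr / total for w, pr in kept}

def sample_next_word(probs, p=0.9):
    """Draw one word from the filtered, renormalised distribution."""
    filtered = top_p_filter(probs, p)
    words = list(filtered)
    weights = [filtered[w] for w in words]
    return random.choices(words, weights=weights)[0]

# Made-up distribution over the next word:
next_word = {"the": 0.5, "a": 0.3, "cat": 0.15, "zxqv": 0.05}
print(top_p_filter(next_word, p=0.9))  # "zxqv" is cut from the shortlist
```

With p=0.9, the three most likely words already cover 95% of the probability, so the unlikely "zxqv" never gets a chance to be picked.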

In context

Top-p works alongside temperature to control how creative or predictable AI output is; providers generally recommend adjusting one of the two rather than both at once. For factual tasks like summarisation or data extraction, setting top-p to 0.1-0.3 keeps responses focused and consistent. For creative writing or brainstorming, a top-p of 0.8-0.95 allows more variety. Most API providers (OpenAI, Anthropic, Google) expose top-p as a parameter. A related setting, top-k, works similarly but picks from a fixed number of top candidates (e.g., top-k=40) rather than a probability threshold.
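The practical difference between the two cutoffs is that top-k keeps a fixed number of candidates while top-p adapts to the shape of the distribution. A small sketch makes this concrete; the distributions and function names are invented for illustration.

```python
def shortlist_top_k(probs, k):
    """Keep exactly the k most likely words, whatever their probabilities."""
    return sorted(probs, key=probs.get, reverse=True)[:k]

def shortlist_top_p(probs, p):
    """Keep the smallest set of words whose probabilities sum to at least p."""
    kept, total = [], 0.0
    for word in sorted(probs, key=probs.get, reverse=True):
        kept.append(word)
        total += probs[word]
        if total >= p:
            break
    return kept

# A confident (peaked) distribution vs an uncertain (flat) one:
peaked = {"yes": 0.97, "no": 0.01, "maybe": 0.01, "nah": 0.01}
flat = {"red": 0.26, "blue": 0.25, "green": 0.25, "gold": 0.24}

print(len(shortlist_top_p(peaked, 0.9)))  # 1 word already covers 90%
print(len(shortlist_top_p(flat, 0.9)))    # needs 4 words to reach 90%
print(len(shortlist_top_k(flat, 2)))      # always exactly k, here 2
```

When the model is confident, top-p narrows to a single obvious word; when many options are nearly tied, it widens automatically. Top-k keeps the same count in both cases, which can over-restrict flat distributions and under-restrict peaked ones.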
