Parameters

Also known as: Model Parameters, Weights

In one sentence

The numerical values inside an AI model that are adjusted during training to capture patterns in the data. More parameters generally mean a more capable model, but also higher costs and slower inference.

Explain like I'm 12

Think of parameters like millions of tiny knobs on a mixing board. During training, the AI tweaks each knob bit by bit until the output sounds right. More knobs mean finer control—but also a bigger, more expensive machine.
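To make the "knobs" concrete, here is a minimal sketch of how parameters are counted in a simple fully connected network. The layer sizes are made up for illustration and don't correspond to any particular model:

```python
# Every dense (fully connected) layer has one weight per
# input-output pair, plus one bias per output. Each of those
# numbers is a "knob" the training process adjusts.

def dense_layer_params(inputs: int, outputs: int) -> int:
    """Number of trainable parameters in one dense layer."""
    return inputs * outputs + outputs  # weights + biases

# A toy 3-layer network: 784 -> 128 -> 64 -> 10.
layers = [(784, 128), (128, 64), (64, 10)]
total = sum(dense_layer_params(i, o) for i, o in layers)
print(total)  # 109386 parameters in this tiny network
```

Even this toy network has over a hundred thousand parameters; large language models stack far wider layers, which is how counts reach the billions.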

In context

Parameter count is one of the most common ways to compare AI models. GPT-3 has 175 billion parameters, while LLaMA 2 ranges from 7 billion to 70 billion parameters. Larger models generally handle more complex tasks but require more computing power and memory to run. This is why companies offer different model sizes at different price points—a 7B parameter model can run on a single GPU, while a 70B model may need a cluster. The trend toward smaller, more efficient models shows that smart training can sometimes compensate for fewer parameters.
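The memory cost of a parameter count can be estimated with back-of-the-envelope arithmetic. This sketch only accounts for the weights themselves (real deployments also need memory for activations, the KV cache, and framework overhead), and the precision names are the commonly used ones, not tied to any specific library:

```python
# Approximate memory to hold just the model weights,
# at common numeric precisions (bytes per parameter).
BYTES_PER_PARAM = {"fp32": 4, "fp16": 2, "int8": 1, "int4": 0.5}

def weight_memory_gib(num_params: float, precision: str) -> float:
    """Approximate GiB needed to store the weights alone."""
    return num_params * BYTES_PER_PARAM[precision] / 1024**3

for size_b in (7, 70, 175):
    params = size_b * 1e9
    print(f"{size_b}B params: "
          f"~{weight_memory_gib(params, 'fp16'):.0f} GiB at fp16, "
          f"~{weight_memory_gib(params, 'int4'):.0f} GiB at int4")
```

This is why a 7B model (roughly 13 GiB of fp16 weights) fits on a single consumer-class GPU, while a 70B or 175B model needs multiple GPUs or aggressive quantization.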
