Parameters
Also known as: Model Parameters, Weights
In one sentence
The internal numerical values within an AI model that are adjusted during training to capture patterns in data. More parameters generally mean a more capable model, but also higher costs and slower inference.
Explain like I'm 12
Think of parameters like millions of tiny knobs on a mixing board. During training, the AI tweaks each knob bit by bit until the output sounds right. More knobs mean finer control—but also a bigger, more expensive machine.
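The knob-tweaking above can be sketched in a few lines of plain Python. This is a toy model with just two parameters (a weight and a bias) fit by gradient descent; real models work the same way, just with billions of knobs instead of two. All names here are illustrative, not from any library.

```python
# Toy "training loop": two parameters, w and b, are nudged until
# the model's output matches the pattern in the data (y = 2x + 1).

data = [(x, 2 * x + 1) for x in range(10)]  # the pattern to learn

w, b = 0.0, 0.0   # parameters start at arbitrary values
lr = 0.01         # learning rate: how far each nudge goes

for _ in range(2000):            # repeat many times (training)
    for x, y in data:
        pred = w * x + b         # model's current guess
        err = pred - y           # how wrong the guess is
        # Nudge each parameter in the direction that reduces the error.
        w -= lr * err * x
        b -= lr * err

print(round(w, 2), round(b, 2))  # w, b end up near 2 and 1
```

After training, the parameters have "captured the pattern" in the data: the model has learned w ≈ 2 and b ≈ 1 without ever being told those values directly.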
In context
Parameter count is one of the most common ways to compare AI models. GPT-3 has 175 billion parameters, while LLaMA 2 ranges from 7 billion to 70 billion parameters. Larger models generally handle more complex tasks but require more computing power and memory to run. This is why companies offer different model sizes at different price points: a 7B-parameter model can run on a single GPU, while a 70B model may need a cluster. The trend toward smaller, more efficient models shows that smart training can sometimes compensate for fewer parameters.
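The memory math behind those size tiers is straightforward to estimate. The sketch below is a simplified back-of-the-envelope calculation (the function name is ours): it counts only the bytes needed to store the weights themselves, ignoring activations, KV cache, and framework overhead, which add more on top.

```python
def weight_memory_gb(num_params: float, bytes_per_param: int = 2) -> float:
    """Rough memory needed just to hold a model's weights.

    bytes_per_param: 4 for fp32, 2 for fp16/bf16, 1 for int8.
    Ignores activations, KV cache, and runtime overhead.
    """
    return num_params * bytes_per_param / 1024**3

# A 7B model in fp16 fits comfortably on one 24 GB GPU...
print(f"{weight_memory_gb(7e9):.1f} GB")    # ~13.0 GB
# ...while a 70B model in fp16 exceeds any single consumer GPU.
print(f"{weight_memory_gb(70e9):.1f} GB")   # ~130.4 GB
```

This is also why the compression techniques linked below matter: dropping from fp16 to int8 halves the weight footprint at a modest accuracy cost.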
Related Guides
Learn more about Parameters in these guides:
Structured Output and Function Calling: Getting Reliable JSON from AI (Intermediate, 15 min read)
Learn how to get reliable, parseable JSON output from AI models using structured output, function calling, and JSON schema. Essential for production AI applications.

Temperature and Sampling: Controlling AI Creativity (Intermediate, 6 min read)
Temperature, top-p, and other sampling parameters control how creative or deterministic AI outputs are. Learn how to tune them.

Model Compression: Smaller, Faster AI (Advanced, 9 min read)
Compress AI models with quantization, pruning, and distillation. Deploy faster, cheaper models without sacrificing much accuracy.