Last updated: April 5, 2026 · Model Architecture · by Daniel Ashford

What Are Parameters?

QUICK ANSWER

The numerical weights inside an LLM that encode its learned knowledge.

Definition

Parameters are the numerical values within a neural network that are learned during training. In a language model, parameters encode the statistical patterns of language. The number of parameters is often used as a rough proxy for model capability.

How It Works

GPT-3 had 175 billion parameters. Modern frontier models have hundreds of billions to over a trillion. More parameters generally allow more knowledge storage and sophisticated reasoning, but training data quality and architecture choices matter as much as raw parameter count.
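To see where those counts come from, here is a back-of-the-envelope tally for a toy transformer-style model. The sizes (hidden dimension, layer count, vocabulary) are illustrative assumptions, not any real model's configuration, and biases and normalization weights are omitted for simplicity:

```python
# Rough parameter count for a hypothetical transformer (illustrative sizes only)
d_model = 512      # hidden size (assumed)
d_ff = 2048        # feed-forward inner size (assumed)
vocab = 32000      # vocabulary size (assumed)
n_layers = 32      # number of transformer blocks (assumed)

attn = 4 * d_model * d_model   # Q, K, V, and output projection matrices
ffn = 2 * d_model * d_ff       # up- and down-projection matrices
block = attn + ffn             # one transformer block (biases/norms omitted)

embeddings = vocab * d_model   # token embedding table
total = n_layers * block + embeddings

print(f"per block: {block:,}")   # 3,145,728
print(f"total:     {total:,}")   # 117,047,296 (~117M parameters)
```

Scaling the same arithmetic up (larger `d_model`, more layers) is how models reach billions of parameters: the count grows roughly with the square of the hidden size times the layer count.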

Example

Llama 3.1 405B has 405 billion parameters. A 7B model can run on a single consumer GPU, while a 405B model requires a cluster of high-end GPUs.
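The GPU requirement follows from simple arithmetic: each parameter occupies a fixed number of bytes. A minimal sketch, assuming 2 bytes per parameter for fp16 weights and ignoring activations and inference overhead:

```python
def model_memory_gb(n_params, bytes_per_param=2.0):
    """Approximate memory for the weights alone (fp16 = 2 bytes/param)."""
    return n_params * bytes_per_param / 1024**3

# Weights-only footprints (real usage is higher due to activations, KV cache)
print(f"7B   fp16: {model_memory_gb(7e9):.1f} GB")         # ~13 GB: fits a 24 GB consumer GPU
print(f"405B fp16: {model_memory_gb(405e9):.1f} GB")       # ~754 GB: needs a multi-GPU cluster
print(f"7B   int4: {model_memory_gb(7e9, 0.5):.1f} GB")    # ~3.3 GB: quantization shrinks this further
```

This is why quantization (see Related Terms) matters in practice: halving or quartering the bytes per parameter directly halves or quarters the memory needed to hold the model.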

Related Terms

Transformer
The neural network architecture behind all modern LLMs.
Large Language Model (LLM)
An AI system trained on massive text data to understand and generate human language.
Quantization
Compressing an LLM to use less memory by reducing numerical precision.
Fine-Tuning
Customizing a pre-trained LLM on your specific data to improve performance for your use case.

See How Models Compare

Parameter count shapes a model's capability, cost, and hardware requirements, so it matters when choosing an AI model. See how 12 models compare on our leaderboard.

Daniel Ashford
Founder & Lead Evaluator · 200+ models evaluated