Last updated: April 5, 2026 · Model Architecture · by Daniel Ashford
What Are Parameters?
The numerical weights inside an LLM that encode its learned knowledge.
Definition
Parameters are the numerical values within a neural network that are learned during training. In a language model, parameters encode the statistical patterns of language. The number of parameters is often used as a rough proxy for model capability.
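To make "numerical values learned during training" concrete, here is a minimal sketch that counts the parameters of a toy feed-forward network by hand. The layer sizes are illustrative, not drawn from any real model:

```python
# Hypothetical toy network: count the learned values (parameters)
# in a stack of dense layers. Layer sizes here are made up.
def linear_params(n_in, n_out):
    # A dense layer has an n_in x n_out weight matrix plus n_out biases.
    return n_in * n_out + n_out

# A toy 3-layer network: 512 -> 1024 -> 1024 -> 512
sizes = [512, 1024, 1024, 512]
total = sum(linear_params(a, b) for a, b in zip(sizes, sizes[1:]))
print(f"{total:,} parameters")  # about 2.1 million learned values
```

Real LLMs repeat this kind of counting across attention and feed-forward blocks in every transformer layer, which is how the totals climb into the billions.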
How It Works
GPT-3 had 175 billion parameters. Modern frontier models have hundreds of billions to over a trillion parameters. More parameters generally allow more knowledge storage and more sophisticated reasoning, but training data quality and architecture choices matter as much as raw parameter count.
Example
Llama 3.1 405B has 405 billion parameters. A 7B model can run on a single consumer GPU, while a 405B model requires a cluster of high-end GPUs.
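The gap between "single consumer GPU" and "cluster of high-end GPUs" follows directly from weight memory. A rough back-of-the-envelope sketch (illustrative only; it ignores activations, the KV cache, and optimizer state, all of which add more):

```python
# Rough estimate: memory needed just to hold model weights.
def weight_memory_gb(num_params, bytes_per_param):
    return num_params * bytes_per_param / 1e9

for name, params in [("7B", 7e9), ("405B", 405e9)]:
    fp16 = weight_memory_gb(params, 2)    # 16-bit floats: 2 bytes each
    int4 = weight_memory_gb(params, 0.5)  # 4-bit quantized: half a byte
    print(f"{name}: ~{fp16:.0f} GB in FP16, ~{int4:.0f} GB at 4-bit")
```

At 16-bit precision a 7B model needs roughly 14 GB, which fits on one consumer GPU, while a 405B model needs roughly 810 GB, well beyond any single card.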
See How Models Compare
Understanding parameters is important when choosing the right AI model. See how 12 models compare on our leaderboard.