Last updated: April 5, 2026 · Prompting & Usage · by Daniel Ashford

What is Temperature?

QUICK ANSWER

A setting that controls how creative or deterministic responses are.

Definition

Temperature controls the randomness of output. Lower temperatures (0.0-0.3) make the model more deterministic. Higher temperatures (0.7-1.0+) increase randomness and creativity.

How It Works

At temperature 0, the model always picks the single most likely token (greedy decoding), which is ideal for factual tasks and code. At temperature 1.0, it samples from the model's full, unscaled probability distribution, which suits creative writing and brainstorming. Defaults vary by provider; values between 0.7 and 1.0 are common.
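The mechanics above can be sketched in a few lines: temperature divides the raw logits before the softmax, so low values sharpen the distribution toward the top token and high values flatten it. This is a minimal illustration, not any provider's actual decoding loop.

```python
import math
import random

def sample_with_temperature(logits, temperature):
    """Sample a token index from logits scaled by temperature.

    temperature -> 0 approaches greedy (argmax) decoding;
    temperature = 1 samples from the unscaled softmax distribution.
    """
    if temperature == 0:
        # Greedy: always return the index of the most likely token.
        return max(range(len(logits)), key=lambda i: logits[i])
    # Divide logits by temperature, then apply a numerically
    # stable softmax (subtracting the max before exponentiating).
    scaled = [l / temperature for l in logits]
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    # Draw one index according to the resulting probabilities.
    r = random.random()
    cumulative = 0.0
    for i, p in enumerate(probs):
        cumulative += p
        if r < cumulative:
            return i
    return len(probs) - 1
```

At temperature 0 this always returns the argmax; as temperature rises, lower-scoring tokens are drawn more and more often.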

Example

Asking "What is the capital of France?" at temperature 0 reliably produces "Paris." At temperature 1.5, lower-probability tokens get a boost, so the model may wander into rephrasings or creative tangents.

Related Terms

Prompt
The text input you send to an LLM to get a response.
Inference
The process of an LLM generating a response to your input.
