Last updated: April 5, 2026 · Model Architecture · by Daniel Ashford
What Are Embeddings?
Numerical representations of text that capture semantic meaning — used in search and RAG systems.
Definition
Embeddings are dense numerical vectors that represent text in a way that captures semantic meaning. Texts with similar meanings have embeddings that are close together in vector space, while dissimilar texts are far apart.
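The "close together in vector space" idea is usually measured with cosine similarity. Here is a minimal sketch using hand-made toy vectors (real embeddings have hundreds of dimensions and come from a model, not from hand-tuning):

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: 1.0 means same direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy 4-dimensional vectors, invented for illustration only.
cat = [0.9, 0.8, 0.1, 0.0]
kitten = [0.85, 0.75, 0.15, 0.05]
airplane = [0.0, 0.1, 0.9, 0.8]

print(cosine_similarity(cat, kitten))    # near 1.0: similar meanings
print(cosine_similarity(cat, airplane))  # much lower: dissimilar meanings
```

Cosine similarity is the standard choice because it compares direction rather than magnitude, so vectors of different lengths can still be judged "about the same topic."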
How It Works
Modern embedding models produce vectors with 768 to 3,072 dimensions. They are the foundation of semantic search and RAG pipelines. Major embedding models include OpenAI text-embedding-3, Cohere embed-v4, and open-source models like BGE and E5.
Example
If you embed "How do I reset my password?" and "I forgot my login credentials," these would have very similar embedding vectors despite using different words.
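This matching behavior is what semantic search exploits: embed every document once, embed the query, and return the document whose vector is closest. A minimal sketch, using hypothetical pre-computed vectors (a real system would obtain them from an embedding model such as the ones named above):

```python
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) *
                  math.sqrt(sum(y * y for y in b)))

# Hypothetical embeddings for illustration; values are invented, not
# produced by any real model.
docs = {
    "I forgot my login credentials": [0.80, 0.60, 0.10],
    "Best pizza places downtown":    [0.10, 0.20, 0.90],
}
# Invented embedding of the query "How do I reset my password?"
query_vec = [0.75, 0.65, 0.05]

# Nearest neighbor by cosine similarity.
best = max(docs, key=lambda d: cosine(query_vec, docs[d]))
print(best)  # the login-credentials document ranks highest
```

Note that the query and its best match share no keywords; the match comes entirely from the vectors being close, which is the property that makes embeddings useful for RAG retrieval.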