Embedding
Also known as: Vector, Vector Representation, Embeddings
In one sentence
A list of numbers that represents the meaning of text, images, or other data. Similar meanings produce similar numbers, so computers can measure how 'close' two concepts are.
Explain like I'm 12
Imagine turning words into coordinates on a map. Words with similar meanings sit close together, so 'cat' is near 'kitten' but far from 'database.' The computer finds related things by checking which points are nearby on the map.
In context
Embeddings are the backbone of modern AI search and retrieval systems. When you use semantic search (like finding documents by meaning rather than exact keywords), embeddings power it. In RAG systems, your documents are converted into embeddings and stored in a vector database. When you ask a question, your query is also converted into an embedding, and the system finds documents with the closest matching embeddings. OpenAI, Google, and Cohere all offer embedding APIs that convert text into numerical vectors.
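The "closest matching embeddings" idea can be sketched with cosine similarity, a standard closeness measure for vectors. The 4-dimensional vectors below are made-up toy values for illustration; real embedding APIs return vectors with hundreds or thousands of dimensions.

```python
import math

def cosine_similarity(a, b):
    """How 'close' two embedding vectors are (1.0 = pointing the same way)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy embeddings: similar meanings get similar numbers.
embeddings = {
    "cat":      [0.90, 0.80, 0.10, 0.00],
    "kitten":   [0.85, 0.75, 0.20, 0.05],
    "database": [0.00, 0.10, 0.90, 0.80],
}

# Rank the other terms by similarity to the query vector,
# the same comparison a vector database performs at scale.
query = embeddings["cat"]
ranked = sorted(
    (term for term in embeddings if term != "cat"),
    key=lambda term: cosine_similarity(query, embeddings[term]),
    reverse=True,
)
print(ranked)  # 'kitten' ranks above 'database'
```

A vector database does essentially this ranking, but over millions of stored vectors using approximate nearest-neighbor indexes instead of a full sort.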
See also
Related Guides
Learn more about Embedding in these guides:
Embeddings: Turning Words into Math (Intermediate, 9 min read)
Embeddings convert text into numbers that capture meaning. Essential for search, recommendations, and RAG systems.

Vector Database Examples: Real-World Use Cases and Code (Intermediate, 9 min read)
Practical examples of vector databases in action: semantic search, chatbot memory, recommendation systems, and more with code snippets.

Embeddings & RAG Explained (Plain English) (Intermediate, 11 min read)
How AI tools search and retrieve information from documents. Understand embeddings and Retrieval-Augmented Generation without the math.