Embeddings
Numerical vector representations of text that capture semantic meaning, allowing AI to find conceptually similar content.
Full Explanation
When you convert text to embeddings, semantically similar sentences end up as nearby vectors in high-dimensional space. 'The cat sat on the mat' and 'A feline rested on the rug' would have similar embedding vectors, while an unrelated sentence about stock prices would land far away. Embeddings power semantic search (finding relevant documents by meaning, not just keywords) and are the foundation of retrieval-augmented generation (RAG) systems and vector databases.
OpenAI's text-embedding-3-large model converts text into 3,072-dimensional vectors for semantic search.
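The "nearby vectors" idea above is usually measured with cosine similarity. Here is a minimal sketch using tiny hand-made 4-dimensional vectors as stand-ins for real embeddings (a real system would get thousands of dimensions from a model such as text-embedding-3-large; the values below are invented purely for illustration):

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: closer to 1.0 = more similar."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy "embeddings" -- invented numbers, not real model output.
cat_mat    = [0.8, 0.1, 0.6, 0.2]   # "The cat sat on the mat"
feline_rug = [0.7, 0.2, 0.5, 0.3]   # "A feline rested on the rug"
stock_news = [0.1, 0.9, 0.1, 0.8]   # "Stock markets fell sharply today"

# Semantically similar sentences score high; unrelated ones score low.
print(cosine_similarity(cat_mat, feline_rug))  # high (close to 1.0)
print(cosine_similarity(cat_mat, stock_news))  # much lower
```

A semantic search engine applies the same comparison at scale: embed the query, then rank stored document vectors by cosine similarity instead of matching keywords.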
Related Terms
Retrieval-Augmented Generation (RAG): A technique that enhances LLM responses by retrieving relevant documents from an external knowledge base before generating an answer.
Vector Database: A database optimized for storing and searching embedding vectors — the foundation of RAG and semantic search applications.
Semantic Search: Search that finds results based on meaning and intent, not just keyword matching.