Vector embeddings map tokens (or words, sentences, chunks, documents, graph nodes, and concepts) into a high-dimensional vector space that encodes their semantic meaning, such that semantically similar items end up near each other.
Options include:
- Word2Vec (2013)
- BERT (2018)
- OpenAI embeddings, e.g. text-embedding-3 (2024)
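
A minimal sketch of what "near each other" means in practice, assuming the `sentence-transformers` package and the `all-MiniLM-L6-v2` model (neither is named in this note): embed a few sentences, then compare them with cosine similarity.

```python
# Toy illustration: semantically similar sentences get nearby embedding vectors.
# Assumes sentence-transformers is installed and can download "all-MiniLM-L6-v2".
import numpy as np
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("all-MiniLM-L6-v2")  # produces 384-dimensional vectors

sentences = [
    "The cat sat on the mat.",
    "A kitten was resting on the rug.",
    "Quarterly revenue exceeded expectations.",
]
embeddings = model.encode(sentences)  # shape: (3, 384)

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine of the angle between two vectors; closer to 1 means more similar."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

print(cosine_similarity(embeddings[0], embeddings[1]))  # high: both about a cat on a surface
print(cosine_similarity(embeddings[0], embeddings[2]))  # low: unrelated topics
```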
[[vector database]]
[[Chroma]]
[[FAISS]]
[[t-stochastic neighborhood embeddings|t-distributed stochastic neighbor embedding (t-SNE)]]