
Embedding Models

Embedding models are machine learning models that map data (such as words, sentences, images, or other entities) to vector embeddings: dense, numerical representations in a high-dimensional space. These models learn to capture semantic relationships and patterns within the data, so that similar entities are represented by vectors that lie close to each other in the embedding space. Common embedding models include Word2Vec, BERT, and image embedding models used in tasks such as similarity search and classification.
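
The sketch below illustrates the idea of embedding text and comparing vectors by cosine similarity. It assumes the sentence-transformers library and the "all-MiniLM-L6-v2" model, which are illustrative choices rather than part of this entry; any embedding model could be substituted.

```python
# Minimal sketch: embed sentences and compare them in vector space.
# Assumes the sentence-transformers library is installed (pip install sentence-transformers).
import numpy as np
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("all-MiniLM-L6-v2")

sentences = [
    "How do I reset my password?",
    "I forgot my login credentials.",
    "What is the weather today?",
]

# Each sentence is mapped to a dense vector (384 dimensions for this model).
embeddings = model.encode(sentences)

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Higher values mean the vectors are closer in the embedding space."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Semantically related sentences score higher than unrelated ones.
print(cosine_similarity(embeddings[0], embeddings[1]))  # related pair: higher score
print(cosine_similarity(embeddings[0], embeddings[2]))  # unrelated pair: lower score
```

The same pattern underpins similarity search: store the embeddings, then rank stored items by their similarity to a query embedding.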

Related terms: vector embeddings; similarity search
