Cosine similarity is the most popular [[similarity measure]] for [[word embeddings]]. It is defined as the [[dot product]] of two vectors divided by the product of their lengths (to avoid bias towards longer vectors): the more the vectors' values agree in each dimension $i$, the larger the dot product. As it turns out, this normalized dot product is equivalent to the cosine of the angle between the two vectors.
$\text{cosine}(v,w) = \frac{v \cdot w}{|v||w|} = \frac{\sum_{i=1}^{N} v_i w_i}{\sqrt{\sum_{i=1}^{N} v_i^2} \sqrt{\sum_{i=1}^{N} w_i^2}}$
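As a quick sanity check, here is a minimal NumPy sketch of the formula (the vectors and their values are made up for illustration):

```python
import numpy as np

def cosine(v: np.ndarray, w: np.ndarray) -> float:
    """Dot product of v and w, normalized by the product of their lengths."""
    return np.dot(v, w) / (np.linalg.norm(v) * np.linalg.norm(w))

# Toy 3-dimensional "embeddings" (illustrative values only).
v = np.array([1.0, 2.0, 3.0])
w = np.array([2.0, 4.0, 6.0])  # same direction as v, twice the length

print(cosine(v, w))  # 1.0 -- same direction, so the length bias is removed
```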
For unit vectors, cosine similarity is equal to the [[dot product]] of the two vectors, since both lengths are $1$ and the denominator drops out.
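A short sketch of that special case, again with made-up vectors: after normalizing to unit length, the plain dot product matches the full formula.

```python
import numpy as np

v = np.array([1.0, 2.0, 3.0])
w = np.array([4.0, 5.0, 6.0])

# Normalize to unit length: |v_hat| = |w_hat| = 1.
v_hat = v / np.linalg.norm(v)
w_hat = w / np.linalg.norm(w)

# With both lengths equal to 1, the denominator of the cosine
# formula is 1, so the dot product alone gives the cosine.
full = np.dot(v, w) / (np.linalg.norm(v) * np.linalg.norm(w))
unit = np.dot(v_hat, w_hat)
print(np.isclose(full, unit))  # True
```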