Comparing Vectors

In this book, we'll use the terms distance, score, and similarity. For any $n$-dimensional vectors $u$ and $v$:
  • A distance measure such as Euclidean distance requires smaller values to indicate "more similar". We assume for any distance measure $d$ that $d(u, v) = 0$ if and only if $u = v$, that $d(u, v) \ge 0$, and that $d(u, v) = d(v, u)$.

  • A similarity measure such as cosine similarity requires larger values to indicate "more similar", but no upper bound is required. Often, similarity measures can be converted to distance measures, and vice versa. For example, since cosine similarity $s_{\cos}$ is bounded between $-1$ and $1$, we can define cosine distance as $d_{\cos}(u, v) = 1 - s_{\cos}(u, v)$ (see the sketch after this list).

  • A score is an arbitrary function where a larger score indicates a "better" match. All similarity measures are scores, and all negated distance measures are scores. We use this term to avoid the misunderstandings that can arise when comparing, say, Euclidean distance (smaller is better) with cosine similarity (larger is better).
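
To make these conversions concrete, here is a minimal sketch in Python (the function names are our own, for illustration only):

```python
import numpy as np

def cosine_similarity(u, v):
    # A similarity: larger means "more similar"; bounded in [-1, 1].
    return np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))

def cosine_distance(u, v):
    # The corresponding distance: smaller means "more similar".
    return 1.0 - cosine_similarity(u, v)

def score(u, v):
    # Any negated distance is a valid score: larger means "better".
    return -cosine_distance(u, v)

u, v = np.array([1.0, 0.0]), np.array([1.0, 1.0])
print(cosine_similarity(u, v))  # ~0.707
print(cosine_distance(u, v))    # ~0.293
print(score(u, v))              # ~-0.293
```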

The Euclidean distance

In machine learning, we frequently must compare how far one vector is from another. There are many ways of doing this. Among the simplest (and most common) is the Euclidean distance. This is the "straight line" distance we're familiar with in the real world.

Its definition is based on the Pythagorean Theorem. Given a right-angled triangle with side lengths $a$ and $b$ and hypotenuse length $c$, then $a^2 + b^2 = c^2$. We can use this to compute the distance ("hypotenuse") between two two-dimensional points.

Suppose we want to know the "straight-line" distance between two points $(x_1, y_1)$ and $(x_2, y_2)$. We can connect the points with a right triangle whose legs have lengths $|x_2 - x_1|$ and $|y_2 - y_1|$. Then, by the Pythagorean Theorem, the "straight-line" distance between them (the hypotenuse) is $\sqrt{(x_2 - x_1)^2 + (y_2 - y_1)^2}$.

Applying the Pythagorean Theorem a second time (first to the diagonal in the $xy$-plane, then to that diagonal and the difference in $z$), we can derive the formula for three-dimensional vectors $u$ and $v$:

$$d(u, v) = \sqrt{(u_1 - v_1)^2 + (u_2 - v_2)^2 + (u_3 - v_3)^2}$$

Noticing a pattern, we can define the general $n$-dimensional Euclidean distance as follows:

Definition. Given an $n$-dimensional vector $u$, the magnitude of $u$ is

$$\|u\| = \sqrt{u_1^2 + u_2^2 + \cdots + u_n^2}$$

Definition. Given $n$-dimensional vectors $u$ and $v$, the Euclidean distance is

$$d(u, v) = \sqrt{(u_1 - v_1)^2 + (u_2 - v_2)^2 + \cdots + (u_n - v_n)^2} = \|u - v\|$$
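
As a sanity check, here is a direct NumPy translation of these two definitions (the example vectors are our own):

```python
import numpy as np

def magnitude(u):
    # ||u|| = sqrt(u_1^2 + u_2^2 + ... + u_n^2)
    return np.sqrt(np.sum(u ** 2))

def euclidean_distance(u, v):
    # d(u, v) = sqrt((u_1 - v_1)^2 + ... + (u_n - v_n)^2) = ||u - v||
    return magnitude(u - v)

u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, 6.0, 3.0])
print(euclidean_distance(u, v))  # 5.0
print(np.linalg.norm(u - v))     # 5.0, via NumPy's built-in norm
```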

We will see that some distance measures (including Euclidean distance) are referred to as metrics:

Definition. A distance metric satisfies the following properties. For any $n$-dimensional vectors $u$, $v$, and $w$:

  • $d(u, v) = 0$ if and only if $u = v$;
  • Positivity. $d(u, v) \ge 0$;
  • Symmetry. $d(u, v) = d(v, u)$;
  • Triangle inequality. $d(u, w) \le d(u, v) + d(v, w)$.

The triangle inequality ensures that the shortest path between two points is the direct one: detouring through a third point $v$ can never be shorter than going straight from $u$ to $w$.
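
These axioms are easy to spot-check numerically for the Euclidean distance. The sketch below (random vectors, not a proof) asserts each property:

```python
import numpy as np

rng = np.random.default_rng(0)

def d(u, v):
    return np.linalg.norm(u - v)

for _ in range(1000):
    u, v, w = rng.normal(size=(3, 4))  # three random 4-dimensional vectors
    assert d(u, u) == 0                           # d(u, v) = 0 iff u = v
    assert d(u, v) >= 0                           # positivity
    assert np.isclose(d(u, v), d(v, u))           # symmetry
    assert d(u, w) <= d(u, v) + d(v, w) + 1e-12   # triangle inequality
print("All checks passed.")
```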

Dot product

Definition. Given two $n$-dimensional vectors $u$ and $v$, the dot product is the sum of their elementwise products: $u \cdot v = u_1 v_1 + u_2 v_2 + \cdots + u_n v_n$.

In machine learning, the dot product is often used as a similarity measure. However, as we'll see below, it is somewhat imperfect: it is affected both by the angle between the vectors and by their magnitudes.

Definition. Given an $n$-dimensional vector $u$, the unit vector of $u$ is $\hat{u} = u / \|u\|$. This vector points in the direction of $u$ but has magnitude $1$, placing it on the unit circle (or, in higher dimensions, the unit sphere).

In many textbooks, the dot product is defined directly in terms of the angle between two vectors. Where $u$ and $v$ are vectors, $\|u\|$ and $\|v\|$ are their magnitudes, and $\theta$ is the angle between the vectors:

$$u \cdot v = \|u\|\,\|v\|\cos\theta$$

Hence, if the vectors are unit vectors, the dot product by itself is the cosine of the angle between them.
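
We can check this identity numerically. The sketch below builds two vectors with a known angle of 45 degrees between them (the magnitudes 2 and 3 are arbitrary):

```python
import numpy as np

theta = np.pi / 4  # 45 degrees
u = np.array([2.0, 0.0])                            # magnitude 2, along the x-axis
v = 3.0 * np.array([np.cos(theta), np.sin(theta)])  # magnitude 3, at angle theta

lhs = np.dot(u, v)
rhs = np.linalg.norm(u) * np.linalg.norm(v) * np.cos(theta)
print(lhs, rhs)  # both ~4.243

# For unit vectors, the dot product is the cosine of the angle itself:
u_hat = u / np.linalg.norm(u)
v_hat = v / np.linalg.norm(v)
print(np.dot(u_hat, v_hat), np.cos(theta))  # both ~0.707
```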

Cosine similarity

The cosine similarity is the cosine of the angle between two vectors. It is useful when direction is more important than magnitude. For example, someone who always rates movies $5$ is similar to someone who always rates them $4$ (perhaps the origin represents all movies rated $3$) -- their ratings convey no information about the movies!

The cosine similarity $s$ lies in $[-1, 1]$, with a larger score indicating more similar vectors (see the numerical check after this list):

  • $s = 1$. Same direction, since $\cos(0°) = 1$; e.g. $u = (1, 0)$, $v = (2, 0)$.
  • $s = -1$. Opposite direction, since $\cos(180°) = -1$; e.g. $u = (1, 0)$, $v = (-1, 0)$.
  • $s = 0$. Orthogonal (i.e. at a 90-degree angle), since $\cos(90°) = 0$; e.g. $u = (1, 0)$, $v = (0, 1)$.
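
A quick numerical check of the three cases, using the example vectors above:

```python
import numpy as np

def cos_sim(u, v):
    return np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))

print(cos_sim(np.array([1.0, 0.0]), np.array([2.0, 0.0])))   #  1.0 (same direction)
print(cos_sim(np.array([1.0, 0.0]), np.array([-1.0, 0.0])))  # -1.0 (opposite direction)
print(cos_sim(np.array([1.0, 0.0]), np.array([0.0, 1.0])))   #  0.0 (orthogonal)
```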

This measure can be efficiently computed, since it is based on the dot product:

$$s_{\cos}(u, v) = \frac{u \cdot v}{\|u\|\,\|v\|} = \hat{u} \cdot \hat{v}$$

Hence, the dot product by itself is effectively an "un-normalized" cosine similarity. This means it can be affected by magnitude!
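
A small sketch of the difference: scaling one of the vectors changes the dot product but not the cosine similarity:

```python
import numpy as np

def cos_sim(u, v):
    return np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))

u = np.array([1.0, 1.0])
v = np.array([1.0, 0.0])
print(np.dot(u, v), cos_sim(u, v))            #  1.0, ~0.707
print(np.dot(u, 10 * v), cos_sim(u, 10 * v))  # 10.0, ~0.707
```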

This is especially apparent in modern LLMs when token embeddings are unembedded directly, i.e., scored against every token's embedding via dot products. Using cosine similarity, each token will match best with itself. However, merely using the dot product will cause many tokens to match other tokens with larger magnitudes.
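
The following toy simulation illustrates the effect. Random vectors with varying magnitudes stand in for token embeddings (no actual LLM is involved, and the sizes are arbitrary): under cosine similarity every "token" matches itself best, while under the raw dot product many tokens instead match a larger-magnitude token:

```python
import numpy as np

rng = np.random.default_rng(1)
n_tokens, dim = 100, 64
E = rng.normal(size=(n_tokens, dim))            # stand-in embedding matrix
E *= rng.uniform(0.5, 3.0, size=(n_tokens, 1))  # give tokens varying magnitudes

dot_scores = E @ E.T
norms = np.linalg.norm(E, axis=1)
cos_scores = dot_scores / np.outer(norms, norms)

ids = np.arange(n_tokens)
print("dot:    ", np.mean(np.argmax(dot_scores, axis=1) == ids))  # < 1.0
print("cosine: ", np.mean(np.argmax(cos_scores, axis=1) == ids))  # = 1.0
```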

Exercises

  1. By applying the Pythagorean Theorem twice, derive the three-dimensional Euclidean distance formula. Prove that for any three-dimensional vectors $u$ and $v$, $d(u, v) = \sqrt{(u_1 - v_1)^2 + (u_2 - v_2)^2 + (u_3 - v_3)^2}$.