The Cauchy-Schwarz Inequality

This page is about the Cauchy-Schwarz inequality. It is assumed that the reader is familiar with vector norms and dot products from linear algebra.


The History Of The Cauchy-Schwarz Inequality

The Cauchy-Schwarz inequality was named in honour of the French mathematician Augustin Cauchy and the German mathematician Hermann Schwarz. In this context we are dealing with vectors, but there are other variations of this inequality.

This inequality is sometimes called Cauchy’s inequality, the Schwarz inequality, or the Bunyakovsky inequality. Bunyakovsky was a Russian mathematician who published his version of this inequality about 25 years before Schwarz.


Review Of The Dot Product

The dot product, or Euclidean inner product, is an algebraic operation that takes two vectors of the same length and returns a single scalar.

Given the vectors \textbf{u} = (u_1, u_2, \dots, u_{n}) and \textbf{v} = (v_1, v_2, \dots, v_{n}) in \mathbb{R}^{n},

the dot product of \textbf{u} and \textbf{v}, denoted by \textbf{u} \cdot \textbf{v}, is:

    \[\textbf{u} \cdot \textbf{v} = \sum_{j=1}^{n} u_j v_j = u_1 v_1 + u_2v_2 + \dots + u_n v_n\]
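To make this definition concrete, here is a minimal Python sketch of the dot product. The function name dot and the example vectors are illustrative choices for this page, not anything from a standard library.

    def dot(u, v):
        """Dot product of two same-length vectors given as lists of numbers."""
        if len(u) != len(v):
            raise ValueError("vectors must have the same length")
        return sum(u_j * v_j for u_j, v_j in zip(u, v))

    # Example: (1)(4) + (2)(-5) + (3)(6) = 4 - 10 + 18 = 12
    print(dot([1, 2, 3], [4, -5, 6]))  # 12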

If we have the special case of \textbf{u} \cdot \textbf{u} then we obtain the square of the norm as follows:

    \[\textbf{u} \cdot \textbf{u} = \sum_{j=1}^{n} u_j^{2} = u_1^{2} + u_2^{2} + \dots + u_n^{2} = ||\textbf{u}||^{2}\]

From the above result, we see that the norm, or length, of a vector \textbf{u} can be expressed as the square root of the dot product of \textbf{u} with itself:

    \[||\textbf{u}||= \sqrt{\textbf{u} \cdot \textbf{u}}\]
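Continuing the sketch above, and reusing the hypothetical dot helper, the norm can be computed exactly as this formula suggests:

    import math

    def norm(u):
        """Euclidean norm (length) of a vector: the square root of u . u."""
        return math.sqrt(dot(u, u))

    # Example: u = (3, 4) has norm sqrt(3^2 + 4^2) = sqrt(25) = 5
    print(norm([3, 4]))  # 5.0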

The dot product has an alternate, geometric formula for vectors \textbf{u} and \textbf{v} in \mathbb{R}^{n}, where \theta is the angle between the two vectors. We have:

    \[\textbf{u} \cdot \textbf{v} = ||\textbf{u}|| ||\textbf{v}|| \cos(\theta)\]


Arriving At The Cauchy-Schwarz Inequality

The dot product, along with norms, can help us find the cosine of the angle \theta. Dividing both sides of the geometric formula above by ||\textbf{u}|| ||\textbf{v}|| gives:

    \[\cos(\theta) = \dfrac{\textbf{u} \cdot \textbf{v}}{||\textbf{u}|| ||\textbf{v}||}\]

where \textbf{u}, \textbf{v} are vectors in \mathbb{R}^{n} and \theta is the angle between \textbf{u} and \textbf{v}. The angle \theta is defined to be between 0 and \pi (0 \leq \theta \leq \pi).

Recall from pre-calculus that the cosine of an angle has a minimum of -1 and a maximum of +1. We can find the angle \theta by taking the inverse cosine of \dfrac{\textbf{u} \cdot \textbf{v}}{||\textbf{u}|| ||\textbf{v}||}.

    \[\theta = \cos^{-1}\Bigg(\dfrac{\textbf{u} \cdot \textbf{v}}{||\textbf{u}|| ||\textbf{v}||}\Bigg)\]
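As a sketch, again reusing the hypothetical dot and norm helpers from above, the angle can be computed with Python's math.acos, which returns a value in radians between 0 and \pi:

    def angle_between(u, v):
        """Angle (in radians) between two nonzero vectors in R^n."""
        ratio = dot(u, v) / (norm(u) * norm(v))
        # Guard against floating-point round-off pushing the ratio
        # slightly outside [-1, 1], which would make acos fail.
        ratio = max(-1.0, min(1.0, ratio))
        return math.acos(ratio)

    # Example: (1, 0) and (0, 1) are perpendicular, so theta = pi/2
    print(angle_between([1, 0], [0, 1]))  # 1.5707963267948966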

From the above, we obtain the inequality:

    \[-1 \leq \dfrac{\textbf{u} \cdot \textbf{v}}{||\textbf{u}|| ||\textbf{v}||} \leq 1\]

Multiplying all the terms in the above inequality by ||\textbf{u}|| ||\textbf{v}|| (which is positive when \textbf{u} and \textbf{v} are nonzero; if either vector is the zero vector, the final inequality holds trivially since both sides are 0) gives:

    \[-||\textbf{u}|| ||\textbf{v}|| \leq \textbf{u} \cdot \textbf{v} \leq ||\textbf{u}|| ||\textbf{v}||\]
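As a quick numeric illustration (the vectors here are chosen purely for the example), take \textbf{u} = (1, 2) and \textbf{v} = (3, 4). Then \textbf{u} \cdot \textbf{v} = 3 + 8 = 11, while ||\textbf{u}|| ||\textbf{v}|| = \sqrt{5} \cdot 5 \approx 11.18, and indeed:

    \[-11.18 \leq 11 \leq 11.18\]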

Since -a \leq x \leq a is equivalent to |x| \leq a, we can combine the two bounds above into a single absolute-value statement. This is the Cauchy-Schwarz inequality.


The Cauchy-Schwarz Inequality

For any two vectors \textbf{u} = (u_{1}, u_{2}, \dots, u_{n}) and \textbf{v} = (v_{1}, v_{2}, \dots, v_{n}) in \mathbb{R}^{n}, we have:

    \[| \textbf{u} \cdot \textbf{v}| \leq ||\textbf{u}|| ||\textbf{v}||\]

In terms of components, the above can be expressed as:

    \[| u_{1}v_{1} + u_{2}v_{2} + \dots + u_{n}v_{n}| \leq (u_{1}^{2} + u_{2}^{2} + \dots + u_{n}^{2})^{1/2}(v_{1}^{2} + v_{2}^{2} + \dots + v_{n}^{2})^{1/2}\]

Recall that \sqrt{x} = (x)^{1/2}.
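To build some confidence in the inequality numerically, here is a minimal Python spot check that reuses the hypothetical dot and norm helpers from earlier. It is a sanity check on random vectors, not a proof.

    import random

    # Check |u . v| <= ||u|| ||v|| on one concrete pair and many random ones.
    u, v = [1, 2, 3], [4, -5, 6]
    print(abs(dot(u, v)), "<=", norm(u) * norm(v))  # 12 <= 32.83...

    for _ in range(1000):
        u = [random.uniform(-10, 10) for _ in range(5)]
        v = [random.uniform(-10, 10) for _ in range(5)]
        assert abs(dot(u, v)) <= norm(u) * norm(v) + 1e-9  # tolerance for round-off
    print("Cauchy-Schwarz held for all sampled vectors")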


Applications

I do not know the applications of this inequality off the top of my head. However, I would suspect that the Cauchy-Schwarz inequality has useful mathematical applications in (higher-level) linear algebra, probability theory, real analysis, topology, and vector algebra. One standard example is its use in proving the triangle inequality ||\textbf{u} + \textbf{v}|| \leq ||\textbf{u}|| + ||\textbf{v}||.