Arithmetic Mean – Geometric Mean

The AM-GM inequality can be proved by several methods; some of them are presented here.

The first proof on the list uses a special kind of induction. Here we go:

First, let the inequality for \(n\) variables be asserted by \(P(n)\). A traditional induction checks a base case and then shows that \(P(n)\implies P(n+1)\). Here, instead, we will show the following:

  • \(P(2)\) holds.
  • \(P(n)\implies P(2n).\)
  • \(P(n)\implies P(n-1).\)

Why should this work? Note that we jump from \(n\) to \(2n\) by showing \(P(n)\implies P(2n)\), so the assertion holds for arbitrarily large powers of \(2\). Then, using \(P(n)\implies P(n-1)\), we can induct backwards from \(2n\) down to \(n+1,\) verifying that every integer between \(n\) and \(2n\) (inclusive) satisfies the assertion. This technique is known as forward-backward induction.
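For instance, to establish \(P(5)\), the forward step gives \(P(2)\implies P(4)\implies P(8)\), and the backward step then gives \(P(8)\implies P(7)\implies P(6)\implies P(5)\).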

Now we move on to prove those points. It’s already shown above that \(P(2)\) holds, and now we prove \(P(n)\implies P(2n)\). Consider positive reals \(a_1,a_2,\ldots,a_{2n}\). Since we assume that \(P(n)\) is true, for any \(n\) positive reals \(\frac{ \sum _{ i=1 }^{ n }{ { a }_{ i } }}{n} \ge \sqrt [ n ]{ \prod _{ i=1 }^{ n }{ { a }_{ i } } } \) holds. Then

\[\begin{array}{rcl}
\dfrac{1}{2n}\displaystyle\sum_{i=1}^{2n} a_i &=& \dfrac{1}{2n}\left(\displaystyle\sum_{i=1}^n a_i +\displaystyle \sum_{i=n+1}^{2n} a_i\right) \\ &\stackrel{n\text{ AM-GM}}{\ge} &\dfrac{1}{2}\left(\sqrt[n]{\prod_{i=1}^n a_i} + \sqrt[n]{\prod_{i=n+1}^{2n} a_i}\right) \\ &\stackrel{2\text{ AM-GM}}{\ge} &\sqrt{\sqrt[n]{\prod_{i=1}^n a_i} \times \sqrt[n]{\prod_{i=n+1}^{2n} a_i}} \\ &=& \sqrt[2n]{\prod_{i=1}^{2n} a_i},
\end{array}\]

where \(n\text{ AM-GM}\) means the AM-GM inequality applied to \(n\) variables. We’ve also used the base case, with \(2\) variables. So the second part of our proof is also complete. It remains to show that \(P(n)\implies P(n-1)\). To show this, we take \(n-1\) positive reals \(a_1,a_2,\ldots,a_{n-1}\) and set \( a_n=\frac{1}{n-1} \sum_{i=1}^{n-1} a_i\). Then notice that
\[\dfrac{1}{n}\sum_{i=1}^n a_i =\dfrac{1}{n}\left(\sum_{i=1}^{n-1} a_i + \dfrac{1}{n-1}\sum_{i=1}^{n-1} a_i\right)=\dfrac{1}{n-1}\sum_{i=1}^{n-1} a_i.\]

So we have

\[\begin{array}{rcl}
\dfrac{1}{n-1}\displaystyle\sum_{i=1}^{n-1} a_i &=& \dfrac{1}{n}\sum_{i=1}^n a_i \\
&\ge &\sqrt[n]{\prod_{i=1}^n a_i} \\
&=& \sqrt[n]{\dfrac{1}{n-1}\left(\sum_{i=1}^{n-1} a_i\right) \left(\prod_{i=1}^{n-1} a_i\right)} \\
\Rightarrow \left(\dfrac{1}{n-1}\sum_{i=1}^{n-1}a_i \right)^n &\ge & \dfrac{1}{n-1}\left(\sum_{i=1}^{n-1} a_i\right) \left(\prod_{i=1}^{n-1} a_i\right) \\
\Rightarrow \left(\dfrac{1}{n-1}\sum_{i=1}^{n-1} a_i \right)^{n-1} &\ge & \prod_{i=1}^{n-1} a_i \\
\Rightarrow \dfrac{1}{n-1}\sum_{i=1}^{n-1} a_i &\ge & \sqrt[n-1]{\prod_{i=1}^{n-1} a_i}.
\end{array}\]

As you may have noticed, the first inequality here is the \(n\)-variable AM-GM. So the third part of the proof is complete, and so is the induction. Tracking the equality cases through the induction shows that equality holds if and only if \(a_1=a_2=\cdots =a_n\). \(_\square\)
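As a quick sanity check of the statement (an illustration, not part of the proof), take \(a_1=1,\ a_2=2,\ a_3=4\):
\[\frac{1+2+4}{3}=\frac{7}{3}\approx 2.33 \ \ge\ \sqrt[3]{1\cdot 2\cdot 4}=2.\]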

The AM-GM inequality is also a special case of Jensen’s inequality, so let’s approach it that way.

Consider the function \(f(x) = \log x \ \forall x > 0.\)

We observe that \(f^{\prime\prime}(x) = \frac{-1}{x^2} < 0\), so \(f(x)\) is a concave function. \(\big(\)We could also conclude that \(f(x)\) is concave from the graph of \(\log x.\big)\) Thus, by Jensen’s inequality, we have

\[\begin{align}
f\left(\dfrac{\sum_{i=1}^n a_i}{n}\right) & \geq \dfrac{\sum_{i=1}^n f\left(a_i\right)}{n} \\
\Rightarrow \log \left(\dfrac{\sum_{i=1}^n a_i}{n}\right) & \geq \dfrac{\sum_{i=1}^n \log \left(a_i\right)}{n}.
\end{align}\]

By the properties of logarithms, we have \(\log x + \log y = \log (xy)\) and \(b\log a = \log a^b\). Therefore, we can simplify the RHS as

\[\begin{align}
\log \left(\dfrac{\sum_{i=1}^n a_i}{n}\right) & \geq \dfrac{\sum_{i=1}^n \log a_i}{n} \\
& = \dfrac{\log a_1+\log a_2+ \cdots + \log a_n}{n} \\
& = \dfrac{\log \left(a_1 a_2 a_3 \ldots a_n\right)}{n} \\
& = \log \left(\prod_{i=1}^n a_i\right)^{1/n}.
\end{align}\]

Thus,

\[\dfrac{ \sum_{i=1}^n a_i}{n} \geq \sqrt[n]{\prod_{i=1}^n a_i}.\ _\square\]
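As an illustration of the key step (not part of the proof), take \(a_1=1\) and \(a_2=4\):
\[\log\frac{1+4}{2}=\log 2.5 \ \ge\ \frac{\log 1+\log 4}{2}=\log\sqrt{1\cdot 4}=\log 2,\]
which holds because \(2.5\ge 2\) and \(\log\) is increasing.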

Another proof, made famous by the mathematician George Pólya, does not rely on induction. It establishes a more general, weighted version of the AM-GM inequality.

We claim that for a sequence of positive reals \(a_k\) with \(1\leq k\leq n\) and a sequence of positive weights \(p_k\) with \(1\leq k \leq n\) such that \( \sum_{k=1}^{n} p_k = 1\),
\[a_1^{p_1}a_2^{p_2}a_3^{p_3}\cdots a_n^{p_n}\leq a_1p_1+a_2p_2+a_3p_3+\cdots+a_np_n.\]
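For instance, taking \(n=2\) and \(p_1=p_2=\frac12\) recovers the usual two-variable AM-GM:
\[a_1^{1/2}a_2^{1/2}=\sqrt{a_1a_2}\leq \frac{a_1+a_2}{2}.\]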
Pólya’s proof begins with the observation that \(1+x\leq e^x,\) which can easily be verified graphically, with equality occurring only at \(x=0\). This intuition allegedly came to Pólya in a dream, in which he claimed it was “the best mathematics he had ever dreamt.” If we make the change of variables \(x\mapsto x-1\), our initial observation becomes \(x\leq e^{x-1}\). Applying this to our sequence \(a_k,\) we get
\[\begin{align}
a_k &\leq e^{a_k -1}\\
a_k^{p_k} &\leq e^{a_kp_k - p_k}.
\end{align}\]
Now, we can see that
\[\begin{align}
G=\prod_{k=1}^{n} a_k^{p_k}
&\leq \exp \left(\sum_{k=1}^{n} a_kp_k - \sum_{k=1}^{n} p_k\right)\\
&= \exp \left(\sum_{k=1}^{n} a_kp_k - 1\right).
\end{align}\]
However, applying the same observation \(x\leq e^{x-1}\) with \(x=\sum_{k=1}^{n} a_kp_k\), we can also see that
\[A=\sum_{k=1}^{n} a_kp_k \leq \exp \left(\sum_{k=1}^{n} a_kp_k -1\right),\]
which is the same bound we just found for \(G\). So we could say
\[A, G \leq \exp \left (\sum_{k=1}^{n} a_kp_k -1 \right ).\]
So we have related \(A\) and \(G\) by bounding both with the same quantity, but we have not yet compared them with each other. Notice that this bound is most useful when the exponent \(\sum_{k=1}^{n} a_kp_k-1\) vanishes, i.e. when \(A=1\). The idea of “normalization” then comes to mind: we can rescale the sequence \(a_k\) to our advantage.
Define a new sequence \(\alpha_k\) with \(1\leq k \leq n,\) where we have
\[\begin{align}
\alpha_k &= \frac{a_k}{A}\\
A&=a_1p_1+a_2p_2+a_3p_3+\cdots+a_np_n.
\end{align}\]
Applying our earlier bound for \(G\) to the new variables \(\alpha_k\), we get
\[\begin{align}
\prod_{k=1}^{n} \alpha_k^{p_k} &\leq \exp \left(\sum_{k=1}^{n} \alpha_kp_k-1\right)\\
\prod_{k=1}^{n} \left(\frac{a_k}{A}\right)^{p_k} &\leq \exp \left(\sum_{k=1}^{n} \frac{a_k}{A}p_k-1\right).
\end{align}\]
Then we have
\[\sum_{k=1}^{n} \frac{a_k}{A}p_k = \frac{a_1p_1+a_2p_2+a_3p_3+\cdots+a_np_n}{A} = \frac{A}{A} = 1,\]
and it follows that
\[\exp \left(\sum_{k=1}^{n} \frac{a_k}{A}p_k-1\right) = 1.\]
So now, we are led to the fact that
\[\begin{align}
\prod_{k=1}^{n} \left(\frac{a_k}{A}\right)^{p_k} &\leq 1\\
\frac{\prod_{k=1}^{n} a_k^{p_k}}{\prod_{k=1}^{n} A^{p_k}} &\leq 1.
\end{align}\]
Since \(\displaystyle \sum_{k=1}^{n} p_k = 1,\) the denominator is \(\prod_{k=1}^{n} A^{p_k}=A^{\sum_k p_k}=A,\) so we can finally say
\[\begin{align}
\frac{G}{A} &\leq 1\\
G &\leq A\\
a_1^{p_1}a_2^{p_2}a_3^{p_3}\cdots a_n^{p_n} &\leq a_1p_1+a_2p_2+a_3p_3+\cdots+a_np_n.\ _\square
\end{align}\]
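As a numerical illustration of the weighted inequality (not part of the proof), take \(a=(1,8)\) with weights \(p=\left(\frac23,\frac13\right)\):
\[1^{2/3}\cdot 8^{1/3}=2 \ \le\ \frac23\cdot 1+\frac13\cdot 8=\frac{10}{3}\approx 3.33.\]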

The following proof is more intuitive and requires a bit of combinatorics.

Let \(S=\{a_1,a_2,\ldots,a_n\}\) be a set of \(n\) positive real numbers, where \(n\) is a positive integer since it counts the elements of the set. The \(n^\text{th}\) power of the sum of its elements can be expanded as follows:
\[(a_1+a_2+\cdots+a_n)^{n}=\underbrace{(a_1+a_2+\cdots+a_n) (a_1+a_2+\cdots+a_n) \cdots (a_1+a_2+\cdots+a_n)}_{n \text{ times}}.\]
Among the terms of this expansion is the product of all the elements of \(S\). Since each term of the expansion is formed by taking one factor from each bracket, the product \(a_1 a_2 a_3 \cdots a_n\) appears exactly \(n!\) times: there are \(n\) ways to choose a factor from the first bracket, \(n-1\) from the second one, and so on.
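For instance, with \(n=3\) the term \(a_1a_2a_3\) in \((a_1+a_2+a_3)^3\) arises from the \(3!=6\) ordered ways of picking one factor from each bracket, matching the multinomial coefficient
\[\binom{3}{1,1,1}=\frac{3!}{1!\,1!\,1!}=6.\]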

Since every term in the expansion is positive, the full expansion is greater than the sum of just these \(n!\) identical terms \((\)for \(n\ge 2),\) and thus we get
\[\left(\sum_{i=1}^n a_i \right)^{n} >n! \left(\prod_{i=1}^n a_i\right).\]
Now, we only need to recall the definition of the geometric mean, which is the \(n^\text{th}\) root of the product of all the terms of the set. Taking \(n^\text{th}\) roots and rearranging, we obtain
\[\frac{ \sum_{i=1}^n a_i}{(n!)^{\frac{1}{n}}}=k> m_g,\]
where \(m_g\) is the geometric mean. Next, we prove by induction that \(n^{n} \geq n!\) for every positive integer \(n\); this will help us eliminate the factorial in the denominator of the last inequality. For \(n=1\) it reads \(1^{1} \geq 1!\), which is true. Now assume the inequality holds for \(n=k\) and consider \(n=k+1\). Expanding by the binomial theorem, we get
\[\begin{align}
(k+1)^{k+1}
&=k^{k+1}+(k+1)\cdot k^{k}\cdot 1^{1}+\frac{k\cdot (k+1)}{2}\cdot k^{k-1}\cdot 1^{2}+\cdots +1^{k+1}\\
&>k^{k+1}+(k+1)\cdot k^{k}\\
&=k^{k}\cdot (2k+1).
\end{align}\]
But from the induction hypothesis, we have
\[\begin{align}
k^{k} &\geq k! \\
k^{k}\cdot(2k+1) &\geq k!\cdot(2k+1) \\
k^{k}\cdot(2k+1) &\geq k!\cdot[k+(k+1)]\\
k^{k}\cdot (2k+1) &\geq k\cdot k!+(k+1)! \\
&>(k+1)! \\
\Rightarrow k^{k}\cdot(2k+1)&>(k+1)!.
\end{align}\]

With \((k+1)^{k+1}>k^{k}\cdot (2k+1)\) and \(k^{k}\cdot (2k+1)>(k+1)!\), we get \((k+1)^{k+1} >(k+1)!\) for \(k\geq 1\). For \(k=0\) we already saw that equality holds, so \(n^{n} \geq n!\) for every positive integer \(n\), as we wished to prove. Now we can use this inequality, together with some algebraic manipulations, as follows:
\[\begin{align}
n^{n} &\geq n! \\
\frac{1}{n^{n}} &\leq \frac{1}{n!} \\
\frac{1}{n} &\leq \frac{1}{(n!)^{\frac{1}{n}}} \\
\frac{ \sum_{i=1}^n a_i}{n} &\leq \frac{ \sum_{i=1}^n a_i}{(n!)^{\frac{1}{n}}} \\
m_a &\leq \frac{ \sum_{i=1}^n a_i}{(n!)^{\frac{1}{n}}}=k,
\end{align}\]
where \(m_a\) is the arithmetic mean. Now, a geometric interpretation on the real line makes the results easy to understand:
\[\begin{align}
k>m_g>0 &\implies m_g \in (0, k) \\
k\geq m_a>0 &\implies m_a \in (0, k].
\end{align}\]
When \(m_a\) equals its upper bound \(k\), \(m_g\) can only be smaller than \(m_a\); otherwise, the two may at most be equal. Hence, we obtain the inequality we were searching for:
\[m_g \leq m_a.\ _\square\]
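The auxiliary inequality and the final comparison can be spot-checked numerically (an illustration only). For \(n=3\) we have \(3^3=27\ge 6=3!\), and for \(a=(1,2,4)\),
\[k=\frac{1+2+4}{\sqrt[3]{3!}}=\frac{7}{\sqrt[3]{6}}\approx 3.85,\qquad m_a=\frac{7}{3}\approx 2.33\le k,\qquad m_g=\sqrt[3]{8}=2<k,\]
and indeed \(m_g\le m_a\).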
