My renewed interest in proofs of the exponent laws came from reading the
Harvard Entrance Exam, 1869, which David Robert Palmer painstakingly typed up. The exam has sections on Latin and Greek, History and Geography, and various mathematics. It's fascinating that the words in Latin and Greek are already provided, so knowing how to conjugate is all that is required. The maths sections are heavily calculation-based. It seems that these sections serve to test only how nimble the minds are in solving known problems, not so much whether the students exhibit the thought process for solving unknown problems. But one question intrigued me.
Question (3) in the algebra section reads, "What is the reason that when different powers of the same quantity are multiplied together their exponents are added?" It is asking about the exponent law for the product of powers, \( a^x a^y = a^{x+y} \). Given the prestige of Harvard, an informal answer is likely not satisfactory. But most of the so-called "proofs" prevalent in textbooks essentially boil down to the following reasoning.
\[ \underbrace{a a \dots a }_{x \text{ times}} \cdot \underbrace{a a \dots a}_{y \text{ times}} = \underbrace{a a a a \dots a a }_{x + y \text{ times}} \]
This appeals to the intuitive understanding that \( a^x \) is \(a\) multiplied \(x\) times, so when you multiply the result by \(a\) another \(y\) times, you get \( a^{x+y} \). This is hardly satisfying as a proof.
Another "proof" is to reduce this to the additive law of logarithms \( \log{xy} = \log{x} + \log{y} \), but this is like throwing the hot potato to someone else.
\[ \log{a^x a^y} = \log{a^x} + \log{a^y} = x + y \\
a^{\log{a^x a^y}} = a^{x+y} \]
And since \( a^{\log{c}} = c \) i.e. log is the inverse of exponent, so \( a^{\log{a^x a^y}} = a^x a^y = a^{x+y} \).
There are other
exponent laws without formal proofs, but for this article I'm going to focus on the product of powers.
Formal proof
We appeal to proof by induction for the formal proof. First we establish the base case for exponentiation:
\[ a^0 = 1 \]
Also recall that \(1\) is the multiplicative identity, so \( x\cdot 1 = 1\cdot x = x \), and \(0\) is the additive identity, so \( x+0 = 0+x = x \). From this, we can prove the base case where \( y = 0 \), i.e. \( a^x a^0 = a^x \cdot 1 = a^x = a^{x+0} \).
Now for the inductive case, we need both the axiom of exponentiation \( a^x \cdot a = a^{x+1} \) and the induction hypothesis \( a^x a^y = a^{x+y} \). We proceed to show that \( a^x a^{y+1} = a^{x+y+1} \).
\[ a^x (a^{y+1}) \underbrace{=}_{\text{axi.}} a^x (a^y a) \underbrace{=}_{\text{assoc.}} (a^x a^y) a \underbrace{=}_{\text{I.H.}} a^{x+y} a \underbrace{=}_{\text{axi.}} a^{(x+y)+1} \]
In the equations above, "axi." marks an application of the axiom of exponentiation, "I.H." an application of the induction hypothesis, and "assoc." the
associativity of multiplication (which holds in any
monoid, e.g. matrix multiplication).
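For readers who like to see the induction as computation, here is a minimal sketch in Python (my own illustration, not part of the exam or of the proof above): the recursive definition of exponentiation encodes exactly the base case and the axiom, and the product law can then be checked exhaustively over a small range of natural numbers.

```python
# A minimal sketch: natural-number exponentiation defined by the base case
# a^0 = 1 and the axiom a^(n+1) = a^n * a, then the product law checked
# exhaustively on a small range. (Illustration only, not a proof.)
def power(a, n):
    return 1 if n == 0 else power(a, n - 1) * a

for a in range(1, 6):
    for x in range(8):
        for y in range(8):
            assert power(a, x) * power(a, y) == power(a, x + y)
print("a^x * a^y == a^(x+y) holds on the tested range")
```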
Having painstakingly established a formal proof of the product of powers, we run into a notable limitation, namely that it only allows natural number exponents, \( \{x, y\} \subseteq \mathbb{N} \). In particular, it does not cover negative exponents (e.g. \(-3\)) or non-integer exponents (e.g. \(3.14159\dots\)). Indeed, the real number domain has properties that evade the mere application of modus ponens in discrete logic, which gave rise to the famous satire
Alice in Wonderland.
Proof sketch in real number domain
Fortunately, all is not lost in the real number domain. We can invoke
Taylor's theorem to expand \( e^x \) and \( e^y \) into infinite series, multiply the two series, and check whether \( e^{x+y} \) expands to the same result.
The Taylor series for \( e^x \) is:
\begin{array}{rcl}
e^x & = & \sum^{\infty}_{n=0} {x^n \over n!} \\
& = & 1 + {x^1 \over 1!} + {x^2 \over 2!} + {x^3 \over 3!} + {x^4 \over 4!} + \dots \\
& = & 1 + x + {x^2 \over 2} + {x^3 \over 6} + {x^4 \over 24} + \dots
\end{array}
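As a quick numerical illustration (my addition, not a proof), the partial sums of this series do converge to the exponential function; the sketch below, assuming only the Python standard library, compares them against math.exp.

```python
import math

def exp_series(x, terms=30):
    """Sum the first `terms` terms of  sum_{n>=0} x^n / n!."""
    return sum(x**n / math.factorial(n) for n in range(terms))

print(exp_series(1.0), math.exp(1.0))  # both are approximately 2.718281828...
print(exp_series(2.5), math.exp(2.5))  # both are approximately 12.182493...
```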
The following table multiplies the Taylor series for \( e^x \) and \( e^y \).
\begin{array}{c|cccccc}
\times & 1 & x & x^2 \over 2 & x^3 \over 6 & x^4 \over 24 & \dots \\ \hline
1 & 1 & x & x^2 \over 2 & x^3 \over 6 & x^4 \over 24 & \dots \\
y & y & xy & x^2 y \over 2 & x^3 y \over 6 & x^4 y \over 24 & \dots \\
y^2 \over 2 & y^2 \over 2 & xy^2 \over 2 & x^2 y^2 \over 4 & x^3 y^2 \over 12 & x^4 y^2 \over 48 & \dots \\
y^3 \over 6 & y^3 \over 6 & xy^3 \over 6 & x^2 y^3 \over 12 & x^3 y^3 \over 36 & x^4 y^3 \over 144 & \dots \\
y^4 \over 24 & y^4 \over 24 & xy^4 \over 24 & x^2 y^4 \over 48 & x^3 y^4 \over 144 & x^4 y^4 \over 576 & \dots \\
\vdots & \vdots & \vdots & \vdots & \vdots & \vdots & \ddots
\end{array}
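The same bookkeeping can be mechanized. Below is a small sketch using sympy (it assumes the library is available, and the truncation order N is an arbitrary choice of mine): it multiplies the truncated series, subtracts the truncated series for \( e^{x+y} \), and checks that only terms beyond the truncation order remain.

```python
from math import factorial
from sympy import symbols, expand, Poly

x, y = symbols('x y')
N = 5  # truncation order: keep terms up to degree 4, matching the table above

# Truncated Taylor polynomials for e^x and e^y (the rows and columns of the table).
ex = sum(x**n / factorial(n) for n in range(N))
ey = sum(y**n / factorial(n) for n in range(N))
# Truncated Taylor polynomial for e^(x+y).
exy = sum((x + y)**n / factorial(n) for n in range(N))

# Their difference should contain only terms of total degree >= N, i.e. terms
# the truncated factors could not be expected to reproduce.
diff = Poly(expand(ex * ey - exy), x, y)
assert all(sum(monom) >= N for monom in diff.monoms())
print("product of the series matches e^(x+y) up to degree", N - 1)
```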
The Taylor series for \( e^{x + y} \), with each term expanded individually, is:
\begin{array}{ccccccccccc}
1 &+& (x+y) &+& {(x+y)^2 \over 2} &+& {(x+y)^3 \over 6} &+& {(x+y)^4 \over 24} &+& \dots \\
& & || & & || & & || & & || \\
& & x & & x^2 \over 2 & & x^3 \over 6 & & x^4 \over 24 \\
& & y & & xy & & x^2 y \over 2 & & x^3 y \over 6 \\
& & & & y^2 \over 2 & & xy^2 \over 2 & & x^2 y^2 \over 4 \\
& & & & & & y^3 \over 6 & & xy^3 \over 6 \\
& & & & & & & & y^4 \over 24
\end{array}
As you can see, the expansion of each term corresponds diagonally to the multiplication table above, so we can indeed show by Taylor expansion that \( e^x e^y = e^{x+y} \). This is only a proof sketch, but the enterprising mind could organize the table into a
formal proof, using the
Binomial Theorem and the
Cauchy Product.
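The key step such a formal proof would rest on is that the \(n\)-th anti-diagonal of the multiplication table collapses, via the Binomial Theorem, into the \(n\)-th term of the series for \( e^{x+y} \); this is precisely the Cauchy product of the two series:
\[ \sum_{k=0}^{n} {x^k \over k!} \cdot {y^{n-k} \over (n-k)!} = {1 \over n!} \sum_{k=0}^{n} {n \choose k} x^k y^{n-k} = {(x+y)^n \over n!} \]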
You might have noticed that we have only established the product law at base \(e\) but not other bases. For the other bases, recall that \( a^x = e^{x \ln a} \), so we have:
\[ a^x a^y = e^{x \ln a} e^{y \ln a} = e^{(x \ln a) + (y \ln a)} = e^{(x+y)\ln a} = a^{x+y} \]
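A quick numerical sanity check of this change-of-base step (an illustration I'm adding, with arbitrarily chosen values, not a proof):

```python
import math

# a^x = e^(x ln a), so the product law for base e carries over to any base a > 0.
a, x, y = 2.5, 1.2, -0.7  # arbitrary illustrative values
lhs = math.exp(x * math.log(a)) * math.exp(y * math.log(a))  # a^x * a^y via base e
rhs = a ** (x + y)
print(lhs, rhs, math.isclose(lhs, rhs))  # expect True (up to floating-point error)
```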
Conclusion
There are two reasons why I suspect the Harvard entrance exam only expected the informal reasoning of counting the number of multiplicands.
The Cauchy (1789-1857) product was a fairly recent development relative to the exam in 1869, which makes it unlikely to have been part of the curriculum. And there were no other questions requiring proof by induction, so induction is unlikely to have been part of the curriculum either.
On the other hand, it is only after I became older (and overeducated) that I realized a lot of the maths I was taught in grade school was really just informal proof sketches, and even so I didn't learn to question the profoundness of their limitations. The informal proof sketch of counting the multiplicands is limited to exponents in the natural number domain and would not work for real number exponents. I spent the better part of my youth learning faux math!
Laws in the real number domain cannot be proven by counting alone because
real numbers are uncountable. For this case I appealed to Taylor's theorem, expanding \( e^x \) and \( e^y \) as power series and showing that their product matches the series for \( e^{x+y} \). I suspect many of the real number laws could be given the same treatment.