## The Shortest Path To Trigonometric Identities

```
How can we motivate the derivations and proofs of trigonometric
identities?  Jacques Hadamard said "The shortest path between two
truths in the real domain sometimes passes through the complex
domain", and this is certainly true when it comes to verifying
trigonometric identities such as

    2 sin((A+B)/2) cos((A-B)/2)  =  sin A + sin B

Going from the left side of the equation to the right is just a matter
of multiplying through using the exponential forms of sin() and cos()
as follows
    2 [ (e^(i(A+B)/2) - e^(-i(A+B)/2)) / (2i) ] [ (e^(i(A-B)/2) + e^(-i(A-B)/2)) / 2 ]

         =  (e^(iA) - e^(-iA)) / (2i)  +  (e^(iB) - e^(-iB)) / (2i)

This is perfectly straightforward.  However, going from the right
side to the left side is essentially a task of factorization.  Asking
for a factored version of sin(x)+sin(y) is analogous to asking
for a factored version of, say, 2258745004684033.  In contrast,
asking for a proof of the identity

sin(x)+sin(y) = 2sin((x+y)/2)cos((x-y)/2)

is analogous to asking for a proof of the identity

2258745004684033 = (27439297)(82317889)

A proof (or disproof) of a given proposition is generally easier than
constructing the proposition in the first place.

This is similar to the situation with integral formulas.  Given the
integral of a function, it's usually easy to differentiate it and
recover the function, but finding the integral of a given function
is often much more difficult.  In this
sense, differentiation is to multiplication as integration is to
factoring.  Very often in such situations we take the approach
of just differentiating every function we can think of and writing
down the derivatives in a table alongside the functions.  Then to find the
integral of a function we just look for that function in the derivative
column of our table.  A similar approach was used to construct the
tables of standard trigonometric identities.

This approach has very wide applicability.  I think it was Jacobi who
said "Always invert!", and this is certainly how he made most of
Legendre's work on elliptic integrals obsolete.  (This same approach
works outside of mathematics too.)

Two very powerful rules for producing useful results quickly are:
(1) use complex numbers, and (2) always invert.  Another general and
highly efficacious approach to functional problems is to express them
in terms of power series.  The realization that a large class of
functions can be represented as power series was one of the most
significant turning points in the development of modern mathematics.

To show how this can be applied to the derivation of trigonometric
identities, suppose we want to find an expression for sin(a+b) in
terms of the sines and cosines of the individual numbers a and b.
For a general power series

f(x) = c_0 + c_1 x + c_2 x^2 + c_3 x^3 + ...

we have the derivatives

f'(x)  =  c_1 + 2c_2 x + 3c_3 x^2 + ...

f''(x)  = 2c_2 + 6c_3 x + ...

f'''(x)  = 6c_3 + ...

and so on.  It's clear that the nth derivative at x=0 is simply
(n!)c_n, and so the nth coefficient c_n equals the nth derivative
divided by n!.  Now recall that the derivative of the sine is the
cosine, and the derivative of the cosine is the negative sine, so
we can expand the function f(x) = sin(a+x) around the point x=0
as follows
    sin(a+x)  =  sin(a) + (cos(a)/1!) x - (sin(a)/2!) x^2 - (cos(a)/3!) x^3 + ...

By the same reasoning we can write down the power series for cos(a+x)

    cos(a+x)  =  cos(a) - (sin(a)/1!) x - (cos(a)/2!) x^2 + (sin(a)/3!) x^3 + ...

Not surprisingly, these expressions reduce to the familiar power series
for sin(x) and cos(x) if we set a=0.  Now, suppose we collect terms
in the series for sin(a+x) according to whether they contain a sine
or a cosine.  This gives
    sin(a+x)  =  sin(a) [ 1 - x^2/2! + x^4/4! - x^6/6! + ... ]

                  + cos(a) [ x - x^3/3! + x^5/5! - x^7/7! + ... ]

Of course, we immediately recognize the quantities inside the brackets
as the cosine and sine of x, so we have the trigonometric identity

sin(a+b) = sin(a)cos(b) + cos(a)sin(b)

Similarly if we separate the terms of the series for cos(a+x) we
arrive at
    cos(a+x)  =  cos(a) [ 1 - x^2/2! + x^4/4! - x^6/6! + ... ]

                  - sin(a) [ x - x^3/3! + x^5/5! - x^7/7! + ... ]

which gives the identity

cos(a+b) = cos(a)cos(b) - sin(a)sin(b)

This nicely illustrates how it is often possible to deduce
closed-form identities from consideration of the infinite series
expansions of the functions involved.
```
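The exponential-form verification at the start of the article can be checked numerically with complex arithmetic. This is a minimal sketch (the helper names `sin_exp` and `cos_exp` are my own):

```python
import cmath

def sin_exp(z):
    # sin(z) expressed via complex exponentials: (e^(iz) - e^(-iz)) / (2i)
    return (cmath.exp(1j * z) - cmath.exp(-1j * z)) / 2j

def cos_exp(z):
    # cos(z) expressed via complex exponentials: (e^(iz) + e^(-iz)) / 2
    return (cmath.exp(1j * z) + cmath.exp(-1j * z)) / 2

A, B = 0.7, 2.3
lhs = 2 * sin_exp((A + B) / 2) * cos_exp((A - B) / 2)
rhs = sin_exp(A) + sin_exp(B)
print(abs(lhs - rhs))  # agrees to floating-point precision
```

Checking the identity this way at particular values is, of course, exactly the "easy direction": verifying a proposed truth rather than constructing it.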
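The factoring analogy can be made concrete: checking the proposed factorization of 2258745004684033 is a single multiplication, while constructing it requires a search. A naive trial-division sketch (illustrative only, not a serious factoring method):

```python
def smallest_factor(n):
    """Naive trial division: return the smallest nontrivial factor of n (or n if prime)."""
    if n % 2 == 0:
        return 2
    d = 3
    while d * d <= n:
        if n % d == 0:
            return d
        d += 2
    return n

# Verifying the proposed factorization is one multiplication:
assert 27439297 * 82317889 == 2258745004684033

# Constructing it means searching: even for this modest composite,
# trial division would need millions of divisions to reach 27439297.
print(smallest_factor(8051))  # -> 83, since 8051 = 83 * 97
```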
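The regrouping of the sin(a+x) and cos(a+x) series into sine and cosine brackets can be checked by summing truncated series numerically; a sketch (truncating at 20 terms is my choice, ample for double precision at small x):

```python
import math

def sin_series(x, terms=20):
    # x - x^3/3! + x^5/5! - ...
    return sum((-1)**k * x**(2*k + 1) / math.factorial(2*k + 1) for k in range(terms))

def cos_series(x, terms=20):
    # 1 - x^2/2! + x^4/4! - ...
    return sum((-1)**k * x**(2*k) / math.factorial(2*k) for k in range(terms))

a, x = 0.9, 1.4
# sin(a+x) = sin(a)*[bracketed cosine series] + cos(a)*[bracketed sine series]
sin_sum = math.sin(a) * cos_series(x) + math.cos(a) * sin_series(x)
# cos(a+x) = cos(a)*[bracketed cosine series] - sin(a)*[bracketed sine series]
cos_sum = math.cos(a) * cos_series(x) - math.sin(a) * sin_series(x)

print(abs(sin_sum - math.sin(a + x)))  # effectively zero
print(abs(cos_sum - math.cos(a + x)))  # effectively zero
```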