Sums of Powers

 

Let Sk(n) denote the sum of the kth powers of the first n integers. (For convenience we will define S0 = n+1.) It's fairly easy to determine the explicit formula for these sums directly from the definition. For example, by the definition of the sum of cubes function we have

$$S_3(n) \;=\; 1^3 + 2^3 + 3^3 + \cdots + n^3 \;=\; S_3(n-1) + n^3$$

so it's clear that S3 is a polynomial of finite degree in n. Also, the asymptotic growth rate obviously doesn't exceed the 4th power of n, so we expect that S3 can be expressed in the form

$$S_3(n) \;=\; A n^4 + B n^3 + C n^2 + D n + E$$

for some constants A,B,C,D,E. We immediately see that E must be zero to give S3(0) = 0. Furthermore, from the defining relation we have

$$A n^4 + B n^3 + C n^2 + D n \;-\; \Big[ A(n-1)^4 + B(n-1)^3 + C(n-1)^2 + D(n-1) \Big] \;=\; n^3$$

Expanding each of the binomials, collecting terms by powers of n, and setting the coefficient of each power to zero, we find that A = 1/4, B = 1/2, C = 1/4, and D = 0, so the formula for the sum of the first n cubes is

$$S_3(n) \;=\; \frac{n^4}{4} + \frac{n^3}{2} + \frac{n^2}{4} \;=\; \left[ \frac{n(n+1)}{2} \right]^2$$
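As a quick check of this derivation, the closed form can be compared against direct summation; the following Python sketch (the helper names are just for illustration) confirms agreement:

```python
# Check the closed form S3(n) = [n(n+1)/2]^2 against direct summation.

def s3_direct(n):
    """Sum of the first n cubes, computed term by term."""
    return sum(j**3 for j in range(1, n + 1))

def s3_formula(n):
    """The closed form derived above (exact, since n(n+1) is always even)."""
    return (n * (n + 1) // 2) ** 2

for n in range(0, 50):
    assert s3_direct(n) == s3_formula(n)
print(s3_formula(10))  # sum of the first 10 cubes -> 3025
```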

The formulas for the sums of other powers can be derived in the same purely algebraic way. (For a discussion of a generalization of this approach, see the note on Sums and Differences of Discrete Functions.) However, it's natural to wonder whether these formulas might be derivable in terms of the integral of x^k from x = 0 to n, since evaluating this integral is trivial. In fact, we can see that the first term in the formula for S3(n) is simply the integral of x^3 from x = 0 to n. (In a sense, this was the justification for our presumption that Sk(n) is a polynomial of degree k+1.) On the other hand, the summation has some additional terms that don't appear in the integral, so we can't simply equate the sum to the corresponding integral.

 

To see exactly how sums of powers are related to integrals of powers, consider the integral

$$\int_0^1 \Big[ x^k + (x+1)^k + (x+2)^k + \cdots + (x+n)^k \Big]\, dx \;=\; \int_0^{n+1} x^k\, dx$$

Letting C_{m,n} denote the binomial coefficient "m choose n", we can expand the binomials and collect terms by powers of x to give

$$\int_0^1 \left[ \sum_{m=0}^{k} C_{k,m}\, S_{k-m}\, x^m \right] dx \;=\; \int_0^{n+1} x^k\, dx$$

where we have omitted the arguments on Sk, with the understanding that they represent Sk(n). Carrying out the integrations on both sides, we get

$$\sum_{m=0}^{k} \frac{C_{k,m}}{m+1}\, S_{k-m} \;=\; \frac{(n+1)^{k+1}}{k+1}$$
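This identity is easy to verify numerically. The following Python sketch (with an assumed helper S for the power sums, using the convention S_0(n) = n+1 from above) checks it with exact rational arithmetic over a range of k and n:

```python
from fractions import Fraction
from math import comb

def S(k, n):
    """S_k(n) = sum of the k-th powers of 0..n (so S_0(n) = n+1)."""
    return sum(j**k for j in range(0, n + 1))

# Check: sum_{m=0}^{k} C(k,m)/(m+1) * S_{k-m}(n) = (n+1)^(k+1)/(k+1)
for k in range(0, 8):
    for n in range(0, 10):
        lhs = sum(Fraction(comb(k, m), m + 1) * S(k - m, n)
                  for m in range(k + 1))
        rhs = Fraction((n + 1)**(k + 1), k + 1)
        assert lhs == rhs
print("identity verified")
```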

This equation is valid for any non-negative integer k, so it gives the following infinite sequence of equations involving the sums of powers:

$$\begin{aligned}
S_0 &= (n+1) \\
\tfrac{1}{2} S_0 + S_1 &= \tfrac{1}{2}(n+1)^2 \\
\tfrac{1}{3} S_0 + S_1 + S_2 &= \tfrac{1}{3}(n+1)^3 \\
\tfrac{1}{4} S_0 + S_1 + \tfrac{3}{2} S_2 + S_3 &= \tfrac{1}{4}(n+1)^4 \\
&\;\;\vdots
\end{aligned}$$

We can solve the first N of these equations to give expressions for S0, S1, ..., S(N-1). For example, to find formulas for S0 through S6, we simply solve the system of equations

$$\begin{bmatrix}
1 & & & & & & \\
\tfrac{1}{2} & 1 & & & & & \\
\tfrac{1}{3} & 1 & 1 & & & & \\
\tfrac{1}{4} & 1 & \tfrac{3}{2} & 1 & & & \\
\tfrac{1}{5} & 1 & 2 & 2 & 1 & & \\
\tfrac{1}{6} & 1 & \tfrac{5}{2} & \tfrac{10}{3} & \tfrac{5}{2} & 1 & \\
\tfrac{1}{7} & 1 & 3 & 5 & 5 & 3 & 1
\end{bmatrix}
\begin{bmatrix} S_0 \\ S_1 \\ S_2 \\ S_3 \\ S_4 \\ S_5 \\ S_6 \end{bmatrix}
=
\begin{bmatrix} (n+1) \\ \tfrac{1}{2}(n+1)^2 \\ \tfrac{1}{3}(n+1)^3 \\ \tfrac{1}{4}(n+1)^4 \\ \tfrac{1}{5}(n+1)^5 \\ \tfrac{1}{6}(n+1)^6 \\ \tfrac{1}{7}(n+1)^7 \end{bmatrix}$$

Solving this system of equations by inverting the coefficient matrix on the left side, we get

$$\begin{bmatrix} S_0 \\ S_1 \\ S_2 \\ S_3 \\ S_4 \\ S_5 \\ S_6 \end{bmatrix}
=
\begin{bmatrix}
1 & & & & & & \\
-\tfrac{1}{2} & 1 & & & & & \\
\tfrac{1}{6} & -1 & 1 & & & & \\
0 & \tfrac{1}{2} & -\tfrac{3}{2} & 1 & & & \\
-\tfrac{1}{30} & 0 & 1 & -2 & 1 & & \\
0 & -\tfrac{1}{6} & 0 & \tfrac{5}{3} & -\tfrac{5}{2} & 1 & \\
\tfrac{1}{42} & 0 & -\tfrac{1}{2} & 0 & \tfrac{5}{2} & -3 & 1
\end{bmatrix}
\begin{bmatrix} (n+1) \\ \tfrac{1}{2}(n+1)^2 \\ \tfrac{1}{3}(n+1)^3 \\ \tfrac{1}{4}(n+1)^4 \\ \tfrac{1}{5}(n+1)^5 \\ \tfrac{1}{6}(n+1)^6 \\ \tfrac{1}{7}(n+1)^7 \end{bmatrix}$$
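The inversion can be reproduced exactly in rational arithmetic. Here is a minimal Python sketch that builds the coefficient matrix and inverts it by forward substitution, exploiting the fact that the matrix is lower triangular with unit diagonal (the variable names are just for illustration):

```python
from fractions import Fraction
from math import comb

N = 7
# Coefficient matrix M[k][m] = C(k,m)/(k-m+1), lower triangular
M = [[Fraction(comb(k, m), k - m + 1) if m <= k else Fraction(0)
      for m in range(N)] for k in range(N)]

# Invert by forward substitution: since M[k][k] = 1, column j of the
# inverse B satisfies B[k][j] = e_kj - sum_{m<k} M[k][m] * B[m][j]
B = [[Fraction(0)] * N for _ in range(N)]
for j in range(N):
    for k in range(N):
        e = Fraction(1) if k == j else Fraction(0)
        B[k][j] = e - sum(M[k][m] * B[m][j] for m in range(k))

print(B[4][:5])  # row for S_4: -1/30, 0, 1, -2, 1
```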

Letting bk,j denote the element in the jth column of the kth row of the coefficient matrix, we can write the sums of the kth powers of the first n integers as

$$S_k(n) \;=\; \sum_{j=0}^{k} b_{k,j}\, \frac{(n+1)^{j+1}}{j+1} \qquad\qquad (1)$$

Thus we have

$$\begin{aligned}
S_4(n) &= -\tfrac{1}{30}(n+1) + \tfrac{1}{3}(n+1)^3 - \tfrac{1}{2}(n+1)^4 + \tfrac{1}{5}(n+1)^5 \\
S_5(n) &= -\tfrac{1}{12}(n+1)^2 + \tfrac{5}{12}(n+1)^4 - \tfrac{1}{2}(n+1)^5 + \tfrac{1}{6}(n+1)^6 \\
S_6(n) &= \tfrac{1}{42}(n+1) - \tfrac{1}{6}(n+1)^3 + \tfrac{1}{2}(n+1)^5 - \tfrac{1}{2}(n+1)^6 + \tfrac{1}{7}(n+1)^7
\end{aligned}$$

and so on. Expanding the binomials and simplifying, these expressions reduce to

$$\begin{aligned}
S_4(n) &= \frac{n(n+1)(2n+1)\left(3n^2 + 3n - 1\right)}{30} \\
S_5(n) &= \frac{n^2 (n+1)^2 \left(2n^2 + 2n - 1\right)}{12} \\
S_6(n) &= \frac{n(n+1)(2n+1)\left(3n^4 + 6n^3 - 3n + 1\right)}{42}
\end{aligned}$$
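These factored formulas can be checked against brute-force summation; a short Python sketch (illustrative function names, integer division is exact because each numerator is divisible by its denominator):

```python
# Check the factored formulas for S4, S5, S6 against direct summation.

def power_sum(k, n):
    """Sum of the k-th powers of 1..n, computed term by term."""
    return sum(j**k for j in range(1, n + 1))

def s4(n): return n*(n+1)*(2*n+1)*(3*n**2 + 3*n - 1) // 30
def s5(n): return n**2 * (n+1)**2 * (2*n**2 + 2*n - 1) // 12
def s6(n): return n*(n+1)*(2*n+1)*(3*n**4 + 6*n**3 - 3*n + 1) // 42

for n in range(0, 30):
    assert s4(n) == power_sum(4, n)
    assert s5(n) == power_sum(5, n)
    assert s6(n) == power_sum(6, n)
print("formulas verified")
```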

and so on. It appears that Sk for every positive even value of k is divisible by the factors n, (n+1), and (2n+1). It would be interesting to know what other divisibility patterns these sums possess. Also, notice that the elements of the coefficient array satisfy the relation

$$b_{k,j} \;=\; C_{k,j}\, b_{k-j,0} \qquad\qquad (2)$$

and using this relation we can re-write equation (1) as

$$S_k(n) \;=\; \sum_{j=0}^{k} C_{k,j}\, b_{k-j,0}\, \frac{(n+1)^{j+1}}{j+1} \;=\; \frac{1}{k+1} \sum_{j=0}^{k} C_{k+1,j}\, b_{j,0}\, (n+1)^{k+1-j}$$

Therefore, letting Bk[x] denote the polynomial

$$B_k[x] \;=\; \sum_{j=0}^{k} C_{k,j}\, b_{j,0}\, x^{k-j}$$

we can write the sum of the kth powers of the first n non-negative integers succinctly as

$$S_k(n) \;=\; \frac{B_{k+1}[n+1] - B_{k+1}[0]}{k+1}$$
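This compact formula can be checked directly, taking the b_{j,0} values from the first column of the inverted coefficient matrix above (a sketch with hypothetical helper names):

```python
from fractions import Fraction
from math import comb

# First-column entries b_{j,0} of the inverted coefficient matrix
b0 = [Fraction(1), Fraction(-1, 2), Fraction(1, 6), Fraction(0),
      Fraction(-1, 30), Fraction(0), Fraction(1, 42)]

def bernoulli_poly(k, x):
    """B_k[x] = sum_j C(k,j) * b_{j,0} * x^(k-j)."""
    return sum(comb(k, j) * b0[j] * x**(k - j) for j in range(k + 1))

def S(k, n):
    """S_k(n) = (B_{k+1}[n+1] - B_{k+1}[0]) / (k+1)."""
    return (bernoulli_poly(k + 1, Fraction(n + 1))
            - bernoulli_poly(k + 1, Fraction(0))) / (k + 1)

# Compare against the definition (sum over 0..n, so S_0(n) = n+1)
for k in range(0, 6):
    for n in range(0, 12):
        assert S(k, n) == sum(Fraction(j)**k for j in range(0, n + 1))
print("Bernoulli-polynomial formula verified")
```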

Notice also that equation (2) can be written as

$$\frac{b_{k,j}}{C_{k,j}} \;=\; b_{k-j,0}$$

from which it follows that

$$b_{k,j} \;=\; \frac{C_{k,j}}{C_{k-1,j-1}}\, b_{k-1,j-1} \;=\; \frac{k}{j}\, b_{k-1,j-1}$$

Thus we can express all the bk,j values in terms of binomial coefficients and the values of b0,0, b1,0, b2,0, b3,0, and so on. These numbers are called the Bernoulli numbers, usually denoted by Bk = bk,0. To easily generate these numbers (without inverting large matrices), consider our previous expression for Sk in the special case of n = 0, so that Sk = 0 for all k > 0. In this case we have

$$0 \;=\; \frac{1}{k+1} \sum_{j=0}^{k} C_{k+1,j}\, B_j \qquad (k > 0)$$

Hence we have the recurrence relation

$$B_k \;=\; -\frac{1}{k+1} \sum_{j=0}^{k-1} C_{k+1,j}\, B_j$$
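This recurrence makes the Bernoulli numbers easy to generate mechanically; a few lines of Python suffice (the function name is just illustrative):

```python
from fractions import Fraction
from math import comb

def bernoulli_numbers(count):
    """B_k = -(1/(k+1)) * sum_{j<k} C(k+1,j) * B_j, starting from B_0 = 1."""
    B = [Fraction(1)]
    for k in range(1, count):
        B.append(-Fraction(1, k + 1)
                 * sum(comb(k + 1, j) * B[j] for j in range(k)))
    return B

print(bernoulli_numbers(9))
# B_0..B_8: 1, -1/2, 1/6, 0, -1/30, 0, 1/42, 0, -1/30
```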

For example, given the values of B0 = 1, B1 = -1/2, B2 = 1/6, B3 = 0, we can compute B4 from the relation

$$B_4 \;=\; -\frac{1}{5}\left( C_{5,0} B_0 + C_{5,1} B_1 + C_{5,2} B_2 + C_{5,3} B_3 \right)$$

Inserting the values of B0 through B3, we find that B4 equals -1/30.

 

The Bernoulli numbers were introduced by Jacques Bernoulli around 1690 in a short paper on computing the sums of powers. He was obviously pleased with his formulas when he wrote

 

With the help of [these formulas] it took me less than half of a quarter of an hour to find that the 10th powers of the first 1000 numbers being added together will yield the sum

        91409924241424243424241924242500

From this it will become clear how useless was the work of Ismael Bullialdus spent on the compilation of his voluminous Arithmetica Infinitorum in which he did nothing more than compute with immense labor the sums of the first six powers, which is only a part of what we have accomplished in the space of a single page.

 

(One senses how distasteful Jacques found it to belittle one of his colleagues.) Even with the formula for the sums of 10th powers, it's impressive that Bernoulli could compute that number by hand in seven and a half minutes. Of course, choosing N = 1000 in base 10 probably made the arithmetic easier. In any case, one must agree that the labor Bullialdus put into his tables of sums of powers was wasted, considering how easy it is to derive the explicit formulas for these sums.
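Bernoulli's claim is easy to verify by machine today; a brute-force summation (not, of course, his method) reproduces the quoted value:

```python
# Brute-force check of Bernoulli's sum of the 10th powers of 1..1000
total = sum(j**10 for j in range(1, 1001))
assert total == 91409924241424243424241924242500
print(total)
```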

 

Incidentally, given the formulas for the sums of powers, it follows that if f(k) is any polynomial function of k, we can easily express the sum of f(k) for k = 0 to n by considering the sums of each separate term, which are pure sums of powers. This is the origin of the Euler-Maclaurin formula, which is just a generalization of Bernoulli's formula, replacing x^k with an arbitrary function f(x).
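For instance, using the formulas for S1 and S3 derived earlier, the sum of the arbitrary (purely illustrative) polynomial f(k) = 2k^3 - k + 5 over k = 0 to n can be assembled term by term:

```python
# Sum f(k) = 2k^3 - k + 5 over k = 0..n using the power-sum formulas.
def S1(n): return n * (n + 1) // 2
def S3(n): return (n * (n + 1) // 2) ** 2

def poly_sum(n):
    # sum_{k=0}^{n} (2k^3 - k + 5) = 2*S3(n) - S1(n) + 5*(n+1)
    return 2 * S3(n) - S1(n) + 5 * (n + 1)

for n in range(0, 20):
    assert poly_sum(n) == sum(2*k**3 - k + 5 for k in range(0, n + 1))
print("polynomial sum verified")
```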

 
