Saturday, February 22, 2014

Log, log, it's big, it's heavy, it's wood

Every time I go for a jog, the first mile takes me 10 minutes. The second mile takes me 20 minutes. The third mile takes me 40 minutes. The next mile, of course, takes me 80 minutes. How long does it take me to finish my 10-mile jog? I think this "logarithmic jogging" fad is so silly...

That's why I got into "root jogging" instead! The first mile takes 10 minutes. The second mile takes 30 minutes. The third mile takes 50 minutes. The fourth mile takes 70 minutes... yet I finish my 10-mile jog so much faster!
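The totals are easy to tally by just extending the two patterns above: under logarithmic jogging mile n takes 10·2^(n-1) minutes, while under root jogging mile n takes 10(2n - 1) minutes. A quick sketch:

```python
# "Logarithmic jogging": each mile takes twice as long as the last
# (10, 20, 40, ...), so distance grows like the log of elapsed time.
log_jog_total = sum(10 * 2 ** (n - 1) for n in range(1, 11))

# "Root jogging": each mile takes 20 minutes longer than the last
# (10, 30, 50, ...), so n miles take 10n^2 minutes total and distance
# grows like the square root of elapsed time.
root_jog_total = sum(10 * (2 * n - 1) for n in range(1, 11))

print(log_jog_total)   # minutes for the 10-mile logarithmic jog
print(root_jog_total)  # minutes for the 10-mile root jog
```

Ten miles of logarithmic jogging takes 10·(2^10 - 1) = 10,230 minutes (about a week); root jogging finishes the same distance in 10·10^2 = 1,000 minutes.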

How many intersections are there between y = √x and y = log₂(x + 1)? A lot of students would look at this graph



and say "one." Just like the logarithmic vs. root jogging, students notice the initial growth rate but fail to continue the pattern. Doubling the time to jog each mile will make jogging the fifth mile much slower under logarithmic jogging (160 minutes) compared to root jogging (90 minutes). Come to think of it, how many will recognize the question is exactly the same as this: How many intersections between y = x^2 and y = 2^x - 1? (Each of these curves is the inverse of one of the originals, so the picture is just the first graph reflected across y = x.)
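A numeric check is a nice reality test for students (my own sketch, not from the graph above): count the sign changes of x^2 - (2^x - 1) on a grid, offsetting the grid slightly so it never lands exactly on the roots at x = 0 and x = 1.

```python
def d(x):
    """Difference between the two curves: x^2 - (2^x - 1)."""
    return x * x - (2 ** x - 1)

# Sample on a grid shifted by 0.005 so we never evaluate exactly at the
# roots x = 0 and x = 1, where the sign test would be ambiguous.
xs = [-1 + 0.005 + 0.01 * k for k in range(800)]  # covers roughly (-1, 7)
signs = [d(x) > 0 for x in xs]
crossings = sum(1 for a, b in zip(signs, signs[1:]) if a != b)
print(crossings)
```

Three crossings: x = 0, x = 1, and one between 4 and 5, where the exponential finally overtakes the parabola for good.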


On the other hand, it is easier to internalize and understand that exponential growth is insanely fast than it is to internalize that logarithmic growth is insanely slow.

To be fair, how logarithms work is pretty confusing. John Napier's purpose in developing logarithms was to simplify complicated multiplications, specifically, multiplications of long trigonometric decimals, such as sin(1)*cos(1). Before calculators, this kind of multiplication would be quite tedious. But logarithms turn multiplication into addition, so log[sin(1)*cos(1)] = log[sin(1)] + log[cos(1)]. All one needs to do is look up the logarithms of the two numbers in a table, add them together (much faster than multiplication), and then look up the inverse logarithm in a table.
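Here is a sketch of that table-lookup workflow in Python, where `math.log10` and a power of ten stand in for the Napier-era log and antilog tables:

```python
import math

a, b = math.sin(1.0), math.cos(1.0)  # angles in radians

# Step 1: "look up" the logs of the two factors.
log_a, log_b = math.log10(a), math.log10(b)

# Step 2: add them -- far cheaper by hand than long multiplication.
log_product = log_a + log_b

# Step 3: "look up" the antilog to recover the product.
product = 10 ** log_product

print(product, a * b)  # the two values agree
```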

The identity that logarithms turn multiplication into addition is quite mind-bending. Here is one way to demonstrate this fact: On a logarithm graph, a horizontal compression is the same thing as a positive vertical translation.


(Download the GeoGebra file here.)
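The same fact can be checked numerically (my own quick check, separate from the GeoGebra file): compressing y = ln(x) horizontally by a factor k gives y = ln(kx), which differs from ln(x) by the constant ln(k) at every x, i.e. a vertical translation.

```python
import math

k = 2.0  # horizontal compression factor (an arbitrary choice for the demo)
xs = [0.1, 0.5, 1.0, 3.0, 100.0]

# ln(kx) - ln(x) is the same constant, ln(k), at every x: a horizontal
# compression of the log graph is exactly a vertical translation.
gaps = [math.log(k * x) - math.log(x) for x in xs]
print(gaps)            # every entry is ln(2), about 0.6931
print(math.log(k))
```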

In a pre-calculus class, I would want to establish that ln(x) sits in between reciprocal functions and root functions by graphing all three types, such as f(x) = -10x^(-1/10) + 10, g(x) = ln(x), and h(x) = 10x^(1/10) - 10.


The root function (in blue) grows without limit but has a finite value at x = 0. The ln(x) (in green) grows without limit but goes to negative infinity as x goes to 0. The reciprocal function (in red) also goes to negative infinity as x goes to 0, but levels off toward a horizontal asymptote (y = 10 here). Any reciprocal function of this type has some horizontal asymptote. The ln(x) grows as slowly as possible without having a horizontal asymptote.
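A quick numeric check of that picture (my own sketch, using the three functions graphed above): f(x) ≤ ln(x) ≤ h(x) for every x > 0, with equality only at x = 1.

```python
import math

def f(x):
    """Reciprocal-type function: horizontal asymptote at y = 10."""
    return -10 * x ** (-1 / 10) + 10

def h(x):
    """Root-type function: finite value (-10) at x = 0."""
    return 10 * x ** (1 / 10) - 10

# ln(x) stays sandwiched between the two across many orders of magnitude.
xs = [0.001, 0.1, 1.0, 10.0, 1e6, 1e12]
print(all(f(x) <= math.log(x) <= h(x) for x in xs))
```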

In a calculus class, once the students have seen l'Hopital's rule, they can prove that ln(x) fits in between these two types of functions. Both the reciprocal and root functions are of the form y = n·x^(1/n) - n (n = 10 gives the root function above, n = -10 the reciprocal). Here is the short version: solve for x to get

x = ((y + n)/n)^n

and then take the limit:

lim(n→∞) ((y + n)/n)^n = lim(n→∞) (1 + y/n)^n = e^y = x

So y = ln(x).
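The limit is easy to watch converge numerically (a sketch of my own; n is 10 in the graphs above, but pushing n higher shows where ln(x) comes from):

```python
import math

def almost_ln(x, n):
    """n * (x**(1/n) - 1), i.e. the root/reciprocal family n*x^(1/n) - n."""
    return n * (x ** (1 / n) - 1)

x = 5.0
for n in (10, 1000, 1_000_000):
    print(n, almost_ln(x, n))  # approaches ln(5), about 1.6094, as n grows
print(math.log(x))
```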

For the same reason, one can show this:


One final thought about logs. In a B.C. Calculus course, I ask my students to look at the Taylor series, centered at x = 1, for the reciprocal, logarithmic, and root functions I mentioned above. The Taylor series are:

h(x) = 10x^(1/10) - 10 ≈ (x - 1) - 0.45(x - 1)^2 + 0.285(x - 1)^3 - 0.207(x - 1)^4 + ...
g(x) = ln(x) ≈ (x - 1) - 0.5(x - 1)^2 + 0.333(x - 1)^3 - 0.25(x - 1)^4 + ...
f(x) = -10x^(-1/10) + 10 ≈ (x - 1) - 0.55(x - 1)^2 + 0.385(x - 1)^3 - 0.298(x - 1)^4 + ...
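Those coefficients don't need a CAS. Repeatedly differentiating n·x^(1/n) - n at x = 1 gives the degree-k coefficient (1/k!)·(1/n - 1)(1/n - 2)···(1/n - k + 1), and ln(x) has the familiar (-1)^(k+1)/k. A sketch of my own that tabulates them:

```python
from math import factorial, prod

def series_coeff(n, k):
    """Degree-k Taylor coefficient of n*x**(1/n) - n about x = 1."""
    return prod(1 / n - m for m in range(1, k)) / factorial(k)

def ln_coeff(k):
    """Degree-k Taylor coefficient of ln(x) about x = 1."""
    return (-1) ** (k + 1) / k

# Columns: degree, root function (n = 10), ln(x), reciprocal (n = -10).
for k in range(1, 5):
    root, log_, recip = series_coeff(10, k), ln_coeff(k), series_coeff(-10, k)
    print(k, round(root, 3), round(log_, 3), round(recip, 3))
```

Degree by degree, the magnitude of the ln(x) coefficient lands between the other two.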

As you can see, the coefficients for ln(x) fit right in between the other two.

One final thought on ln(x): a really neat property about its radius of convergence.
