
Ok, so that's why logarithms are historically important. Why do we still use them so much nowadays? We have calculators aplenty to multiply numbers for us, so why put logarithms front and center among our preliminaries?

There are two reasons. The first is in analysis. When we are analysing our formulas and algorithms, we deal with a lot of complicated expressions. Here is an example: the famous normal distribution. Its probability density curve looks like this. It's defined by the complicated formula on the left (this normal function has mean 0 and variance 1).

The only thing we need to worry about now is that it's positive everywhere, so we can take its logarithm. This changes the function, but in a very predictable way: for instance, because the logarithm is strictly increasing, wherever the normal function increases, so does its logarithm. This means that, for example, the peak is in the same place for both functions.
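We can check this numerically. The sketch below (a minimal illustration, assuming the standard normal density with mean 0 and variance 1) evaluates the density and its logarithm on a grid and confirms that both attain their maximum at the same point.

```python
import numpy as np

# Standard normal density (mean 0, variance 1) on a grid of x values.
x = np.linspace(-4.0, 4.0, 801)
f = np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)

# f is positive everywhere, so its logarithm is well defined.
log_f = np.log(f)

# Because log is strictly increasing, the peak lands at the same x.
assert np.argmax(f) == np.argmax(log_f)
print(x[np.argmax(f)])  # prints 0.0
```

The same check works for any positive function: a strictly increasing transformation never moves the location of the maximum, only its height.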

So, if we start with the complicated function on the left, and take its (natural) logarithm, we end up with the function on the right. See if you can show this with the properties from the previous slide, it's good practice.

In "log space" the function still has some complicated bits, like the part in gray, but we can note that these do not depend on x. It's just some number. That means that this logarithmic function is just a simple parabola: the most complicated part is just the square of x. Parabolas should be much more familiar to you than the complicated function on the left.
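Working this out (a sketch, assuming the standard normal density with mean 0 and variance 1), the product and exponent rules of the logarithm give:

$$
\ln\left(\frac{1}{\sqrt{2\pi}}\, e^{-x^2/2}\right)
= \ln\frac{1}{\sqrt{2\pi}} + \ln e^{-x^2/2}
= -\tfrac{1}{2}\ln(2\pi) - \frac{x^2}{2}.
$$

The first term is the constant part that does not depend on x; what remains is the downward parabola in x.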

This is the first reason to use logarithms. For many of the functions we will want to analyse, taking their logarithm retains the important properties, but simplifies their expression.

click image for animation