LOTUS

by Daniel Pollithy

If you want to calculate the expected value of a continuous random variable that has been transformed by a monotonic function, the law of the unconscious statistician (LOTUS) provides a convenient shortcut.

Transforming a random variable

We have a continuous random variable \(X\) with a probability density function \(f_{X}(x)\). This could, for example, be our current knowledge about the position of a robot.

Now, we apply a continuous, monotonic function \(g(\cdot)\) to the random variable. This could be a simple motion model for the robot.

If we want to calculate \(E[g(X)]\), LOTUS tells us that we do not have to derive the distribution of \(Y = g(X)\) first; instead we can write:

\[E[g(X)] = \int_{-\infty}^{+\infty}{g(x) \cdot f_{X}(x) dx}\]
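As a quick numerical sanity check, the sketch below compares the LOTUS integral with a Monte Carlo estimate of \(E[g(X)]\). The concrete choices \(X \sim \mathcal{N}(0, 1)\) and \(g(x) = e^{x}\) are only illustrative assumptions; both results should be close to \(e^{1/2} \approx 1.6487\).

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

# Illustrative assumptions: X ~ N(0, 1) and g(x) = exp(x).
g = np.exp
rng = np.random.default_rng(0)

# Monte Carlo estimate of E[g(X)]: transform samples of X and average them.
samples = rng.standard_normal(1_000_000)
mc_estimate = g(samples).mean()

# LOTUS: integrate g(x) * f_X(x) over the real line, without ever deriving
# the distribution of Y = g(X).
lotus_value, _ = quad(lambda x: g(x) * norm.pdf(x), -np.inf, np.inf)

print(mc_estimate, lotus_value)  # both close to exp(0.5) ≈ 1.6487
```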

Proof

For the proof, let's assume \(g\) to be differentiable and strictly increasing with \(g'(x) > 0\) for every x (the strictly decreasing case works analogously). This makes \(g\) bijective onto its range, so \(g\) is invertible. We call its inverse \(g^{-1}\), with \(g^{-1}: Y \rightarrow X\).

Prepare change of variable

The derivative of a function and the derivative of its inverse are related. They are reciprocals of each other:

\[\frac{dx}{dy} \cdot \frac{dy}{dx} = 1\] \[\frac{dx}{dy} = \frac{1}{\frac{dy}{dx}}\]

First we replace \(y\) with \(g(x)\) on the right side:

\[\frac{dx}{dy}= \frac{1}{\frac{d g(x)}{dx}}\]

Second we replace \(x\) with \(g^{-1}(y)\) on the right side:

\[\frac{dx}{dy}= \frac{1}{\frac{d g(g^{-1}(y))}{dx}}\]

Multiplying both sides by \(dy\):

\[dx = \frac{1}{\frac{d g(g^{-1}(y))}{dx}} dy\]
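For example, with the illustrative choice \(g(x) = e^{x}\) and \(g^{-1}(y) = \ln y\), the reciprocal relation reads:

\[\frac{d g^{-1}(y)}{dy} = \frac{1}{y} = \frac{1}{e^{\ln y}} = \frac{1}{\frac{d g(g^{-1}(y))}{dx}}\]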

Expected value with exchanged variable

We start with the right-hand side of the LOTUS formula:

\[\int_{-\infty}^{+\infty}{g(x) \cdot f_{X}(x) dx}\]

Now we can switch from x to y. First, replace \(g(x)\) with \(y\). Second, replace \(f_{X}(x)\) with \(f_{X}(g^{-1}(y))\). Third, replace \(dx\) with the right-hand side of the “dx = …” equation above:

\[\int_{-\infty}^{+\infty}{g(x) \cdot f_{X}(x) dx} = \int_{-\infty}^{+\infty}{y \cdot f_{X}(g^{-1}(y)) \frac{1}{\frac{d g(g^{-1}(y))}{dx}} dy}\]

We have now switched to integrating over y. (Strictly speaking, the limits of integration become \(\lim_{x \to -\infty} g(x)\) and \(\lim_{x \to +\infty} g(x)\); we keep writing \(\pm\infty\) to keep the notation simple.)
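To make the substitution concrete, here is a small numerical sketch, again under the illustrative assumptions \(X \sim \mathcal{N}(0, 1)\) and \(g(x) = e^{x}\), for which the y-range is \((0, \infty)\):

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

# Left-hand side: integrate g(x) * f_X(x) over x on the whole real line.
lhs, _ = quad(lambda x: np.exp(x) * norm.pdf(x), -np.inf, np.inf)

# Right-hand side: after the change of variable y = exp(x) we have
# g^{-1}(y) = log(y), and dg/dx evaluated at g^{-1}(y) equals y,
# so the integrand becomes y * f_X(log(y)) * (1 / y) on (0, inf).
rhs, _ = quad(lambda y: y * norm.pdf(np.log(y)) * (1.0 / y), 0.0, np.inf)

print(lhs, rhs)  # both ≈ exp(0.5)
```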

Cumulative distribution function

\[F_{Y}(y) = Pr(Y \le y)\]

Substitute \(Y = g(X)\):

\[F_{Y}(y) = Pr(g(X) \le y)\]

Apply \(g^{-1}\) to both sides of the inequality. Since \(g\) is strictly increasing, this preserves the inequality:

\[F_{Y}(y) = Pr(X \le g^{-1}(y))\] \[F_{Y}(y) = F_{X}(g^{-1}(y))\]
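Staying with the illustrative example \(X \sim \mathcal{N}(0, 1)\) and \(g(x) = e^{x}\), the transformed variable \(Y = e^{X}\) is log-normal, and the relation \(F_{Y}(y) = F_{X}(g^{-1}(y))\) can be checked numerically (the sketch assumes SciPy's standard log-normal parameterization):

```python
import numpy as np
from scipy.stats import norm, lognorm

# Illustration: X ~ N(0, 1), Y = exp(X), i.e. Y ~ lognorm(s=1) in SciPy terms.
y = np.linspace(0.1, 10.0, 50)

F_Y_direct = lognorm.cdf(y, s=1)    # F_Y(y) from Y's own distribution
F_Y_via_X = norm.cdf(np.log(y))     # F_X(g^{-1}(y)) with g^{-1}(y) = log(y)

print(np.allclose(F_Y_direct, F_Y_via_X))  # True
```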

Derivative of CDF

Now we take the derivative of \(F_{Y}(y)\) with respect to y in order to get \(f_{Y}(y)\), using the chain rule. Note that this is the place where we need the derivative of \(g^{-1}(y)\), which is \(\frac{1}{\frac{d g(g^{-1}(y))}{dx}}\)!

Apply the CDF solution from above:

\[f_{Y}(y) = \frac{d}{dy} F_{Y}(y) = \frac{d}{dy} F_{X}(g^{-1}(y))\]

Apply the chain rule of differentiation:

\[= f_{X}(g^{-1}(y)) \cdot \frac{d}{dy} g^{-1}(y)\] \[= f_{X}(g^{-1}(y)) \cdot \frac{1}{\frac{d g(g^{-1}(y))}{dx}}\]
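The same illustrative example also checks this density formula: for \(Y = e^{X}\) we have \(g^{-1}(y) = \ln y\) and \(\frac{d g(g^{-1}(y))}{dx} = y\), so \(f_{Y}(y) = f_{X}(\ln y) \cdot \frac{1}{y}\), which should match the log-normal density:

```python
import numpy as np
from scipy.stats import norm, lognorm

y = np.linspace(0.1, 10.0, 50)

f_Y_direct = lognorm.pdf(y, s=1)              # f_Y(y) from Y's own distribution
f_Y_via_X = norm.pdf(np.log(y)) * (1.0 / y)   # f_X(g^{-1}(y)) / g'(g^{-1}(y))

print(np.allclose(f_Y_direct, f_Y_via_X))     # True
```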

Plug-in

By the definition of the expected value of \(Y = g(X)\), and plugging in the density \(f_{Y}(y)\) we just derived, we get:

\[E[g(X)] = \int_{-\infty}^{+\infty}{y \cdot f_{Y}(y) dy} = \int_{-\infty}^{+\infty}{y \cdot f_{X}(g^{-1}(y)) \frac{1}{\frac{d g(g^{-1}(y))}{dx}} dy}\]

Looking at the last formula from the section “Prepare change of variable”, we find that \(f_{X}(g^{-1}(y)) \cdot \frac{1}{\frac{d g(g^{-1}(y))}{dx}} dy\) is the same as \(f_{X}(g^{-1}(y)) dx\):

\[E[g(X)] = \int_{-\infty}^{+\infty}{y \cdot f_{X}(g^{-1}(y)) dx}\]

By definition, \(g^{-1}(y)\) can be replaced by \(x\).

\[E[g(X)] = \int_{-\infty}^{+\infty}{y \cdot f_{X}(x) dx}\]

And \(y\) can be replaced by \(g(x)\), which gives us the LOTUS formula again:

\[E[g(X)] = \int_{-\infty}^{+\infty}{g(x) \cdot f_{X}(x) dx}\]