# Latest content was relocated to https://bintanvictor.wordpress.com. This old blog will be shut down soon.

## Friday, October 17, 2014

### (reverse) transforming a N -> LN random var

A lognormal random var ---- goes through the log() transform ----> a normal random variable
A normal random var ---- goes through the exp() transform ----> a lognormal random variable

In short form,
LN --log()--> N
N --exp()--> LN
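A quick numpy sketch of the two rules (variable names are my own, not from any library): exponentiating normal samples yields lognormal samples, and taking the log recovers them.

```python
import numpy as np

rng = np.random.default_rng(0)

# N --exp()--> LN: exponentiate normal samples to get lognormal samples
n = rng.normal(loc=0.0, scale=1.0, size=100_000)
ln = np.exp(n)       # lognormal samples, all strictly positive

# LN --log()--> N: log of lognormal samples recovers the normal samples
back = np.log(ln)

assert np.allclose(back, n)
assert (ln > 0).all()  # lognormal support is (0, +inf)
```

The two asserts confirm the round trip and the positive support of the lognormal samples.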

These are the 2 "Tanbin" rules to internalize. One of them is just a variation of the sound bite "lognormal means log of it is Normal".

Some self-quizzes --

Q: how can log() help transform a LN var to a N var?
Q: what if I apply exp() on a LN var?
A: an unfamiliar distribution. Note exp() accepts any input from -inf to +inf, but the LN input only covers (0, +inf), so the domain is underused.
A: N --exp()--> LN --exp()--> ... Note you are double-hitting exp()

Q: how can exp() help transform a LN var to a N var?
Q: what if I apply log() on a LN var?
Q: given a LN var, how to get a N var?
Q: how can log() help transform a N var to a LN var?
Q: how can exp() help transform a N var to a LN var?
Q: given a N var, how to get a LN var?
Q: how can a N var be transformed to a LN var?
Q: what if I apply exp() on a N var?
Q: how can a LN var be transformed to a N var?
Q: what if I apply log() on a N var?
A: crash and burn! A N var can take negative values, and log() is undefined for negative inputs.
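The last answer can be seen directly in numpy (a minimal sketch, with names of my own choosing): log() of the negative portion of a normal sample produces NaN.

```python
import numpy as np

rng = np.random.default_rng(1)
n = rng.normal(size=10)        # normal samples; some will be negative

with np.errstate(invalid="ignore"):   # silence the "invalid value" warning
    bad = np.log(n)            # log of a negative number -> NaN

assert (n < 0).any()                  # negative samples do occur
assert np.isnan(bad[n < 0]).all()     # and log() breaks on every one of them
```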