# Log probabilities trick

When computing likelihoods, the probabilities involved are often so small that they underflow floating-point representation, making the computation numerically unstable. It then makes sense to work with log probabilities instead. But this immediately raises another problem: how do you compute log(p + q) given only log(p) and log(q)?
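To see the underflow concretely, here is a minimal sketch: multiplying twenty probabilities of 1e-30 each gives 1e-600, which is far below what a 64-bit float can represent, while the sum of their logs is perfectly ordinary.

```python
import math

# Multiplying many tiny probabilities underflows to 0.0 in 64-bit floats,
# while summing their logs stays well within range.
probs = [1e-30] * 20

product = 1.0
for p in probs:
    product *= p
print(product)  # 0.0 -- the true value 1e-600 has underflowed

log_sum = sum(math.log(p) for p in probs)
print(log_sum)  # about -1381.55, easily representable
```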

Richard Durbin et al. explain a trick for this in *Biological Sequence Analysis: Probabilistic Models of Proteins and Nucleic Acids* (1998), section 3.6.

$\log(p + q) = \log(p ( 1 + \frac{q}{p}))$
$= \log(p) + \log(1 + \exp(\log(\frac{q}{p})))$
$= \log(p) + \log(1 + \exp(\log(q) - \log(p)))$

If p is chosen as the larger of the two, then $\log(q) - \log(p) \le 0$, so $\exp(\log(q) - \log(p))$ lies in $(0, 1]$ and cannot overflow. Durbin et al. note that, for speed, $\log(1 + \exp(x))$ can then be computed from a table of interpolated values, giving very close estimates of log(p + q).
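The derivation above can be sketched directly in code. This is a minimal illustration, not the table-lookup version Durbin et al. describe: the helper name `log_add` is mine, and it evaluates $\log(1 + \exp(x))$ directly via `math.log1p` rather than by interpolation.

```python
import math

def log_add(log_p, log_q):
    """Return log(p + q) given log(p) and log(q), without leaving log space."""
    # Make log_p the larger argument so the exponent below is <= 0
    # and exp() cannot overflow.
    if log_q > log_p:
        log_p, log_q = log_q, log_p
    # log(p + q) = log(p) + log(1 + exp(log(q) - log(p)))
    # log1p(x) computes log(1 + x) accurately even for small x.
    return log_p + math.log1p(math.exp(log_q - log_p))

# Works where naive exponentiation would underflow:
# exp(-1000) and exp(-1001) are both 0.0 as floats,
# yet their log-sum is computed fine.
print(log_add(-1000.0, -1001.0))
```

Using `math.log1p` instead of `math.log(1 + ...)` avoids precision loss when q is much smaller than p; the swap at the top is exactly the "choose p as the larger" step from the text.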