Patrick told me about this interesting paradoxical result in statistics called Stein’s example.

Charles Stein is a statistician whose 1956 paper challenged the conventional wisdom about multiparameter estimation.  If you want to estimate the mean of a Gaussian distribution, the usual estimator is the average of the observed values. Gauss actually showed that the expected squared error of the average is less than that of any other linear, unbiased function of the observed values.

But if you want to estimate the means of three or more Gaussian distributions at once, Stein showed that there is an estimator that always beats the individual averages in terms of total squared error.  Stein’s estimator is also called the James-Stein estimator.  It works by moving the individual averages toward the global average according to a scaling coefficient.  It’s counterintuitive that you could do better than the individual averages.  There’s a 1977 Scientific American paper, Stein’s Paradox in Statistics, that helped me understand the problem enough to at least describe it.
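To make the effect concrete, here’s a small simulation sketch. It uses the common textbook form of the James-Stein estimator, which shrinks the observations toward zero by the factor 1 − (d−2)σ²/‖x‖² (the paper’s version shrinks toward the grand average instead); the true means below are my own arbitrary choices, not from the paper.

```python
import numpy as np

def james_stein(x, sigma2=1.0):
    """Textbook James-Stein estimator: shrink observations toward zero.

    x: one observation per unknown mean (needs d >= 3 components),
       each drawn from N(theta_i, sigma2) with sigma2 known.
    """
    d = len(x)
    shrink = 1.0 - (d - 2) * sigma2 / np.dot(x, x)
    return shrink * x

rng = np.random.default_rng(0)
theta = np.array([1.0, -0.5, 2.0, 0.3, -1.2])  # hypothetical true means
trials = 20000

err_mle = 0.0  # total squared error of the raw observations
err_js = 0.0   # total squared error of the James-Stein estimates
for _ in range(trials):
    x = rng.normal(theta, 1.0)
    err_mle += np.sum((x - theta) ** 2)
    err_js += np.sum((james_stein(x) - theta) ** 2)

# The raw estimator's average total squared error is about d = 5;
# James-Stein comes in lower, illustrating Stein's paradox.
print("raw averages:", err_mle / trials)
print("James-Stein :", err_js / trials)
```

No matter which true means you plug in, the James-Stein total error stays below the raw estimator’s on average, which is exactly the dominance result Stein proved.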

Here’s an image from that paper that shows the scaling coefficient and how it changes with the standard deviation of each component.

