The most common case is maximum likelihood estimation, where the maximum likelihood estimate (MLE) maximizes likelihood (minimizes expected surprisal), which can be interpreted geometrically by using entropy to measure variation: the MLE minimizes cross-entropy (equivalently, relative entropy, the Kullback–Leibler divergence).
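As a minimal numerical sketch of this equivalence (the data values and the fixed-variance normal model are assumptions for illustration), the parameter that minimizes the average surprisal, i.e. the average negative log-likelihood, of a sample under N(mu, 1) is exactly the sample mean, which is the MLE of mu:

```python
import math
import statistics

# Made-up sample; model is a normal distribution with fixed sigma = 1.
data = [2.0, 3.5, 1.0, 4.0, 2.5]

def avg_surprisal(mu, xs, sigma=1.0):
    # Average negative log-density of the sample under N(mu, sigma^2):
    # this is the empirical cross-entropy the MLE minimizes.
    return sum(0.5 * math.log(2 * math.pi * sigma**2)
               + (x - mu) ** 2 / (2 * sigma**2) for x in xs) / len(xs)

# Search a grid of candidate values of mu over [0, 5].
candidates = [i / 100 for i in range(0, 501)]
best = min(candidates, key=lambda mu: avg_surprisal(mu, data))

print(best, statistics.mean(data))  # both are 2.6
```

The grid minimizer coincides with the sample mean, illustrating that maximizing likelihood and minimizing expected surprisal pick out the same "center."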
One of the median's advantages is that it is not sensitive to outliers. Observations that are significantly larger or smaller than the others in a sample can impact some statistical measures in such a way as to make them highly misleading, but the median is immune to them.
In other words, one extraordinarily high or low value in your dataset can dramatically raise or lower the mean. The mean, often shown as an x or a y variable with a line over it (pronounced "x-bar" or "y-bar"), is the sum of all the scores divided by the total number of scores.
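This contrast is easy to see numerically. The sketch below uses made-up values: one extreme observation pulls the mean far away, while the median barely moves:

```python
import statistics

# A small, well-behaved sample, then the same sample with one outlier.
sample = [10, 12, 11, 13, 12]
with_outlier = sample + [1000]

print(statistics.mean(sample), statistics.median(sample))  # 11.6 12
# The mean jumps to roughly 176, but the median stays at 12.0.
print(statistics.mean(with_outlier), statistics.median(with_outlier))
```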
Thus the standard deviation about the mean is lower than the standard deviation about any other point, and the maximum deviation about the midrange is lower than the maximum deviation about any other point.
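Both minimization properties can be checked numerically. In this sketch (the sample values and the search grid are arbitrary assumptions), a grid search over candidate centers recovers the mean as the minimizer of root-mean-square deviation and the midrange as the minimizer of maximum deviation:

```python
import statistics

# Arbitrary sample for illustration.
xs = [1.0, 4.0, 6.0, 9.0, 2.0]
mean = statistics.mean(xs)          # 4.4
midrange = (min(xs) + max(xs)) / 2  # 5.0

def rms_dev(c):
    # Root-mean-square deviation of xs about the point c.
    return (sum((x - c) ** 2 for x in xs) / len(xs)) ** 0.5

def max_dev(c):
    # Largest absolute deviation of xs about the point c.
    return max(abs(x - c) for x in xs)

grid = [i / 100 for i in range(0, 1001)]  # candidate centers in [0, 10]
best_rms = min(grid, key=rms_dev)
best_max = min(grid, key=max_dev)

print(best_rms, best_max)  # 4.4 5.0
```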
Fifty percent of the observations fall below the median.
Unlike the median, however, the mean is sensitive to outliers.

Information geometry

The notion of a "center" as minimizing variation can be generalized in information geometry as a distribution that minimizes divergence (a generalized distance) from a data set.
Correspondingly, the mode is not unique; for example, in a uniform distribution any point is the mode.
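A quick illustration of this non-uniqueness (the sample is a made-up discrete analogue of a uniform distribution, where every value occurs equally often):

```python
import statistics

# Every value appears exactly once, so each one is a mode.
uniform_sample = [1, 2, 3, 4]
print(statistics.multimode(uniform_sample))  # [1, 2, 3, 4]
```

`statistics.multimode` returns all modal values, whereas `statistics.mode` would arbitrarily report only the first one encountered.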