
The expected value is the mean of the values of a random variable, each weighted by its probability of occurrence. The expected value (or expectation, or mean) of a random variable (RV) X is denoted E[X] (or sometimes μ).

Mathematical Definition

The RV X of a random experiment is defined on a probability space (Ω, Σ, P), where Ω is the underlying sample space, Σ is the set of events, and P is the probability measure. X maps outcomes in Ω to values in R.

If X is a continuous random variable (i.e., X takes values in an interval of R), the expected value of X is defined as

E[X] = ∫Ω X dP.

If X also has a probability density function (pdf) f(x) of a certain probability distribution, the above expected value of X can be formulated as

E[X] = ∫ x f(x) dx,

where the integral runs over the whole real line.

The expected value exists if the above integral is absolutely convergent, that is, if ∫ |x| f(x) dx is finite.
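As a numeric illustration (not from the text), the continuous-case formula E[X] = ∫ x f(x) dx can be checked with a simple midpoint Riemann sum. The exponential distribution with rate 2 is a hypothetical choice here; its true mean is 1/2.

```python
import math

# Pdf of an Exponential(rate) distribution: f(x) = rate * exp(-rate * x), x >= 0.
def exp_pdf(x, rate=2.0):
    return rate * math.exp(-rate * x)

def approx_expected_value(pdf, lo, hi, n=200_000):
    """Midpoint Riemann-sum approximation of E[X] = integral of x * pdf(x) dx over [lo, hi]."""
    dx = (hi - lo) / n
    return sum((lo + (i + 0.5) * dx) * pdf(lo + (i + 0.5) * dx)
               for i in range(n)) * dx

# Truncating the integral at 50 loses only a negligible tail mass.
mean = approx_expected_value(exp_pdf, 0.0, 50.0)
print(round(mean, 4))  # ≈ 0.5, the true mean 1/rate
```

The same quadrature sketch works for any pdf with a thin enough tail over the chosen interval.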

If X is a discrete random variable (i.e., Ω is countable) with probability mass function (pmf) p(x), the expected value of X is defined as

E[X] = Σx x p(x),

where the sum runs over all values x that X can take,

and the above expected value exists if the sum is absolutely convergent, that is, if Σx |x| p(x) is finite.

As a simple example, suppose X can take on two values, 0 and 1, which occur with probabilities .4 and .6. Then

E[X] = (0)(.4) + (1)(.6) = .6.
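The discrete definition and the two-point example above translate directly into code as a probability-weighted sum:

```python
# E[X] = sum over x of x * p(x), for a pmf given as a {value: probability} dict.
def expected_value_discrete(pmf):
    return sum(x * p for x, p in pmf.items())

# The two-point example from the text: values 0 and 1 with probabilities .4 and .6.
pmf = {0: 0.4, 1: 0.6}
print(expected_value_discrete(pmf))  # 0.6
```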

If G(X) is a function of the RV X, its expected value, E[G(X)], is a weighted average of the possible values of G(X) and is defined as

E[G(X)] = Σx G(x) p(x)

in the discrete case, and E[G(X)] = ∫ G(x) f(x) dx in the continuous case.

The above expected value exists if the corresponding sum (or integral) of |G(x)| weighted by the pmf (or pdf) is absolutely convergent.
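A small sketch of E[G(X)] in the discrete case: weight G(x), rather than x, by p(x). The fair-die pmf below is an illustrative choice, not from the text.

```python
# E[G(X)] = sum over x of G(x) * p(x) for a discrete RV.
def expected_value_of(g, pmf):
    return sum(g(x) * p for x, p in pmf.items())

die = {k: 1 / 6 for k in range(1, 7)}            # fair six-sided die
e_x = expected_value_of(lambda x: x, die)        # E[X] = 3.5
e_x2 = expected_value_of(lambda x: x ** 2, die)  # E[X^2] = 91/6
print(e_x, e_x2)
```

Note that E[G(X)] is computed from the distribution of X itself; there is no need to derive the distribution of G(X) first.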

Linear Relationship

If RVs X1, X2, …, Xn have expectations μ1, μ2, …, μn, and c1, c2, …, cn are all constants, then

E[c1X1 + c2X2 + … + cnXn] = c1μ1 + c2μ2 + … + cnμn.
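The linearity property can be verified directly on two small discrete RVs (the pmfs and constants below are hypothetical). Independence is assumed here only to build a joint pmf easily; linearity itself holds regardless of dependence.

```python
from itertools import product

pmf1 = {0: 0.4, 1: 0.6}   # hypothetical RV X1, mean 0.6
pmf2 = {1: 0.5, 3: 0.5}   # hypothetical RV X2, mean 2.0
c1, c2 = 2.0, -1.0

def mean(pmf):
    return sum(x * p for x, p in pmf.items())

# Left side: E[c1*X1 + c2*X2] computed over the joint pmf.
lhs = sum((c1 * x1 + c2 * x2) * p1 * p2
          for (x1, p1), (x2, p2) in product(pmf1.items(), pmf2.items()))

# Right side: c1*mu1 + c2*mu2 via the linearity formula.
rhs = c1 * mean(pmf1) + c2 * mean(pmf2)

print(lhs, rhs)  # both equal 2(0.6) - 1(2.0) = -0.8
```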

Interpretation

From a statistical point of view, the following terms are important: arithmetic mean (or simply mean), central tendency, and location statistic. The arithmetic mean of X is the summation of the set of observations (sample) of X = {x1, x2, …, xN} divided by the sample size N:

x̄ = (x1 + x2 + … + xN)/N.

x̄ is called the arithmetic mean/sample mean/average when used to estimate the location of a sample. When it is used to estimate the location of an underlying distribution, x̄ is called the population mean/average, or expectation/expected value, which can be denoted as E[X] or μ. This is consistent with the original definition because the probability of each value's occurrence is taken to be 1/N. One could construct a different estimate of the mean, for example, if some values were expected to occur more frequently than others. Because a researcher rarely has such information, the simple mean is most commonly used.
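The consistency noted above is easy to see in code: dividing the sum by N is the same as weighting each observation by probability 1/N. The sample values are hypothetical.

```python
sample = [2.0, 4.0, 4.0, 6.0, 9.0]   # hypothetical observations
N = len(sample)

arithmetic_mean = sum(sample) / N                 # sum of values over N
as_expectation = sum(x * (1 / N) for x in sample) # each value weighted by 1/N

print(arithmetic_mean)  # 5.0 either way
```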

Moment

The nth moment (a characteristic of a distribution) of X about the real number c is defined as

E[(X − c)n].

Moments about c = 0, E[Xn], are called raw moments, and moments about the mean, E[(X − μ)n], are called central moments. E[X] is the first moment (n = 1) of X about c = 0, which is commonly called the mean of X. The second moment about the mean of X is called the variance of X. Theoretically, the entire distribution of X can be described if all moments of X are known, by using the moment-generating function, although in practice the first few moments suffice to characterize the shape of most distributions. The third central moment is associated with skewness, and the fourth with kurtosis.
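A minimal sketch of moments about a point c, using the two-point RV from the earlier example: c = 0 gives raw moments, and c = E[X] gives central moments, with the second central moment being the variance.

```python
# n-th moment of a discrete RV about c: E[(X - c)^n].
def moment(pmf, n, c=0.0):
    return sum((x - c) ** n * p for x, p in pmf.items())

pmf = {0: 0.4, 1: 0.6}        # two-point example from the text
mu = moment(pmf, 1)           # first raw moment = mean = 0.6
var = moment(pmf, 2, c=mu)    # second central moment = variance
print(mu, var)                # 0.6 and 0.4(0.36) + 0.6(0.16) = 0.24
```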

...
