## What is expected value?

Expected value is an ‘average’ value, but a special type of average: the expected value of a random variable is its long-run average over many repetitions of an experiment.

Suppose we run a large number of trials of a random experiment, assigning a numeric value to each possible outcome. Consider a fair six-sided die, and let our random variable be the number obtained by rolling it. The expected value of this random variable is:

[latexpage] $$ E\left( X\right) =1\cdot \dfrac {1}{6}+2\cdot \dfrac {1}{6}+3\cdot \dfrac {1}{6}+4\cdot \dfrac {1}{6}+5\cdot \dfrac {1}{6}+6\cdot \dfrac {1}{6}=\dfrac {21}{6}=3.5 $$

This is the case of a discrete random variable, one whose possible outcomes are countable: the expected value is a sum in which each term is a possible value of the random variable multiplied by the probability of that value.
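The die calculation above can be sketched in a few lines of Python; the outcomes and uniform probabilities are the fair-die example from the text.

```python
# Expected value of a fair six-sided die: sum of each value times its
# probability. A minimal sketch of the discrete-case formula.
outcomes = [1, 2, 3, 4, 5, 6]
probabilities = [1 / 6] * 6  # fair die: each face is equally likely

expected_value = sum(x * p for x, p in zip(outcomes, probabilities))
print(expected_value)  # 3.5
```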

## What are the rules of expected value?

**#1** Let X be a discrete random variable with probability function fX(x). The expected value of X is

**[latexpage] $$ \begin{aligned} E\left( X\right) =\sum _{x}xf_{X}\left( x\right) =\sum _{x}xP\left( X= x\right) \end{aligned} $$**

**#2** Let X be a discrete random variable and g a function of X, written g(X). The expected value of g(X) is the long-term average of g applied to observations of X. Symbolically written E(g(X)), it represents applying the function g to each of N observed values of the random variable X. Suppose the observed values of X over N trials are x1, x2, …, xN. Applying g to each observation gives g(x1), …, g(xN). The average of g(x1), g(x2), …, g(xN) approaches E(g(X)) as the number of observations N tends to infinity.

**[latexpage] $$ \begin{aligned}E\left( g\left( X\right) \right) =\sum _{x}g\left( x\right) f_{X}\left( x\right) \\ =\sum _{x}g\left( x\right) P\left( X=x\right) \end{aligned} $$**

where fX(x) is the probability function of X.
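The long-run-average idea above can be checked by simulation: the exact weighted sum and a simulated average should agree for large N. Here g(x) = x² on the fair die is an illustrative choice of g.

```python
import random

# E(g(X)) two ways for a fair die with g(x) = x^2:
# the exact sum over the pmf, and the long-run average the text describes.
outcomes = [1, 2, 3, 4, 5, 6]

def g(x):
    return x ** 2  # illustrative function of X

exact = sum(g(x) * (1 / 6) for x in outcomes)  # sum_x g(x) f_X(x)

random.seed(0)
N = 200_000
simulated = sum(g(random.choice(outcomes)) for _ in range(N)) / N

print(exact)      # 15.1666...
print(simulated)  # close to the exact value for large N
```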

**#3** Let X be a continuous random variable with p.d.f. fX(x). Then the expected value of X is

[latexpage] $$ E\left( X\right) =\int ^{\infty }_{-\infty }xf_{X}\left( x\right) dx $$

[latexpage] $$ E\left( g\left( X\right) \right) =\int ^{\infty }_{-\infty }g\left( x\right) f_{X}\left( x\right) dx $$
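In the continuous case the integral can be approximated numerically. As an illustrative density (not from the text), the exponential distribution with rate 1 has true mean 1, so a simple Riemann sum should land near 1.

```python
import math

# Numeric check of E(X) = integral of x * f_X(x) dx for a continuous pdf,
# using a plain Riemann sum. Exponential(1) is an illustrative choice.
def f(x):
    return math.exp(-x)  # pdf of Exponential(1) on [0, infinity)

dx = 1e-4
# Truncate the integral at x = 50; the tail beyond that is negligible.
expected = sum((i * dx) * f(i * dx) * dx for i in range(int(50 / dx)))
print(expected)  # approximately 1.0
```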

Similarly, the conditional expectation of a discrete X given Y = y is

[latexpage] $$ \mu _{X|Y=y}=E\left( X|Y=y\right) =\sum _{x}xf_{X|Y}\left( x|y\right) $$
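A conditional expectation can be computed directly from a joint probability table. The table values below are made up purely for illustration.

```python
# E(X | Y = y) from a small joint probability table f_{X,Y}(x, y).
joint = {  # (x, y): P(X = x, Y = y) -- illustrative values only
    (0, 0): 0.1, (1, 0): 0.3,
    (0, 1): 0.2, (1, 1): 0.4,
}

def cond_expectation(y):
    # P(Y = y): sum the joint probabilities over the matching y.
    p_y = sum(p for (x, yy), p in joint.items() if yy == y)
    # E(X | Y = y) = sum_x x * f_{X|Y}(x | y)
    return sum(x * p / p_y for (x, yy), p in joint.items() if yy == y)

print(cond_expectation(0))  # 0.3 / 0.4 = 0.75
```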

## What are the properties of Expected Value calculation?

**#1** Let a and b be constants. Then for any random variable X (discrete or continuous),

**[latexpage] $$ E\left( aX+b\right) =aE\left( X\right) +b $$**
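Linearity can be verified exactly on the fair-die distribution; a = 2 and b = 5 are arbitrary constants chosen for the check.

```python
# Property #1: E(aX + b) = a*E(X) + b, checked on the fair-die pmf.
pmf = {x: 1 / 6 for x in range(1, 7)}
a, b = 2.0, 5.0  # arbitrary constants

lhs = sum((a * x + b) * p for x, p in pmf.items())  # E(aX + b)
rhs = a * sum(x * p for x, p in pmf.items()) + b    # a*E(X) + b
print(lhs, rhs)  # both 12.0, since E(X) = 3.5
```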

**#2** Let g and h be functions, and let a and b be constants. Then for any random variable X (discrete or continuous),

**[latexpage] $$ E\left\{ ag\left( X\right) +bh\left( X\right) \right\} =aE\left\{ g\left( X\right) \right\} +bE\left\{ h\left( X\right) \right\} $$**

**#3** Let X and Y be ANY random variables (discrete, continuous, independent, or non-independent). Then,

**[latexpage] $$ E\left( X+Y\right) =E\left( X\right) +E\left( Y\right) $$**

**#4** For ANY random variables X1, . . . , Xn,

**[latexpage] $$ E\left( X_{1}+\ldots +X_{n}\right) =E\left( X_{1}\right) +\ldots +E\left( X_{n}\right) $$**
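Additivity holds even for dependent variables, which is worth checking. Here Y = 7 − X is perfectly dependent on the die roll X, yet the expectations still add.

```python
# Properties #3/#4: E(X + Y) = E(X) + E(Y) even when X and Y are dependent.
# Y = 7 - X is fully determined by the die roll X.
pmf = {x: 1 / 6 for x in range(1, 7)}

e_x = sum(x * p for x, p in pmf.items())            # E(X)
e_y = sum((7 - x) * p for x, p in pmf.items())      # E(Y)
e_sum = sum((x + (7 - x)) * p for x, p in pmf.items())  # E(X + Y)
print(e_sum, e_x + e_y)  # both 7.0
```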

**#5** Let X and Y be independent random variables. Then

**[latexpage] $$ E\left( XY\right) =E\left( X\right) E\left( Y\right) $$**

**#6** Let X and Y be independent random variables, and g, h be functions. Then,

**[latexpage] $$ E\left( g\left( X\right) h\left( Y\right) \right) =E\left( g\left( X\right) \right) E\left( h\left( Y\right) \right) $$**
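The product rule for independent variables can be checked by enumerating the joint pmf of two independent fair dice exactly.

```python
# Properties #5/#6: for independent X and Y, E(XY) = E(X) * E(Y).
# Two independent fair dice, enumerated over all 36 equally likely pairs.
faces = range(1, 7)

e_x = sum(x / 6 for x in faces)                            # E(X) = 3.5
e_xy = sum(x * y * (1 / 36) for x in faces for y in faces)  # E(XY)
print(e_xy, e_x * e_x)  # both 12.25
```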

**#7** If f(X) is a function of X, then E(f(X)|X) = f(X): once X is known, any function of X, such as X^3, is also known.

**#8** If f(X) and g(X) are functions of X, then

**[latexpage] $$ E\left[ f\left( X\right) Y+g\left( X\right) |X\right] =f\left( X\right) E\left( Y|X\right) +g\left( X\right) $$**

**#9** If all the expectations below are finite, then for ANY random variables X and Y, we have:

**[latexpage] $$ E\left( X\right) =E_{Y}\left( E\left( X|Y\right) \right) $$**

This is known as the law of total expectation, or iterated expectations. It simply means that the unconditional expectation of X equals the expectation of its conditional expectation.

The total average is E(X); the case-by-case averages are E(X|Y) for the different values of Y; and the average of the case-by-case averages is the average over Y of the Y-case averages.
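The law of total expectation can be verified on a small joint table; the probabilities below are made up purely for illustration.

```python
# Law of total expectation: E(X) = E_Y( E(X | Y) ),
# checked on an illustrative joint table f_{X,Y}(x, y).
joint = {(0, 0): 0.1, (1, 0): 0.3, (0, 1): 0.2, (1, 1): 0.4}

# Direct route: E(X) summed over the whole joint distribution.
e_x = sum(x * p for (x, y), p in joint.items())

# Iterated route: average the case-by-case averages E(X | Y = y) over Y.
iterated = 0.0
for y in {y for (x, y) in joint}:
    p_y = sum(p for (x, yy), p in joint.items() if yy == y)      # P(Y = y)
    e_x_given_y = sum(x * p / p_y for (x, yy), p in joint.items() if yy == y)
    iterated += e_x_given_y * p_y

print(e_x, iterated)  # both 0.7
```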

**#10** For any function g,

**[latexpage] $$ E\left( g\left( X\right) \right) =E_{Y}\left( E\left( g\left( X\right) |Y\right) \right) $$**

**Rules 7, 8, 9, and 10 are used in solving unconditional expectation problems.**

## How to solve expected value questions in examinations without fearing the bulky equations?

Please remember the following points before solving questions:

- Get comfortable with the FOIL principle and the summation sign from algebra.
- Find out the nature of the random variable: discrete or continuous.
- Check whether one or more variables are involved.
- If two or more variables are involved, create an expected value table.
- You can also use a contingency table.
- Use the probability density function to calculate the expected value.

**Example:**

We have already worked through an example for a discrete random variable. Now let's turn to a continuous random variable. I am assuming you are familiar with integration and with probability density functions of continuous random variables; if not, learn from this video and then come back for the exercise.

[latexpage]$$\begin{aligned}f\left( x\right) &=\dfrac {x^{2}}{9},\quad 0\leq x\leq 3\\ E\left( X\right) &=\int ^{3}_{0}x\left( \dfrac {x^{2}}{9}\right) dx=\dfrac {1}{9}\left[ \dfrac {x^{3+1}}{3+1}\right] ^{3}_{0}\\ &=\dfrac {1}{9}\left( \dfrac {3^{4}}{4}\right) -\dfrac {1}{9}\left( \dfrac {0^{4}}{4}\right) =\dfrac {9}{4}=2.25\end{aligned}$$
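The worked answer of 2.25 can be double-checked numerically. Taking the density as f(x) = x²/9 on [0, 3] (the form consistent with the final value 9/4), a midpoint Riemann sum reproduces the integral.

```python
# Numeric check of the worked example: E(X) for f(x) = x^2 / 9 on [0, 3]
# should come out to 9/4 = 2.25. Midpoint Riemann sum approximation.
n = 100_000
dx = 3 / n
expected = sum(
    ((i + 0.5) * dx) * (((i + 0.5) * dx) ** 2 / 9) * dx
    for i in range(n)
)
print(expected)  # approximately 2.25
```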

If you want to practice more questions, please visit here.

Sometimes the examiner tests expected values of the more common discrete probability distributions: binomial, geometric, hypergeometric, and Poisson. Remember, probability distribution functions are **patterns**: they convert a probability problem into a known distribution for easy calculation. These distributions are handy tools, each with peculiar characteristics that help you carry out your task. Identify the random variable's probability function and use its formula for the expected value.
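The closed-form means of the distributions named above can be collected in one place; the parameter names (n, p, N, K, lam) are conventional, and the sample values below are arbitrary.

```python
# Expected-value formulas for the common discrete distributions
# named in the text; closed forms, no simulation needed.
def binomial_mean(n, p):
    return n * p          # X ~ Binomial(n, p): E(X) = n*p

def geometric_mean(p):
    return 1 / p          # X ~ Geometric(p), trials until first success: E(X) = 1/p

def hypergeometric_mean(N, K, n):
    return n * K / N      # n draws, K successes in a population of N: E(X) = n*K/N

def poisson_mean(lam):
    return lam            # X ~ Poisson(lambda): E(X) = lambda

print(binomial_mean(10, 0.3))          # 3.0
print(geometric_mean(0.25))            # 4.0
print(hypergeometric_mean(50, 5, 10))  # 1.0
print(poisson_mean(2.5))               # 2.5
```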