Now that we have assigned a number to the outcome of an event, we can
define an ``average'' value for the r.v. over the possible events.
This average value is called the *expectation value* of the random
variable **x**, and has the following definition:

$$ E[x] = \sum_i x_i \, P(x_i) $$

Given a r.v. **x**, any real-valued function $f(x)$ of **x** is itself a
r.v., so we can define the expectation value of $f(x)$ as well:

$$ E[f(x)] = \sum_i f(x_i) \, P(x_i) $$
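As a concrete sketch of these two sums, here is a short calculation using a fair six-sided die as the r.v. (a hypothetical example, not one from the text), with $f(x) = x^2$:

```python
# Expectation of a discrete r.v. x, and of a function f(x) = x**2,
# illustrated with a fair six-sided die (hypothetical example).
outcomes = [1, 2, 3, 4, 5, 6]
p = [1/6] * 6  # uniform probabilities P(x_i)

# E[x] = sum_i x_i * P(x_i)
E_x = sum(x * px for x, px in zip(outcomes, p))

# E[f(x)] = sum_i f(x_i) * P(x_i), with f(x) = x**2
E_x2 = sum(x**2 * px for x, px in zip(outcomes, p))

print(E_x)   # 3.5
print(E_x2)  # 15.1666... (= 91/6)
```

Note that $E[x^2] \ne (E[x])^2$ in general; the difference between the two is exactly what the variance, introduced below, measures.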

The expectation value of a linear combination of r.v.'s is simply the linear combination of their respective expectation values:

$$ E[a\,x + b\,y] = a\,E[x] + b\,E[y] $$
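Linearity can be checked numerically. In this sketch (assuming two independent fair dice, a hypothetical example), the expectation of $a\,x + b\,y$ is computed directly over the joint distribution and compared with the linear combination of the individual expectations:

```python
# Numerical check of linearity of expectation for two independent
# fair dice (hypothetical example).
outcomes = [1, 2, 3, 4, 5, 6]
a, b = 2.0, 3.0

# Direct computation: sum over the joint distribution, P(x, y) = 1/36.
lhs = sum((a * x + b * y) * (1/36) for x in outcomes for y in outcomes)

# Linear combination of the individual expectations.
E_x = sum(x * (1/6) for x in outcomes)  # same for y by symmetry
rhs = a * E_x + b * E_x

print(lhs, rhs)  # both 17.5
```

(Independence is not actually required for linearity to hold; it is assumed here only to make the joint probabilities easy to write down.)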

The expectation value is simply the ``first moment'' of the r.v., meaning that
one is finding the average of the r.v. itself, rather than its square or cube
or square root.
Thus the mean is the first moment of the r.v.,
and one might ask whether or not averages of the higher moments have any
significance.
In fact, the average of the square of the r.v. does lead to an
important quantity, the *variance*, and we will now define the higher
moments of a r.v. **x** as follows:

$$ E[x^n] = \sum_i x_i^n \, P(x_i) $$
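The first two moments already give the variance, via the standard relation $\mathrm{Var}[x] = E[x^2] - (E[x])^2$. A minimal sketch, again assuming a fair six-sided die as a hypothetical example:

```python
# n-th moment E[x**n] of a discrete r.v., and the variance built from
# the first two moments (fair six-sided die, hypothetical example).
outcomes = [1, 2, 3, 4, 5, 6]
p = [1/6] * 6

def moment(n):
    """Return the n-th moment, E[x**n] = sum_i x_i**n * P(x_i)."""
    return sum(x**n * px for x, px in zip(outcomes, p))

mean = moment(1)                 # first moment: 3.5
variance = moment(2) - mean**2   # E[x^2] - (E[x])^2 = 35/12
print(mean, variance)
```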