The outcomes in the sample space of a probability experiment can be either numerical or non-numerical. Random variables make it possible to assign numerical values even to non-numerical outcomes.

For example, the outcomes in the experiment of throwing a die are all numerical, that is,
S = {1, 2, 3, 4, 5, 6}, while the outcomes in tossing a coin, S = {Head, Tail}, are non-numerical.

But we can assign the values Head = 1 and Tail = 0 and make the sample space a numerical set. We can use function-like notation to describe the assignment: X(Head) = 1 and X(Tail) = 0. The function X in this case is called a random variable.

If the outcomes of a sample space are already numerical, the numerical values themselves serve as the values of the random variable.
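As a sketch, the coin-toss assignment above can be written as an ordinary function on the sample space (the names here are illustrative):

```python
# Sketch: a random variable as a function on a non-numerical sample space.
# The dictionary encodes the assignment X(Head) = 1, X(Tail) = 0.
def X(outcome):
    return {"Head": 1, "Tail": 0}[outcome]

sample_space = ["Head", "Tail"]
values = [X(s) for s in sample_space]  # the numerical sample space [1, 0]
```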

## Definition of Random Variable

A random variable is a function that assigns real numbers to the outcomes of a random experiment; it is a real-valued function defined on the sample space. If the outcomes themselves are numerical, the random variable can simply take those values.
A random variable is represented by an upper-case letter such as X or Y.

## Types of Random Variables

Random variables are of two types: discrete and continuous.

Discrete Random Variables:

Discrete random variables have a finite number of possible values, or an infinite number of values that can be counted, such as 1, 2, 3, ....
Example:
A three-card poker hand can have 0, 1, 2 or 3 aces, so the number of aces in the hand is a discrete random variable taking the values
X = {0, 1, 2, 3}
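A quick way to see that these values are counted, not measured, is to enumerate every three-card hand and collect the values the variable takes (a brute-force sketch; the deck encoding is an assumption):

```python
import itertools

# Sketch: enumerate all three-card hands and record the number of aces in each.
# Deck encoding (rank, suit) is illustrative; "A" marks the four aces.
ranks = ["A", "2", "3", "4", "5", "6", "7", "8", "9", "10", "J", "Q", "K"]
suits = ["S", "H", "D", "C"]
deck = [(rank, suit) for rank in ranks for suit in suits]

# X(hand) = number of aces in the hand; collect every value X actually takes.
values = {sum(rank == "A" for rank, _ in hand)
          for hand in itertools.combinations(deck, 3)}
```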

Continuous Random Variable:
Variables that can assume all values in an interval between two given values are called continuous random variables.
For example, if the temperature on a hot day has risen from 90º to 100º, it has passed through every value in the interval
[90, 100]. A continuous random variable can assume infinitely many values, including decimal and fractional values.
Data for a continuous random variable are measured, not counted as in the case of a discrete random variable.

## Functions of Random Variables

The probability functions of a random variable associate the numerical values of the variable with their respective probabilities.

Probability Mass Function:
Let X be a discrete random variable taking the values {x1, x2, ..., xi, ...}. Then p(x) is the probability mass function of the random variable X if
1. p(xi) ≥ 0 for i = 1, 2, ...
2. $\sum_{i} p(x_{i}) = 1$.
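The two conditions can be checked mechanically for any small distribution; a minimal sketch, using exact fractions and a fair die as the assumed example:

```python
from fractions import Fraction

# Sketch: verify the two pmf conditions for a dictionary {x_i: p(x_i)}.
def is_pmf(p):
    nonnegative = all(prob >= 0 for prob in p.values())  # condition 1
    sums_to_one = sum(p.values()) == 1                   # condition 2
    return nonnegative and sums_to_one

# Assumed example: a fair six-sided die.
die = {x: Fraction(1, 6) for x in range(1, 7)}
```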
Probability Density Function:
A function f(x) is called the probability density function (pdf) of a continuous random variable X if
1. f(x) ≥ 0 for all x
2. $\int_{-\infty }^{\infty }f(x)\,dx = 1$
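A numerical sketch of the second condition, assuming a uniform density on [-1, 1] and a simple midpoint Riemann sum:

```python
# Sketch: check that a pdf integrates to 1 with a midpoint Riemann sum.
def integrate(f, a, b, n=100_000):
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

# Assumed example: the uniform density on [-1, 1].
def f(x):
    return 0.5 if -1 <= x <= 1 else 0.0

total = integrate(f, -5, 5)  # should be very close to 1
```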

Cumulative Distribution Function:
The function F(x) = P(X ≤ x) is known as the cumulative distribution function (cdf) of the random variable.
F(x) = $\sum_{x_{i} \leq x} p(x_{i})$ when X is discrete
and
F(x) = $\int_{-\infty }^{x}f(t)\,dt$ when X is continuous.
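For the discrete case, the cdf is just a running sum of the pmf; a minimal sketch with an illustrative three-point distribution:

```python
from fractions import Fraction

# Sketch: the discrete cdf F(x) = P(X <= x) as a running sum of the pmf.
# The three-point distribution is illustrative.
pmf = {0: Fraction(1, 4), 1: Fraction(1, 2), 2: Fraction(1, 4)}

def F(x):
    return sum(p for value, p in pmf.items() if value <= x)
```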

A function of a random variable is also a random variable defined on the same sample space.
If Y = g(X), the probability distribution of Y can be obtained by replacing each x value in the probability distribution of X with the corresponding y value g(x), adding probabilities when two x values give the same y value.
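This replace-and-add process can be sketched directly (the distribution of X and the function g used here are illustrative):

```python
from fractions import Fraction

# Sketch: distribution of Y = g(X), merging probabilities when two x values
# map to the same y. The pmf of X and the function g are illustrative.
pmf_X = {-1: Fraction(1, 3), 0: Fraction(1, 3), 1: Fraction(1, 3)}

def g(x):
    return x * x

pmf_Y = {}
for x, p in pmf_X.items():
    pmf_Y[g(x)] = pmf_Y.get(g(x), 0) + p
```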

## Independent Random Variables

Let (X, Y) be a two-dimensional discrete random variable with probability mass function p(xi, yj), and let p(xi) and q(yj) be the marginal probability functions of X and Y. Then the random variables X and Y are independent if
p(xi, yj) = p(xi) · q(yj) for all i, j.

In the case of X and Y being continuous random variables, they are independent if
f(x, y) = g(x) · h(y)
where f(x, y) is the joint probability density function of the two-dimensional continuous random variable and g(x) and h(y) are the
marginal probability density functions of X and Y.
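The discrete criterion can be verified by computing the marginals from a joint table; a sketch assuming two independent fair coins:

```python
from fractions import Fraction

# Sketch: check p(x_i, y_j) = p(x_i) * q(y_j) for every pair in a joint table.
# Assumed example: two independent fair coins (0 = tail, 1 = head).
joint = {(x, y): Fraction(1, 4) for x in (0, 1) for y in (0, 1)}

# Marginal distributions of X and Y, obtained by summing over the other variable.
p = {x: sum(pr for (xi, _), pr in joint.items() if xi == x) for x in (0, 1)}
q = {y: sum(pr for (_, yj), pr in joint.items() if yj == y) for y in (0, 1)}

independent = all(joint[x, y] == p[x] * q[y] for (x, y) in joint)
```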

## Sum of Random Variables

The formulas for the expectation of a sum of random variables are as follows:
If (X, Y) is a two-dimensional random variable defined over a sample space S, then the expectation of the sum is equal to the sum of the expectations:
E(X + Y) = E(X) + E(Y)
If X1, X2, ..., Xn are n random variables defined over a sample space, then
E(X1 + X2 + ... + Xn) = E(X1) + E(X2) + ... + E(Xn).
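The first formula can be confirmed by brute-force enumeration; a sketch assuming X and Y are two fair dice:

```python
from fractions import Fraction
from itertools import product

# Sketch: E(X + Y) = E(X) + E(Y), checked by enumerating two fair dice.
faces = range(1, 7)
E_X = sum(Fraction(x, 6) for x in faces)   # expectation of one die: 7/2
E_sum = sum(Fraction(x + y, 36) for x, y in product(faces, faces))
```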

## Product of Random Variables

If X and Y are independent random variables defined over a sample space S, then the expectation of the product is equal to the product of the expectations.

E(XY) = E(X) . E(Y)

If X1, X2, ..., Xn are n independent random variables defined over a sample space, then
E(X1 X2 ... Xn) = E(X1) · E(X2) ... E(Xn).
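The same enumeration confirms the product rule for two independent fair dice (an assumed example):

```python
from fractions import Fraction
from itertools import product

# Sketch: E(XY) = E(X) * E(Y) for two independent fair dice.
faces = range(1, 7)
E_X = sum(Fraction(x, 6) for x in faces)   # 7/2
E_XY = sum(Fraction(x * y, 36) for x, y in product(faces, faces))
```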

## Sequence of Random Variables

Suppose X1, X2, ..., Xn, ... are random variables defined on a sample space S. {Xn} is called a sequence of random variables if all the random variables are functions from S to R, the set of real numbers.

In advanced statistics, the convergence of sequences of random variables is defined and discussed in detail.

## Transformation of Random variables

The formulas for finding the probability functions of a random variable Y that is a function of another random variable X, whose probability functions are known, are as follows:
When X is a discrete random variable and Y = h(X), the probability mass function of Y, pY(y), is given by
pY(y) = $\sum_{x\in h^{-1}(y)} p_{X}(x)$
where pX(x) is the probability mass function of X.

When X is a continuous random variable and Y = h(X) for a monotonic function h, the probability density function (pdf) fY(y) is given by
fY(y) = $f_{X}(h^{-1}(y))\left | \frac{d}{dy}h^{-1}(y) \right |$
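The continuous formula can be spot-checked numerically; a sketch assuming X is uniform on (0, 1) and Y = h(X) = X², so that h⁻¹(y) = √y and the formula gives f_Y(y) = 1/(2√y):

```python
import math

# Sketch: X uniform on (0, 1), Y = h(X) = X^2, so h^{-1}(y) = sqrt(y) and the
# formula gives f_Y(y) = f_X(sqrt(y)) * |d/dy sqrt(y)| = 1 / (2 * sqrt(y)).
def f_Y(y):
    return 1.0 / (2.0 * math.sqrt(y))

# Integrating f_Y from 0 to y should reproduce P(Y <= y) = P(X <= sqrt(y)) = sqrt(y).
def F_Y(y, n=100_000):
    h = y / n
    return sum(f_Y((i + 0.5) * h) for i in range(n)) * h

approx = F_Y(0.25)  # exact value is sqrt(0.25) = 0.5
```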

## Properties of Random Variables

The expectation E(X) is the mean value of the probability distribution of a random variable. The formulas for the expectation and variance of a distribution are as follows:
When X is a discrete random variable with possible values {x1, x2, ...} and p(xi) = P(X = xi), i = 1, 2, ..., then
E(X) = $\sum_{i}x_{i}\,p(x_{i})$

If X is a continuous random variable with pdf f(x), then
E(X) = $\int_{-\infty }^{\infty }x\,f(x)\,dx$

In both cases the formula for variance is
$Var(X) = E(X^{2}) - [E(X)]^{2}$.
There are a number of properties related to the expectation, variance and other moments of a random variable, a listing of which is beyond the scope of this page.
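Both moment formulas can be evaluated exactly for a small discrete distribution; a sketch assuming a fair die:

```python
from fractions import Fraction

# Sketch: E(X) and Var(X) = E(X^2) - [E(X)]^2 for an assumed fair die.
pmf = {x: Fraction(1, 6) for x in range(1, 7)}

E_X = sum(x * p for x, p in pmf.items())        # 7/2
E_X2 = sum(x * x * p for x, p in pmf.items())   # 91/6
Var_X = E_X2 - E_X ** 2                         # 91/6 - 49/4 = 35/12
```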

## Examples of Random Variables

### Solved Examples

Question 1: A random variable X has the following probability mass function:

| x | -3 | -2 | -1 | 0 | 1 | 2 | 3 |
|---|---|---|---|---|---|---|---|
| p(x) | $\frac{1}{9}$ | $\frac{1}{9}$ | $\frac{1}{9}$ | $\frac{1}{3}$ | $\frac{1}{9}$ | $\frac{1}{9}$ | $\frac{1}{9}$ |

Find the distribution function and the probability mass function of $Y = X^{2} + 3X - 4$.

Solution:

The random variable Y assumes the values -4, -6, -6, -4, 0, 6 and 14 corresponding to the X values -3, -2, -1, 0, 1, 2 and 3. Adding the probabilities of the X values that map to the same Y value, the probability mass function of Y is as follows:

| y | -6 | -4 | 0 | 6 | 14 |
|---|---|---|---|---|---|
| p(y) | $\frac{2}{9}$ | $\frac{4}{9}$ | $\frac{1}{9}$ | $\frac{1}{9}$ | $\frac{1}{9}$ |

Note that Y assumes the value -6 for both X = -2 and X = -1.

Hence P(Y = -6) = P(X = -2) + P(X = -1) = $\frac{1}{9}$ + $\frac{1}{9}$ = $\frac{2}{9}$.
Similarly, P(Y = -4) = P(X = -3) + P(X = 0) = $\frac{1}{9}$ + $\frac{1}{3}$ = $\frac{4}{9}$.

The distribution function gives the cumulative probabilities:

| y | y < -6 | -6 ≤ y < -4 | -4 ≤ y < 0 | 0 ≤ y < 6 | 6 ≤ y < 14 | y ≥ 14 |
|---|---|---|---|---|---|---|
| F(y) | 0 | $\frac{2}{9}$ | $\frac{2}{3}$ | $\frac{7}{9}$ | $\frac{8}{9}$ | 1 |

Question 2: The distribution function of a random variable is given by

F(x) = 0 for x < -1
F(x) = $\frac{x+1}{2}$ for -1 ≤ x ≤ 1
F(x) = 1 for x ≥ 1

Find (1) P(|X| < $\frac{1}{2}$) and (2) P(2 < X < 3).

Solution:

(1) P(|X| < $\frac{1}{2}$) = P($-\frac{1}{2}$ < X < $\frac{1}{2}$). Since f(x) = F'(x), the pdf on this interval is $\frac{d}{dx}\frac{x+1}{2} = \frac{1}{2}$.

P(|X| < $\frac{1}{2}$) = $\int_{-1/2}^{1/2}\frac{1}{2}\,dx$

= $\frac{x}{2}\Big|_{-\frac{1}{2}}^{\frac{1}{2}}$ = $\frac{1}{2}$

(2) On the interval (2, 3), the pdf is f(x) = F'(x) = 0.
Thus P(2 < X < 3) = 0.