The binomial distribution is the discrete probability distribution of the number of successes in a sequence of n independent trials, each of which yields success with probability p and failure with probability 1 - p. Conventionally, success is coded as 1 and failure as 0. Each trial has exactly two possible outcomes, and the number of trials n is fixed in advance. When n = 1, a single success/failure experiment is a Bernoulli trial, and the binomial distribution reduces to the Bernoulli distribution.

The probability of success, p, is the same for every trial, and successive trials are independent; if the trials are not independent (for example, sampling without replacement), the resulting distribution is hypergeometric instead. The binomial distribution is one of the most widely used discrete distributions.

The random variable X that counts the number of successes, x, in the n trials is said to have a binomial distribution with parameters n and p. This is denoted by bin(x ; n , p).
The probability mass function of a binomial random variable X with parameters n and p is given as:

$ f(x) = P(X=x) = ^{n}C_{x}\ p^{x}(1-p)^{n-x}$ ; x = 0, 1, 2, 3, ...., n
$^{n}C_{x}$ : the number of ways to arrange exactly x successes and n - x failures among the n trials.
Binomial distribution table computes the probability of obtaining x successes in n trials of a binomial experiment with probability of success p.
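The PMF above can be sketched directly in Python using the standard library; this is a minimal illustration (the name `binomial_pmf` is chosen here, not part of the text):

```python
from math import comb  # Python 3.8+

def binomial_pmf(x: int, n: int, p: float) -> float:
    """P(X = x) = nCx * p^x * (1 - p)^(n - x) for X ~ Bin(n, p)."""
    if not 0 <= x <= n:
        return 0.0
    return comb(n, x) * p**x * (1 - p)**(n - x)

# By the binomial theorem the masses over x = 0, ..., n sum to 1.
total = sum(binomial_pmf(x, 6, 0.3) for x in range(7))
```

Summing the PMF over all x is a quick sanity check that the function is a valid probability distribution.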


If X $\sim$ Bin(n,p) then the expected value of X is E[X] = np

Proof that E[X] = np:

Consider Mean $= \sum_{x=0}^{n} x\,p(x)$

$= \sum_{x=0}^{n} x\ ^{n}C_{x}\,p^{x}(1-p)^{n-x}$

$= \sum_{x=0}^{n} x\,\frac{n!}{x!(n-x)!}\,p^{x}(1-p)^{n-x}$

$= \sum_{x=1}^{n} \frac{n(n-1)!}{(x-1)!\,[(n-1)-(x-1)]!}\,p\cdot p^{x-1}(1-p)^{(n-1)-(x-1)}$ (the x = 0 term is zero, and x cancels into x!)

$= np\sum_{x=1}^{n} \frac{(n-1)!}{(x-1)!\,[(n-1)-(x-1)]!}\,p^{x-1}(1-p)^{(n-1)-(x-1)}$

$= np\sum_{x=1}^{n}\ ^{n-1}C_{x-1}\,p^{x-1}(1-p)^{(n-1)-(x-1)}$

$= np\,(p+q)^{n-1}$ (binomial theorem, with q = 1 - p)

$= np\,(1)^{n-1} = np$ (because p + q = 1)

Therefore, Mean = E[X] = np
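The closed form E[X] = np can be checked numerically by summing x·p(x) directly; the parameters below are hypothetical values chosen just for the check:

```python
from math import comb

def binomial_pmf(x, n, p):
    return comb(n, x) * p**x * (1 - p)**(n - x)

# Hypothetical parameters for the check
n, p = 12, 0.35
mean = sum(x * binomial_pmf(x, n, p) for x in range(n + 1))
# mean agrees with the closed form n * p = 4.2
```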
The variance of the binomial distribution is: Var(X) = npq

Var(X) = $E[X^{2}] - (E[X])^{2}$ (recall that E[X] = np)

= $E[X^{2}] - n^{2}p^{2}$ $\rightarrow$ equation 1

Consider $E[X^{2}] = \sum_{x=0}^{n} x^{2}\,p(x)$

$= \sum_{x=0}^{n} [x(x-1)+x]\,p(x)$

$= \sum_{x=0}^{n} x(x-1)\,p(x) + \sum_{x=0}^{n} x\,p(x)$

$= \sum_{x=0}^{n} x(x-1)\ ^{n}C_{x}\,p^{x}(1-p)^{n-x} + np$

$= \sum_{x=0}^{n} x(x-1)\,\frac{n!}{x!(n-x)!}\,p^{x}(1-p)^{n-x} + np$

$= \sum_{x=2}^{n} \frac{n(n-1)(n-2)!}{(x-2)!\,(n-x)!}\,p^{2}\,p^{x-2}(1-p)^{n-x} + np$ (the x = 0 and x = 1 terms are zero, and x(x-1) cancels into x!)

$= n(n-1)p^{2}\sum_{x=2}^{n} \frac{(n-2)!}{(x-2)!\,[(n-2)-(x-2)]!}\,p^{x-2}(1-p)^{(n-2)-(x-2)} + np$

$= n(n-1)p^{2}\,(p+q)^{n-2} + np$

$= n(n-1)p^{2} + np$

So equation 1 becomes,

Var(X) = $E[X^{2}] - n^{2}p^{2}$

= $n(n-1)p^{2} + np - n^{2}p^{2}$

= $n^{2}p^{2} - np^{2} + np - n^{2}p^{2}$

= $np(1-p)$

Therefore, Var(X) = npq
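As with the mean, the identity Var(X) = np(1 - p) can be verified by computing the second moment from the PMF (hypothetical parameters, chosen only for the check):

```python
from math import comb

def binomial_pmf(x, n, p):
    return comb(n, x) * p**x * (1 - p)**(n - x)

n, p = 12, 0.35  # hypothetical parameters
mean = sum(x * binomial_pmf(x, n, p) for x in range(n + 1))
second_moment = sum(x * x * binomial_pmf(x, n, p) for x in range(n + 1))
variance = second_moment - mean**2  # Var(X) = E[X^2] - (E[X])^2
```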

Standard deviation of Binomial distribution
Let X be a discrete random variable with the binomial distribution with parameters n and p.
Then, the standard deviation of X is given by:

Standard deviation (σ) = $\sqrt{Var(X)}$

So the standard deviation of the binomial distribution is σ = $\sqrt{np(1-p)} = \sqrt{npq}$.
The cumulative distribution function of binomial distribution is given by

F(x ; n, p) = P(X $\leqslant$ x) = $\sum_{i=0}^{\left \lfloor x \right \rfloor}$ $^{n}C_{i}\ p^{i}(1-p)^{n-i}$
where $\left \lfloor x \right \rfloor$ is the floor of x, i.e. the greatest integer less than or equal to x.
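The CDF is just a running sum of the PMF up to the floor of x; a minimal Python sketch (the name `binomial_cdf` is chosen here):

```python
from math import comb, floor

def binomial_cdf(x, n, p):
    """F(x; n, p) = P(X <= x), summing the PMF up to floor(x)."""
    k = min(floor(x), n)
    if k < 0:
        return 0.0
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))
```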

The CDF can also be expressed in terms of the regularized incomplete beta function:

F(k; n, p) = $P(X\leq k) = I_{1-p}(n-k,\ k+1)$

= $(n-k)\ ^{n}C_{k}\int_{0}^{1-p}t^{n-k-1}(1-t)^{k}\,dt$
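This identity can be checked numerically by approximating the integral with a midpoint rule and comparing against the direct PMF sum; this is only a verification sketch (names and step count are choices made here), and it assumes 0 ≤ k < n so the integrand is a polynomial:

```python
from math import comb

def binomial_cdf(k, n, p):
    # Direct PMF sum: P(X <= k) for X ~ Bin(n, p)
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

def cdf_via_beta(k, n, p, steps=100_000):
    # (n - k) * nCk * integral_0^{1-p} t^(n-k-1) (1-t)^k dt, midpoint rule
    h = (1 - p) / steps
    integral = h * sum(((i + 0.5) * h) ** (n - k - 1) * (1 - (i + 0.5) * h) ** k
                       for i in range(steps))
    return (n - k) * comb(n, k) * integral
```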
The probability distribution of a negative binomial random variable is called the negative binomial distribution, denoted X $\sim$ NB(r, p). The Pascal distribution and the Pólya distribution are special cases of the negative binomial distribution.

The negative binomial distribution is a discrete probability distribution with the following properties:
  • The experiment consists of repeated trials, and each trial has only two possible outcomes, usually called success and failure.
  • The probability of success, p, is the same for every trial.
  • The trials are independent, and the experiment continues until r successes have been observed (r is fixed in advance).
Notations:
k : number of failures observed before the r-th success
r : number of successes
p : probability of success on each trial
q : probability of failure on each trial (1 - p)
$^{n}C_{r}$ : combinations of n things taken r at a time
The probability mass function of the negative binomial distribution is given as
$f(k; r, p) = P(X = k) = \binom{k+r-1}{k}\,p^{r}(1-p)^{k}$ for k = 0, 1, 2, ...
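A minimal Python sketch of this PMF (the name `neg_binomial_pmf` is chosen here); summing over many k confirms the masses total 1, and r = 1 reduces to the geometric distribution:

```python
from math import comb

def neg_binomial_pmf(k, r, p):
    """P(X = k): probability that k failures occur before the r-th success."""
    return comb(k + r - 1, k) * p**r * (1 - p)**k

# The tail decays geometrically, so a finite sum captures essentially all mass.
mass = sum(neg_binomial_pmf(k, 3, 0.4) for k in range(400))
```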

In probability theory, the Poisson binomial distribution is the distribution of a sum of independent Bernoulli trials that need not be identically distributed. The binomial distribution is the special case of the Poisson binomial distribution: if X has the Poisson binomial distribution with $p_{1}=p_{2}=\cdots=p_{n}=p$, then X $\sim$ B(n, p).
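One way to compute the Poisson binomial PMF is dynamic programming over the trials; this sketch (the name `poisson_binomial_pmf` is chosen here) also confirms the binomial special case when all $p_i$ are equal:

```python
from math import comb

def poisson_binomial_pmf(probs):
    """PMF of a sum of independent Bernoulli(p_i) via dynamic programming."""
    pmf = [1.0]  # distribution of the count after zero trials
    for p in probs:
        nxt = [0.0] * (len(pmf) + 1)
        for k, mass in enumerate(pmf):
            nxt[k] += mass * (1 - p)      # this trial fails
            nxt[k + 1] += mass * p        # this trial succeeds
        pmf = nxt
    return pmf

n, p = 5, 0.3
pb = poisson_binomial_pmf([p] * n)  # equal p_i: should match Bin(5, 0.3)
```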
Since the binomial distribution gives the probability of r successes out of n trials, a histogram of the distribution places the values of r along the horizontal axis and the values of P(r) on the vertical axis.

[Figure: graph of the binomial distribution for n = 6 and p = 0.70, where r varies from 0 to 6.]

Solved Examples

Question 1: A coin with probability of heads 0.3 on any toss is tossed 6 times. Calculate
  1. P(X = 2)
  2. P(1 < X $\leq$ 5)
Solution:
 
The Probability mass function of binomial distribution is given by
$ f(x) = P(X=x) = ^{n}C_{x}\ p^{x}(1-p)^{n-x}$ ;  x = 0, 1, 2, 3, ...., n

X : Number of heads
Taking heads as success, we have n = 6 and p = 0.3, so q = 1 - p = 1 - 0.3 = 0.7.

Now the Probability mass function for the given problem is

1.     P(X = 2) = $\binom{6}{2}(0.3)^{2}(0.7)^{4} = 0.3241$

2.     Now for P(1 < X $\leq$ 5) = P(X = 2) + P(X = 3) + P(X = 4) + P(X = 5)

Consider,

P(X = 2) = $\binom{6}{2}(0.3)^{2}(0.7)^{4} = 0.3241$

P(X = 3) = $\binom{6}{3}(0.3)^{3}(0.7)^{3} = 0.1852$

P(X = 4) = $\binom{6}{4}(0.3)^{4}(0.7)^{2} = 0.0595$

P(X = 5) = $\binom{6}{5}(0.3)^{5}(0.7)^{1} = 0.0102$

So, P(1 < X $\leq$ 5) = 0.3241 + 0.1852 + 0.0595 + 0.0102

= 0.5790 $\approx$ 0.58
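Both parts of this example can be reproduced in a few lines of Python (a verification sketch; `binomial_pmf` is a name chosen here):

```python
from math import comb

def binomial_pmf(x, n, p):
    return comb(n, x) * p**x * (1 - p)**(n - x)

p2 = binomial_pmf(2, 6, 0.3)                                   # part 1
p_1_to_5 = sum(binomial_pmf(x, 6, 0.3) for x in range(2, 6))   # part 2: x = 2..5
```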
 

Question 2: 10 fair coins are tossed simultaneously. Find the probability of getting
  1. Exactly 7 heads
  2. At least 7 heads
  3. At most 7 heads
Solution:
 
Random Experiment : Tossing 10 coins simultaneously
 X : Number of heads obtained
X $\sim$ B(n, p) with n = 10 and p = 0.5 (fair coins), so X $\sim$ B(10, 0.5).
The probability mass function of a binomial random variable X with parameters n and p is given as:

$ f(x) = P(X = x) = ^{n}C_{x}\ p^{x}(1-p)^{n-x}$ ;  x=0,1,2,3,....,n    $0\leq p\leq 1$, q = 1 - p

1) To calculate P(getting exactly 7 heads)

$P(X = 7) = \binom{10}{7}(0.5)^{7}(0.5)^{3} = 0.1172$

2) To calculate P(at least 7 heads)

P(X $\geq$ 7) = P(X = 7) + P(X = 8) + P(X = 9) + P(X = 10)

Consider

P(X = 8) = $\binom{10}{8}(0.5)^{8}(0.5)^{2} = 0.0439$

P(X = 9) = $\binom{10}{9}(0.5)^{9}(0.5)^{1} = 0.0098$

P(X = 10) = $\binom{10}{10}(0.5)^{10}(0.5)^{0} = 0.0010$

Therefore, P(X $\geq$ 7) = 0.1172 + 0.0439 + 0.0098 + 0.0010

= 0.1719

3) To calculate P(at most 7 heads)

P(X $\leq$ 7) = 1 - P(X $\geq$ 8)
              = 1 - (0.0439 + 0.0098 + 0.0010)
              = 1 - 0.0547
              = 0.9453
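All three parts can be checked against exact fractions of $2^{10} = 1024$ (a verification sketch; variable names are chosen here):

```python
from math import comb

def binomial_pmf(x, n, p):
    return comb(n, x) * p**x * (1 - p)**(n - x)

exactly_7 = binomial_pmf(7, 10, 0.5)                              # 120/1024
at_least_7 = sum(binomial_pmf(x, 10, 0.5) for x in range(7, 11))  # 176/1024
at_most_7 = 1 - sum(binomial_pmf(x, 10, 0.5) for x in range(8, 11))  # 968/1024
```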