A continuous probability distribution associates the possible values of a continuous random variable with their corresponding probabilities. While probabilities in a discrete distribution are assigned directly to individual outcomes, a probability density function f(x) is used for computing probabilities in a continuous distribution. The pdf of a continuous random variable X satisfies f(x) ≥ 0 for all x. Probabilities for the distribution are calculated using definite integrals as
P(a ≤ X ≤ b) = $\int_{a}^{b}f(x)dx$, and the total probability over the entire range is $\int_{-\infty }^{\infty }f(x)dx$ = 1.
As a continuous random variable takes values over intervals rather than at isolated points, the probability of the variable assuming any one specific value is zero:
P(X = c) = $\int_{c}^{c}f(x)dx$ = 0.
The mean and variance of a continuous probability distribution are given by
Mean μ = Expectation E(X) = $\int_{-\infty }^{\infty }xf(x)dx$ and Var(X) = E(X²) − μ² = $\int_{-\infty }^{\infty }x^{2}f(x)dx$ − μ².
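
As a quick numerical illustration of these definitions, the sketch below evaluates the total probability, the mean, and the variance by direct integration. It assumes SciPy is available, and the pdf f(x) = ½e^(−x/2) for x > 0 is only an illustrative choice, not one fixed by the text above.

```python
# Minimal sketch: checking the pdf condition, mean and variance by numerical integration.
# Assumes SciPy; the pdf f(x) = (1/2) e^(-x/2) for x > 0 is only an illustrative example.
import numpy as np
from scipy.integrate import quad

def f(x):
    return 0.5 * np.exp(-x / 2)   # example pdf (zero for x < 0, handled by the limits below)

total, _ = quad(f, 0, np.inf)                       # total probability, should be 1
mean, _ = quad(lambda x: x * f(x), 0, np.inf)       # E(X)
ex2, _ = quad(lambda x: x ** 2 * f(x), 0, np.inf)   # E(X^2)
var = ex2 - mean ** 2                               # Var(X) = E(X^2) - mu^2

print(total, mean, var)   # approximately 1.0, 2.0, 4.0
```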

Here is a list of some well-known univariate continuous distributions; a short sketch showing a few of them in use follows the list.
  1. Uniform distribution
  2. Normal distribution
  3. Exponential distribution
  4. Lognormal distribution
  5. Beta distribution
  6. Gamma distribution
  7. t- distribution
  8. Rayleigh distribution
  9. chi-squared distribution
  10. Logistic distribution
  11. Weibull distribution
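
As a small illustration, several of these distributions can be instantiated and summarised in a few lines. This assumes scipy.stats is available, and the parameter values below are arbitrary choices made for the example.

```python
# Minimal sketch, assuming scipy.stats: evaluating the density, mean and variance
# for a few of the distributions listed above (parameter values are arbitrary examples).
from scipy import stats

dists = {
    "uniform":     stats.uniform(loc=0, scale=1),   # U(0, 1)
    "normal":      stats.norm(loc=0, scale=1),      # N(0, 1)
    "exponential": stats.expon(scale=2),            # mean 2 (rate 1/2)
    "gamma":       stats.gamma(a=2, scale=1),       # shape 2, scale 1
    "weibull":     stats.weibull_min(c=1.5),        # shape 1.5
}

for name, d in dists.items():
    print(name, d.pdf(1.0), d.mean(), d.var())      # density at x = 1, mean, variance
```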
In a uniform distribution, the probability density is constant over a range of outcomes. A continuous random variable X with a uniform distribution on [a, b] is written X ∼ U(a, b).

The probability density function of the uniform distribution is f(x) = $\frac{1}{b-a}$ for a ≤ x ≤ b, and 0 otherwise.
The mean and variance of a uniform distribution are

Mean μ = $\frac{a+b}{2}$ Var (X) = $\frac{(b-a)^{2}}{12}$
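
These two formulas can be checked against SciPy's built-in uniform distribution. The sketch assumes scipy.stats is available; the endpoints a = 2, b = 8 are arbitrary illustrative values.

```python
# Minimal sketch: comparing the U(a, b) mean and variance formulas with scipy.stats.
# Assumes SciPy; the endpoints a = 2, b = 8 are arbitrary illustrative values.
from scipy import stats

a, b = 2.0, 8.0
u = stats.uniform(loc=a, scale=b - a)   # scipy parameterizes U(a, b) via loc = a, scale = b - a

print(u.mean(), (a + b) / 2)            # both 5.0
print(u.var(), (b - a) ** 2 / 12)       # both 3.0
```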
For continuous distributions, integration plays the role that summation plays for discrete random variables in determining cumulative probabilities. This correspondence is what allows the cumulative probabilities of the Poisson distribution to be expressed in continuous form through Gamma functions.
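
One concrete instance of that Poisson–Gamma connection is that P(Poisson(λ) ≥ k) equals the CDF of a Gamma(k, scale = 1) variable evaluated at λ. The sketch assumes scipy.stats is available, and λ = 3, k = 4 are arbitrary values chosen for the example.

```python
# Minimal sketch of the Poisson/Gamma relation, assuming scipy.stats:
# for integer k, P(Poisson(lam) >= k) equals the Gamma(k, scale=1) CDF at lam.
from scipy import stats

lam, k = 3.0, 4
poisson_tail = 1 - stats.poisson.cdf(k - 1, lam)   # P(N >= k) for N ~ Poisson(lam)
gamma_cdf = stats.gamma.cdf(lam, a=k, scale=1)     # P(X <= lam) for X ~ Gamma(k)

print(poisson_tail, gamma_cdf)                     # both approximately 0.3528
```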
Probabilities in multivariate continuous distributions are found using multiple integration. A function f(x, y) of a two-dimensional continuous random variable (X, Y) is called the joint probability density function of (X, Y) if
  1. f(x,y) ≥ 0 for all x,y
  2. $\int_{-\infty }^{\infty }\int_{-\infty }^{\infty }f(x, y)dxdy$ = 1

P(a ≤ X ≤ b, c ≤ Y ≤ d) = $\int_{a}^{b}\int_{c}^{d}f(x, y)dydx$
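
A rectangle probability of this form can be computed numerically with a double integral. The sketch assumes SciPy is available; the joint pdf f(x, y) = e^(−x−y) for x, y > 0 and the limits used below are illustrative choices only.

```python
# Minimal sketch, assuming SciPy: computing P(a <= X <= b, c <= Y <= d) by double integration.
# The joint pdf f(x, y) = e^(-x - y) on x, y > 0 is only an illustrative example.
import numpy as np
from scipy.integrate import dblquad

def f(y, x):                 # dblquad expects the inner variable (y) as the first argument
    return np.exp(-x - y)

total, _ = dblquad(f, 0, np.inf, 0, np.inf)   # total probability, should be 1
prob, _ = dblquad(f, 0, 1, 0, 2)              # P(0 <= X <= 1, 0 <= Y <= 2)

print(total, prob)           # ~1.0 and (1 - e^-1)(1 - e^-2) ≈ 0.547
```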


Solved Example

Question: The length of time (in minutes) a customer care executive speaks on the telephone is found to be a random variable with pdf

f(x) = $Ae^{\frac{-x}{2}}$              for x > 0

      = 0                              otherwise
  1. Find the value of A.
  2. Find the probabilities that the executive will talk for more than 10 minutes, between 5 and 10 minutes, and less than 5 minutes.

Solution:
 
Since f(x) is a pdf, $\int_{0}^{\infty }Ae^{\frac{-x}{2}}dx$ = 1

$[-2Ae^{\frac{-x}{2}}]_{0}^{\infty }$ = 1

0 − (−2A) = 2A = 1, hence A = $\frac{1}{2}$

P(X < 5) = $\int_{0}^{5}\frac{e^{\frac{-x}{2}}}{2}dx$ = $[-e^{\frac{-x}{2}}]_{0}^{5}$ = 1 − $e^{-2.5}$ = 0.918

P(5 ≤ X ≤ 10) = $\int_{5}^{10}\frac{e^{\frac{-x}{2}}}{2}dx$ = $[-e^{\frac{-x}{2}}]_{5}^{10}$ = $e^{-2.5}$ − $e^{-5}$ = 0.075

P(X > 10) = $\int_{10}^{\infty }\frac{e^{\frac{-x}{2}}}{2}dx$ = $[-e^{\frac{-x}{2}}]_{10}^{\infty }$ = $e^{-5}$ = 0.007
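
These three values can be checked numerically by integrating the pdf with A = ½ over each interval; the sketch below assumes SciPy is available.

```python
# Minimal sketch, assuming SciPy: verifying the solved example by numerical integration.
import numpy as np
from scipy.integrate import quad

f = lambda x: 0.5 * np.exp(-x / 2)   # the pdf with A = 1/2

print(quad(f, 0, np.inf)[0])   # total probability, ~1.0 (confirms A = 1/2)
print(quad(f, 0, 5)[0])        # P(X < 5)        ~0.918
print(quad(f, 5, 10)[0])       # P(5 <= X <= 10) ~0.075
print(quad(f, 10, np.inf)[0])  # P(X > 10)       ~0.007
```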