Conditional expectation is the expected value of a real random variable computed with respect to a conditional probability distribution. It can be defined for the discrete as well as the continuous case, and is also known as the conditional mean or conditional expected value.
In full generality, computing the conditional expectation of $X$ given $Y = y$ does not require that the two variables form a discrete or an absolutely continuous random vector; it can be defined for any random vector.

The definitions of conditional expectation for the discrete and the continuous case are given below.
Definition of conditional expectation for the discrete case
Consider two discrete random variables $X$ and $Y$. For $x \in \operatorname{Range}(X)$, the conditional expectation of $Y$ given $X = x$ is
$E(Y \mid X = x) = \sum_{y\, \in\, \operatorname{Range}(Y)} y \, p_{Y \mid X = x}(y)$

$E(X / Y = y)$ can be defined similarly.
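As a small sketch of the discrete definition above, the function below computes $E(Y \mid X = x)$ from a joint pmf by first forming the marginal $P(X = x)$ and then the conditional pmf. The joint pmf here is a made-up example for illustration, not taken from the text.

```python
# Illustrative joint pmf P(X = x, Y = y); the numbers are assumptions.
joint = {
    (0, 0): 0.1, (0, 1): 0.2,
    (1, 0): 0.3, (1, 1): 0.4,
}

def cond_expectation_Y_given_X(joint, x):
    """E[Y | X = x] = sum over y of y * p_{Y|X=x}(y)."""
    p_x = sum(p for (xi, _), p in joint.items() if xi == x)  # marginal P(X = x)
    # Conditional pmf p_{Y|X=x}(y) = P(X = x, Y = y) / P(X = x)
    return sum(y * p / p_x for (xi, y), p in joint.items() if xi == x)

print(cond_expectation_Y_given_X(joint, 0))  # E[Y | X = 0] = 0.2 / 0.3
```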
Definition of conditional expectation for the continuous case
Suppose $X$ and $Y$ are two absolutely continuous random variables. Let $R_{X}$ be the support of $X$ and let $f_{X \mid Y = y}(x)$ be the conditional probability density function of $X$ given $Y = y$.
The conditional expectation of $X$ given $Y$ = $y$ is

$E[X \mid Y = y]$ = $\int_{-\infty}^{\infty} x\, f_{X \mid Y = y}(x)\, dx$

provided $\int_{-\infty}^{\infty} |x|\, f_{X \mid Y = y}(x)\, dx$ < $\infty$
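The continuous definition can be checked numerically. The sketch below assumes the illustrative joint density $f(x, y) = x + y$ on the unit square (not from the text); the conditional density is the joint density at fixed $y$ divided by the marginal $f_Y(y)$, and the integral is approximated with the midpoint rule.

```python
# Midpoint-rule grid on [0, 1]; N and the density are illustrative assumptions.
N = 10_000
dx = 1.0 / N
xs = [(i + 0.5) * dx for i in range(N)]

def cond_exp_X_given_Y(y):
    """Numerical E[X | Y = y] for the joint density f(x, y) = x + y on [0,1]^2."""
    f = [x + y for x in xs]                       # joint density at fixed y
    marg = sum(f) * dx                            # f_Y(y) = integral of f(x, y) dx
    return sum(x * fx for x, fx in zip(xs, f)) * dx / marg

# Closed form for this density: E[X | Y = y] = (1/3 + y/2) / (1/2 + y)
print(cond_exp_X_given_Y(0.5))
```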
Given below are some of the properties related to the linearity of the expected value.

1) If $X$ is a random variable and $a$ $\in$ $\mathbb{R}$, then
  $E[aX]$ = $aE[X]$
This is known as the scalar multiplication of a random variable.

2) If $X_{1}, X_{2}, \ldots, X_{k}$ are $k$ random variables, then
$E[X_{1} + X_{2} + \cdots + X_{k}] = E[X_{1}] + E[X_{2}] + \cdots + E[X_{k}]$
This is known as the sum of random variables.

3) If $X_{1}, X_{2}, \ldots, X_{k}$ are $k$ random variables and $a_{1}, a_{2}, \ldots, a_{k}$ $\in$ $\mathbb{R}$ are $k$ constants, then
$E[a_{1}X_{1} + a_{2}X_{2} + \cdots + a_{k}X_{k}] = a_{1}E[X_{1}] + a_{2}E[X_{2}] + \cdots + a_{k}E[X_{k}]$
This is known as the linear combination of random variables.
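The linearity properties above can be verified on a small example. The pmf and coefficients below are made up for illustration; the random variables $a_1 X$ and $a_2 X^2$ are both functions of the same $X$, which is still a valid instance of linearity.

```python
# Illustrative pmf of a single discrete random variable X (an assumption).
pmf = {0: 0.2, 1: 0.5, 2: 0.3}

def E(g, pmf):
    """Expected value of g(X) for a discrete pmf."""
    return sum(g(x) * p for x, p in pmf.items())

a1, a2 = 3.0, -2.0
# E[a1*X + a2*X^2] versus a1*E[X] + a2*E[X^2]
lhs = E(lambda x: a1 * x + a2 * x**2, pmf)
rhs = a1 * E(lambda x: x, pmf) + a2 * E(lambda x: x**2, pmf)
print(lhs, rhs)
```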

4) Let $X$ be an integrable random variable defined on a sample space $\Omega$, and let $X(\omega)$ $\geq$ 0 for all $\omega$ $\in$ $\Omega$.
Then $E[X]$ $\geq$ 0
That is, the expected value of a non-negative random variable is always non-negative. Here the expected value is the Lebesgue integral of $X$.
Example 1: Find the missing probability in the following distribution, and also find $E(X)$.
$x$ : 0, 1, 2, 3, 4
$p(x)$ : $\frac{3}{8}$, $\frac{1}{4}$, -, $\frac{3}{16}$, $\frac{1}{16}$

Solution: Let the missing probability be $k$
For a probability distribution, $\sum$ $p(x)$ = 1

Therefore $\frac{3}{8}$ + $\frac{1}{4}$ + $k$ + $\frac{3}{16}$ +$\frac{1}{16}$ = 1

$k$ + $\frac{14}{16}$ = 1

$k$ = 1 - $\frac{14}{16}$

$k$ = $\frac{1}{8}$

Therefore, the probability distribution is:
$x$ : 0, 1, 2, 3, 4
$p(x)$ : $\frac{3}{8}$, $\frac{1}{4}$, $\frac{1}{8}$, $\frac{3}{16}$, $\frac{1}{16}$

Thus $E(X)$ = $\sum$ $x \times p(x)$

= 0 $\times$ $\frac{3}{8}$ + 1 $\times$ $\frac{1}{4}$  + 2 $\times$ $\frac{1}{8}$ +  3 $\times$ $\frac{3}{16}$ + 4 $\times$ $\frac{1}{16}$

= $\frac{21}{16}$
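The arithmetic in Example 1 can be redone exactly with rational numbers:

```python
from fractions import Fraction as F

# Known probabilities from the distribution in Example 1; x = 2 is missing.
p = {0: F(3, 8), 1: F(1, 4), 3: F(3, 16), 4: F(1, 16)}
p[2] = 1 - sum(p.values())            # probabilities must sum to 1
EX = sum(x * px for x, px in p.items())
print(p[2], EX)                       # 1/8 and 21/16
```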

Example 2: In a lottery there are 1000 tickets costing Rs 1 each. There is one first prize worth Rs 100, two second prizes worth Rs 20 each, and ten third prizes worth Rs 10 each. Find the expected loss in buying one ticket.

Solution: Let $X$ be the amount that one ticket fetches after deducting the purchase cost.
Then $X$ is a random variable which takes the values 99, 19, 9 and $-1$, with the following probabilities.
p(99) = P[1st Prize]  = $\frac{1}{1000}$
= 0.001

p(19) = P[2nd Prize] = $\frac{2}{1000}$
= 0.002

p(9) = P[3rd Prize]  = $\frac{10}{1000}$
= 0.01

p(- 1) = P[No Prize]  = $\frac{987}{1000}$
= 0.987

Thus the expectation of the net amount is
$E(X)$ = $\sum$ $x \cdot p(x)$
= 99 $\times$ 0.001 + 19 $\times$ 0.002 + 9 $\times$ 0.01 + (-1) $\times$ 0.987
= $-0.76$
i.e., an expected loss of Rs 0.76 per ticket.
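The computation in Example 2 reduces to one weighted sum:

```python
# Net amount per ticket mapped to its probability, from Example 2.
outcomes = {99: 0.001, 19: 0.002, 9: 0.010, -1: 0.987}
assert abs(sum(outcomes.values()) - 1.0) < 1e-12  # probabilities sum to 1

EX = sum(x * p for x, p in outcomes.items())
print(round(EX, 2))                               # expected net amount: -0.76
```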