In mathematics, finite differences deal with the changes that take place in the value of a function (the dependent variable) due to changes in the independent variable. A finite difference is a mathematical expression of the form $g(x + b)$ - $g(x + a)$. It is a technique for finding approximate solutions of differential equations. The finite difference (FD) approximation of derivatives is one of the oldest and simplest ways to solve differential equations numerically. Finite difference methods are usually explicit, easy to implement, and work well on regular, rectangular grids.

From the definition of derivative, we have:

$\frac{dg}{dx}$ = $\lim_{\Delta x \to 0}$ $\frac{g(x + \Delta x)-g(x)}{\Delta x}$

The value of $\frac{dg}{dx}$ can be approximated by the finite difference $\frac{g(x + \Delta x)-g(x)}{\Delta x}$ for a small value of $\Delta x$.

Formulae for the forward, centered and backward finite difference approximations to the first derivative are given below:
Forward difference formula:

$\frac{dg^+}{dx}$ = $\frac{g(x + \Delta x)-g(x)}{\Delta x}$

Centered difference formula:


$\frac{dg}{dx}$ = $\frac{g(x + \Delta x)-g(x-\Delta x)}{2 \Delta x}$

Backward difference formula:

$\frac{dg^-}{dx}$ = $\frac{g(x) - g(x - \Delta x)}{\Delta x}$
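
As a minimal illustration (the test function sin and the step size are arbitrary choices, not from the text above), the three formulas can be compared against the exact derivative in Python:

```python
import math

def forward_diff(g, x, dx):
    # dg+/dx ~ (g(x + dx) - g(x)) / dx
    return (g(x + dx) - g(x)) / dx

def backward_diff(g, x, dx):
    # dg-/dx ~ (g(x) - g(x - dx)) / dx
    return (g(x) - g(x - dx)) / dx

def central_diff(g, x, dx):
    # dg/dx ~ (g(x + dx) - g(x - dx)) / (2 dx)
    return (g(x + dx) - g(x - dx)) / (2 * dx)

g, x, dx = math.sin, 1.0, 1e-3
exact = math.cos(x)                      # exact derivative of sin at x
for name, fd in [("forward", forward_diff), ("central", central_diff), ("backward", backward_diff)]:
    print(f"{name:8s} {fd(g, x, dx):.8f}  error {abs(fd(g, x, dx) - exact):.2e}")
```

The central difference is noticeably more accurate than the one-sided formulas for the same $\Delta x$.
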
The 2nd order forward, central and backward differences:
Forward difference:

g''(x) $\approx$ $\frac{g(x + 2\Delta x) - 2g(x + \Delta x) + g(x)}{(\Delta x)^2}$

Central difference:


g''(x) $\approx$ $\frac{g(x + \Delta x) - 2g(x) + g(x - \Delta x)}{(\Delta x)^2}$

Backward difference:

g''(x) $\approx$ $\frac{g(x) - 2g(x - \Delta x) + g(x - 2\Delta x)}{(\Delta x)^2}$

nth order forward, backward, and central differences:
Forward difference:

$\Delta^n g(x)$ = $\sum_{i=0}^n (-1)^i \binom{n}{i} g(x + (n - i)\Delta x)$

Central difference:


$\delta^n g(x)$ = $\sum_{i=0}^n (-1)^i \binom{n}{i} g\left(x + \left(\frac{n}{2} - i\right)\Delta x\right)$

Backward difference:


$\bigtriangledown^n g(x)$ = $\sum_{i=0}^n (-1)^i \binom{n}{i} g(x - i\Delta x)$
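
As a sketch, these nth-order differences translate directly into code (the helper names and the test function are illustrative choices, not from the text); dividing by $(\Delta x)^n$ gives an approximation to the nth derivative:

```python
from math import comb

def forward_difference(g, x, dx, n):
    # Delta^n g(x) = sum_{i=0}^{n} (-1)^i C(n, i) g(x + (n - i) dx)
    return sum((-1) ** i * comb(n, i) * g(x + (n - i) * dx) for i in range(n + 1))

def backward_difference(g, x, dx, n):
    # nabla^n g(x) = sum_{i=0}^{n} (-1)^i C(n, i) g(x - i dx)
    return sum((-1) ** i * comb(n, i) * g(x - i * dx) for i in range(n + 1))

def central_difference(g, x, dx, n):
    # delta^n g(x) = sum_{i=0}^{n} (-1)^i C(n, i) g(x + (n/2 - i) dx)
    return sum((-1) ** i * comb(n, i) * g(x + (n / 2 - i) * dx) for i in range(n + 1))

# Dividing by dx**n approximates the n-th derivative, e.g. for g(x) = x**3:
g = lambda x: x ** 3
print(central_difference(g, 2.0, 0.01, 3) / 0.01 ** 3)   # ~6, the exact third derivative
```
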
Let's derive the finite difference approximation for the first derivative of a function from the Taylor series. The Taylor series expansion of $g$ about $x$ is:

g(x + $\Delta x$) = g(x) + $\frac{g'(x)}{1!} \Delta x$ + $\frac{g''(x)}{2!} (\Delta x)^2$ +....+ $\frac{g^{(n)}(x)}{n!} (\Delta x)^n$ + $R_n(x_o)$

Here the last term, $R_n(x_o)$, is the remainder. Truncating the series after the first-order term, we have

$g(x + \Delta x)$ = $g(x)$ + $g'(x) \Delta x$ + $R_1(x_o)$

After replacing $x$ with $a$, we get

$g(a + \Delta x)$ = $g(a)$ + $g'(a) \Delta x$ + $R_1(x_o)$

Divide each side by $\Delta x$

$\frac{g(a + \Delta x)}{\Delta x}$ = $\frac{g(a)}{\Delta x}$ + $g'(a)$ + $\frac{R_1(x_o)}{\Delta x}$

Solve for $g'(a)$

$g'(a)$ = $\frac{g(a+\Delta x)-g(a)}{\Delta x}$ - $\frac{R_1(x_o)}{\Delta x}$

Assuming the remainder $R_1$ is negligibly small, the approximate value of the first derivative of the function $g$ is:
$g'(a)$ $\approx$ $\frac{g(a+\Delta x)-g(a)}{\Delta x}$
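
Because the discarded term behaves like $\frac{R_1}{\Delta x} = O(\Delta x)$, halving $\Delta x$ roughly halves the error of the forward difference. A small Python check illustrates this (the test function $e^x$ and the step sizes are arbitrary choices):

```python
import math

g, a = math.exp, 1.0            # g(x) = e^x, so g'(a) = e^a
for dx in (0.1, 0.05, 0.025, 0.0125):
    approx = (g(a + dx) - g(a)) / dx
    # the error behaves like R_1 / dx = g''(c) * dx / 2, i.e. it shrinks linearly with dx
    print(f"dx = {dx:<7} error = {abs(approx - math.exp(a)):.6f}")
```
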
Constructing finite difference analogs of second-order derivatives is a more complicated task than approximating first derivatives.
The finite difference method generates approximate solutions to differential equations by using finite difference equations (FDEs) to approximate derivatives. In other words, the finite difference method is a means of finding numerical solutions to partial differential equations and linear complementarity problems. The finite difference method is abbreviated FDM.
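
As a concrete sketch (the boundary-value problem $u''(x) = -1$ on $[0, 1]$ with $u(0) = u(1) = 0$ is an illustrative choice, not from the text), the central difference for the second derivative turns the differential equation into a tridiagonal linear system:

```python
import numpy as np

def solve_bvp(n=50):
    # Solve u''(x) = -1 on [0, 1] with u(0) = u(1) = 0 using central differences:
    # (u_{j-1} - 2 u_j + u_{j+1}) / h^2 = -1 at each interior grid point.
    h = 1.0 / n
    x = np.linspace(0.0, 1.0, n + 1)
    A = np.zeros((n - 1, n - 1))
    for j in range(n - 1):
        A[j, j] = -2.0
        if j > 0:
            A[j, j - 1] = 1.0
        if j < n - 2:
            A[j, j + 1] = 1.0
    b = np.full(n - 1, -h ** 2)
    u = np.zeros(n + 1)
    u[1:-1] = np.linalg.solve(A, b)    # boundary values u_0 = u_n = 0 are already in place
    return x, u

x, u = solve_bvp()
print(np.max(np.abs(u - 0.5 * x * (1 - x))))   # exact solution is x(1 - x)/2
```

Because the exact solution here is a quadratic, the central difference reproduces it to machine precision; for a general problem the error would shrink as the grid is refined.
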
Example:

The following results are given: $\sqrt{9}$ = 3, $\sqrt{10}$ = 3.162, $\sqrt{11}$ = 3.316

Using the above results, find $\sqrt{8}$.

Solution: Let $g_x$ = $\sqrt{x}$ where $x$ = 9, 10, 11. We need to find $g_8$.

The forward difference table for the given values is:

x      $g_x$      $\Delta g_x$      $\Delta^2 g_x$
9      3          0.162             -0.008
10     3.162      0.154
11     3.316
Apply Newton's forward interpolation formula:

$g_x$ = $g_a$ + $u \Delta g_a$ + $\frac{u(u - 1)}{2!} \Delta^2 g_a$ + ...

where $u$ = $\frac{x-a}{h}$

First find the value of $u$ at $x$ = 8, $h$ = 1 and $a$ = 9

=> $u$ = $\frac{8-9}{1}$ = -1

Now, $g_8$ = 3 + (-1) $\times$ 0.162 + $\frac{(-1)(-2)}{2!}$ $\times$ (-0.008)

= 3 - 0.162 - 0.008

= 2.830

which agrees well with the true value $\sqrt{8}$ = 2.828.
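
This computation can also be scripted. The sketch below (the helper newton_forward is a hypothetical function written for this example) builds the forward difference table from the tabulated values and sums the terms of Newton's forward formula:

```python
def newton_forward(xs, ys, x):
    # Build the forward difference table: diffs[k][i] = Delta^k y_i
    h = xs[1] - xs[0]
    diffs = [list(ys)]
    while len(diffs[-1]) > 1:
        prev = diffs[-1]
        diffs.append([prev[i + 1] - prev[i] for i in range(len(prev) - 1)])
    # Evaluate g_x = g_a + u*Delta g_a + u(u-1)/2! * Delta^2 g_a + ...
    u = (x - xs[0]) / h
    term, total = 1.0, 0.0
    for k, column in enumerate(diffs):
        total += term * column[0]
        term *= (u - k) / (k + 1)      # next factor u(u-1)...(u-k) / (k+1)!
    return total

print(newton_forward([9, 10, 11], [3, 3.162, 3.316], 8))   # ~2.830 (sqrt(8) = 2.828...)
```
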
In the theory of finite differences, several types of operators are used.
Let us suppose that the equidistant values of the independent variable $x$ are $a$, $a + h$, $a$ + 2$h$,....., $a + nh$, where $h$ is the common interval of difference (the step size) and $a$ is the initial argument.

Let the corresponding values of the dependent variable, $y$ = $g(x)$, be $g(a)$, $g(a + h)$,...., $g(a + nh)$.

Forward difference operator, $\Delta$ :
The forward difference operator $\Delta$ for first order differences is defined as:
$\Delta$ $g(x)$ = $g (x + h)$ - $g (x)$ where $x$ = $a$, $a + h$, $a$ + 2$h$,.....

We can get second-order differences by applying the operator $\Delta$ to the first-order differences:

$\Delta^2$ $g(a)$ = $\Delta$ ($\Delta$ $g(a)$) = $g(a + 2h)$ - 2$g(a + h)$ + $g(a)$

Backward difference operator, $\bigtriangledown $: The backward difference operator $\bigtriangledown $ is defined as:

$\bigtriangledown $ $g(x + h)$ = $g (x + h)$ - $g (x)$ = $\Delta$ $g(x)$

=> The backward difference of $g(x + h)$ is the same as the forward difference of $g(x)$.

The stepping operator $E$ : In case of the arguments at equal intervals $h$, the operator $E$ is defined as:

$E g(x)$  = $g (x + h)$

In general: $E$$^s$ $g(x)$ = $g (x + sh)$

The central difference operator, $\delta$:
This operator can be defined by
First central difference operator: $\delta$ = $E^{\frac{1}{2}}$ - $E^{-\frac{1}{2}}$

Second central difference operator: $\delta^2$ = $(E^{\frac{1}{2}} - E^{-\frac{1}{2}})^2$ = $E - 2 + E^{-1}$
The averaging operator, $\mu$: This can be defined as:
$\mu$ = $\frac{1}{2}$ ($E^{\frac{1}{2}}$ + $E^{\frac{-1}{2}}$)
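
On a grid with step $h$ these operators can be written down directly; the short sketch below (the sample function is an arbitrary choice, not from the text) checks numerically that $\bigtriangledown g(x + h)$ = $\Delta g(x)$ and that $\delta^2$ = $E - 2 + E^{-1}$:

```python
h = 0.1
g = lambda x: x ** 2 + 1                                      # arbitrary sample function

E        = lambda g, x, s=1: g(x + s * h)                     # stepping operator: E^s g(x) = g(x + s h)
forward  = lambda g, x: g(x + h) - g(x)                       # Delta g(x)
backward = lambda g, x: g(x) - g(x - h)                       # nabla g(x)
central2 = lambda g, x: g(x + h) - 2 * g(x) + g(x - h)        # delta^2 g(x) = (E - 2 + E^{-1}) g(x)
mu       = lambda g, x: 0.5 * (g(x + h / 2) + g(x - h / 2))   # averaging operator

x = 1.0
print(abs(backward(g, x + h) - forward(g, x)) < 1e-12)                          # True
print(abs(central2(g, x) - (E(g, x, 1) - 2 * g(x) + E(g, x, -1))) < 1e-12)      # True
print(mu(g, x))                                               # average of g at x +- h/2
```
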

Finite differences is the study of the relations that exist between the values assumed by a function whenever the independent variable changes by finite jumps, whether equal or unequal.
Let us consider a function $y$ = $s_x$ where $y$ is a dependent variable and $x$ is an independent variable. Suppose we are given the equidistant values $b, b + h, b + 2h$, ...., of the variable $x$ at an interval of $h$. Then the corresponding values of the variable $y$ = $s_x$ are $s_b$, $s_{b+h}$, $s_{b+2h}$,......

Now we shall define the forward difference operator ($\Delta$). By definition, for first-order differences we have
$\Delta$$s_x$ = $s_{x+h}$ - $s_x$
By substituting the values of $x$, we have

$\Delta$s$_b$ = s$_{b+h}$ - s$_b$
$\Delta$$s_{b+h}$ = $s_{b+2h}$ - $s_{b+h}$
and so on

Again for second order differences:
$\Delta^2$s$_x$ = $\Delta$($\Delta$$s_x$) = $\Delta$($s_{x+h}$ - $s_x$)

= (s$_{x+2h}$ - s$_{x+h}$) - ( s$_{x+h}$ - s$_x$)

= $\Delta$s$_{x+h}$ - $\Delta$s$_{x}$

or $\Delta^2$s$_b$ = $\Delta$s$_{b+h}$ - $\Delta$s$_{b}$

$\Delta^2$s$_{b+h}$ = $\Delta$s$_{b+2h}$ - $\Delta$s$_{b+h}$

and so on

Similarly, we can obtain finite differences of higher order.

Forward finite difference table:

x          $s_x$          $\Delta s_x$          $\Delta^2 s_x$          $\Delta^3 s_x$
b          $s_b$          $\Delta s_b$          $\Delta^2 s_b$          $\Delta^3 s_b$
b + h      $s_{b+h}$      $\Delta s_{b+h}$      $\Delta^2 s_{b+h}$
b + 2h     $s_{b+2h}$     $\Delta s_{b+2h}$
b + 3h     $s_{b+3h}$

Backward finite difference table

The backward differences, usually denoted by $\bigtriangledown $, are defined as follows:
$\bigtriangledown $s$_{x+h}$ = s$_{x+h}$ - s$_{x}$ for $x$ = $b$, $b + h$, $b + 2h$,....

x          $s_x$          $\bigtriangledown s_x$          $\bigtriangledown^2 s_x$          $\bigtriangledown^3 s_x$
b          $s_b$
b + h      $s_{b+h}$      $\bigtriangledown s_{b+h}$
b + 2h     $s_{b+2h}$     $\bigtriangledown s_{b+2h}$      $\bigtriangledown^2 s_{b+2h}$
b + 3h     $s_{b+3h}$     $\bigtriangledown s_{b+3h}$      $\bigtriangledown^2 s_{b+3h}$      $\bigtriangledown^3 s_{b+3h}$
A finite difference scheme is used for numerically solving partial differential equations. While converting PDEs to finite difference schemes we need to assume a constant step size, i.e. the sampling rates along space and time are constant. The finite difference schemes are constructed by a term-by-term grid approximation of the differential equation. Finite difference schemes are helpful for the valuation of derivatives when closed-form analytical solutions do not exist, or for solutions to complicated multidimensional models. By discretizing the continuous-time partial differential equation that the derivative security must follow, it is possible to approximate the evolution of the derivative and therefore the present value of the security.
Below are some examples based on finite difference:

Example 1:
 Given $S_1$ = 10, $S_2$ = 21 and $S_3$ = 35, find $\Delta^2$S$_x$

Solution:
The difference table is given below:

x     $S_x$     $\Delta S_x$     $\Delta^2 S_x$
1     10        11               3
2     21        14
3     35

Hence $\Delta^2$S$_x$ = 3.
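
A quick check of this differencing in plain Python, using the values from the example:

```python
S = [10, 21, 35]
d1 = [S[k + 1] - S[k] for k in range(len(S) - 1)]     # first differences: [11, 14]
d2 = [d1[k + 1] - d1[k] for k in range(len(d1) - 1)]  # second differences: [3]
print(d1, d2)
```
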

Example 2:
 If S = 5$x^3$ + 3$x$ + 2, calculate the values of S corresponding to x = 0, 1, 2, 3, 4, 5, 6.

Solution: We have S = 5$x^3$ + 3$x$ + 2

Putting $x$ = 0, 1, 2, 3, 4, 5 and 6 in the given equation, we get

$S_0$ = 5$x^3$ + 3$x$ + 2 = 5 $\times$ 0 + 3 $\times$ 0 + 2 = 2

$S_1$ = 5$x^3$ + 3$x$ + 2 = 5 $\times$ 1 + 3 $\times$ 1 + 2 = 10

$S_2$ = 5$x^3$ + 3$x$ + 2 = 5 $\times$ 8 + 3 $\times$ 2 + 2 = 48

$S_3$ = 5$x^3$ + 3$x$ + 2 = 5 $\times$ 27 + 3 $\times$ 3 + 2 = 146

$S_4$ = 5$x^3$ + 3$x$ + 2 = 5 $\times$ 64 + 3 $\times$ 4 + 2 = 334

$S_5$ = 5$x^3$ + 3$x$ + 2 = 5 $\times$ 125 + 3 $\times$ 5 + 2 = 642

$S_6$ = 5$x^3$ + 3$x$ + 2 = 5 $\times$ 216 + 3 $\times$ 6 + 2 = 1100
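
These values can be generated and differenced in a few lines of Python; for a cubic, the third differences come out constant (here 5 $\times$ 3! = 30), which is a useful sanity check:

```python
S = [5 * x ** 3 + 3 * x + 2 for x in range(7)]
print(S)                            # [2, 10, 48, 146, 334, 642, 1100]
d = S
for order in (1, 2, 3):
    d = [d[k + 1] - d[k] for k in range(len(d) - 1)]
    print(order, d)                 # third differences are constant: 5 * 3! = 30
```
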
Depending on which combination of schemes we use when discretizing the equation, we obtain explicit, implicit or Crank-Nicolson methods. While working with finite difference approximations we also need to discretize the boundary conditions. Let's set up the finite difference method:

In $\frac{\partial g}{\partial t}$ + $rs \frac{\partial g}{\partial s}$ + $\frac{\sigma^2 s^2}{2} \frac{\partial^2 g}{\partial s^2}$ = $rg$, writing $g_{i,j}$ for the grid value at time step $i$ and asset-price step $j$ (so $t = i\Delta t$ and $s = j\Delta s$), we have

Central difference -
$\frac{\partial g}{\partial s}$ $\approx$ $\frac{g_{i,j+1} - g_{i,j-1}}{2 \Delta s}$

Forward difference -
$\frac{\partial g}{\partial t}$ $\approx$ $\frac{g_{i+1,j} - g_{i,j}}{ \Delta t}$

and $\frac{\partial^2 g}{\partial s^2}$ $\approx$ $\frac{g_{i,j+1} + g_{i,j-1}-2g_{i,j}}{ (\Delta s)^2}$ , $rg$ $\approx$ r g$_{i,j}$

Rewriting the above equations, we get an implicit scheme, for $i$ = $N$ - 1, $N$ - 2, ..., 1, 0 and $j$ = 1, 2, 3, ...., $M$ - 1:

$m_j g_{i,j-1} + n_j g_{i,j} + o_j g_{i,j+1} = g_{i+1,j}$

where $m_j$ = $\frac{\Delta t}{2}$ $(jr - j^2 \sigma^2)$

$n_j$ = 1 + $\Delta t$ $(j^2 \sigma^2 + r)$

$o_j$ = $-\frac{\Delta t}{2}$ $(jr + j^2 \sigma^2)$
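
As a sketch of how this implicit scheme can be implemented, the Python code below prices a European put under the PDE above, stepping backward in time from the payoff and solving the tridiagonal system at each step. The contract and grid parameters (K, T, r, sigma, S_max, M, N) and the put boundary conditions are illustrative assumptions, not part of the text:

```python
import numpy as np

K, T, r, sigma = 50.0, 1.0, 0.05, 0.2          # illustrative contract parameters
M, N = 100, 100                                 # price steps j = 0..M, time steps i = 0..N
S_max = 2 * K
ds, dt = S_max / M, T / N

j = np.arange(1, M)                             # interior price nodes j = 1 .. M-1
m = 0.5 * dt * (j * r - sigma ** 2 * j ** 2)    # coefficient of g_{i,j-1}
n = 1.0 + dt * (sigma ** 2 * j ** 2 + r)        # coefficient of g_{i,j}
o = -0.5 * dt * (j * r + sigma ** 2 * j ** 2)   # coefficient of g_{i,j+1}

A = np.diag(n) + np.diag(m[1:], -1) + np.diag(o[:-1], 1)   # tridiagonal system matrix

g = np.maximum(K - np.arange(M + 1) * ds, 0.0)  # put payoff at maturity, i = N
for i in range(N - 1, -1, -1):                  # step backward in time
    rhs = g[1:M].copy()
    lower = K * np.exp(-r * (T - i * dt))       # boundary value at s = 0 for a put
    rhs[0] -= m[0] * lower                      # move the known boundary value to the RHS
    g[1:M] = np.linalg.solve(A, rhs)            # upper boundary g_{i,M} = 0 contributes nothing
    g[0], g[M] = lower, 0.0

print(round(float(g[int(K / ds)]), 3))          # value at s = K; Black-Scholes gives about 2.79 here
```

The explicit and Crank-Nicolson variants differ only in how the spatial terms are split between time levels $i$ and $i + 1$.
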
The finite-difference frequency-domain (FDFD) method is used, for example, to study sub-wavelength lensing effects in left-handed materials. The problems solved are usually based on electromagnetism and acoustics, with finite difference approximations of the derivative operators in the differential equation.
In electromagnetism, finite difference frequency domain problems are categorized in two ways:

1) At a constant frequency, find the response to a current density.
2) In the absence of sources, find the normal modes of a structure.
The finite difference method is widely applicable not only in mathematics but also in various other fields. A few important applications are listed below:

1) Finite differences are used in numerical analysis to solve ordinary and partial differential equations numerically.

2) An important application of the finite difference method is in computational science and engineering disciplines, such as thermal engineering, fluid mechanics, etc.