Diffusion-based filtering is a method for smoothing image data to remove noise and artifacts without destroying important image content such as edges and other high-frequency features. It is based on creating a succession of increasingly blurred versions of the image by solving a partial differential equation (PDE) that describes a diffusion process.
Diffusion resembles the process that creates a scale space, in which an image generates a parameterized family of successively more blurred images. These images are produced by convolving the image with a 2D isotropic Gaussian filter whose width increases with the parameter value. This diffusion process transforms the original image linearly and in a spatially independent way.
The diffusion process has a generalized version called ‘anisotropic’ diffusion. It also produces a suite of parameterized images, but each resulting image is a combination of the original image and a filter that depends on the local content of the original image. Thus, anisotropic diffusion is a ‘non-linear and space-variant’ transformation of the original image, while the plain diffusion process is a ‘linear and space-invariant’ transformation of the original image.
Table of contents:
- Image filtering by the diffusion process
- The diffusion equation
- Nonlinear or Anisotropic Diffusion
- Edge-preserving diffusion
- Applications of diffusion-based image filtering
- Image filtering by the diffusion process
Convolution is the classic example of image filtering related to the diffusion process. The convolution of an input image f(x) with a kernel G(x) can be written as

g(x) = (G ∗ f)(x) = ∫ G(x − x′) f(x′) dx′

In the equation above, G(x) is an example of a linear filter.
Convolutions can be computed efficiently in the frequency domain, because a convolution corresponds to a simple (frequency-wise) product in frequency space, and the Fast Fourier Transform enables rapid conversion to and from frequency space.
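As a minimal sketch of this idea using NumPy (the kernel size, σ, and random test image below are illustrative choices, not values from the text), a Gaussian convolution can be computed as a product in frequency space:

```python
import numpy as np

def gaussian_kernel(size, sigma):
    """Normalized 2D isotropic Gaussian kernel."""
    ax = np.arange(size) - (size - 1) / 2.0
    xx, yy = np.meshgrid(ax, ax)
    k = np.exp(-(xx**2 + yy**2) / (2.0 * sigma**2))
    return k / k.sum()

def fft_convolve(image, kernel):
    """Convolve via a frequency-wise product (circular boundaries)."""
    kh, kw = kernel.shape
    # Zero-pad the kernel to the image size, then shift its centre
    # to the origin so the result is not translated.
    padded = np.zeros_like(image, dtype=float)
    padded[:kh, :kw] = kernel
    padded = np.roll(padded, (-(kh // 2), -(kw // 2)), axis=(0, 1))
    return np.real(np.fft.ifft2(np.fft.fft2(image) * np.fft.fft2(padded)))

image = np.random.rand(64, 64)
smoothed = fft_convolve(image, gaussian_kernel(9, sigma=2.0))
```

Because the kernel sums to one, the mean brightness of the image is preserved while local fluctuations are averaged out.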
In practice, however, linear filters are often suboptimal. For instance, when smoothing (filtering) or denoising a signal, Gaussian smoothing removes both the noise and part of the signal, causing semantically meaningful structures to vanish as well. Instead, one would prefer to eliminate noise in a way that protects semantically relevant structures. In theory, this might be achieved with a Gaussian smoothing whose filter width is adapted to the local structure (larger in noisy areas, smaller at important edges).
Such an adaptive filter can be written as

g(x) = ∫ G_{σ(x)}(x − x′) f(x′) dx′

where now the width σ(x) of the convolution kernel G depends on the brightness values in a local neighbourhood.
A more elegant way to model such an adaptive denoising process, or non-linear filtering in general, is by means of diffusion filtering.
The major observation we need to make here is that image smoothing can be modelled with a diffusion process. In this process, the local brightness diffuses to neighbouring pixels due to local concentration differences.
Mathematically, the diffusion processes used to filter images are described by partial differential equations (PDEs).
- The Diffusion Equation
Diffusion is, in essence, a physical process that minimizes differences in the spatial concentration u(x, t) of a substance.
Diffusion equation can be derived from two basic equations of ‘Fick’s Law’ and ‘The continuity equation’.
Fick’s law states that a concentration gradient causes a compensating flux, j = −g ∇u, while the continuity equation expresses conservation of the substance, ∂u/∂t = −div j. Combining the two gives the diffusion equation:

∂u/∂t = div(g ∇u)
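The diffusion equation ∂u/∂t = div(g ∇u) can be solved numerically; for constant diffusivity g = 1 it reduces to the heat equation. Below is a minimal explicit finite-difference sketch, where the step size τ, the reflecting boundary handling, and the test image are illustrative choices (τ ≤ 0.25 is the usual stability bound for this 5-point scheme):

```python
import numpy as np

def heat_diffusion(u, n_steps=10, tau=0.2):
    """Explicit scheme for du/dt = Laplacian(u), the linear
    diffusion (heat) equation; requires tau <= 0.25 for stability."""
    u = u.astype(float).copy()
    for _ in range(n_steps):
        # 5-point Laplacian with reflecting (Neumann) boundaries.
        p = np.pad(u, 1, mode="edge")
        lap = (p[:-2, 1:-1] + p[2:, 1:-1]
               + p[1:-1, :-2] + p[1:-1, 2:] - 4.0 * u)
        u += tau * lap
    return u

noisy = np.random.rand(32, 32)
smoothed = heat_diffusion(noisy, n_steps=50)
```

With reflecting boundaries no substance leaves the domain, so the mean brightness is conserved while spatial differences are evened out, exactly as the physical analogy suggests.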
- Non-linear or Anisotropic diffusion
The general diffusion equation is

∂u/∂t = div(g ∇u)

Depending on the diffusivity g in this equation:
- For g = 1 (or any constant g ∈ R), the process is known as a linear, isotropic, homogeneous diffusion.
- The process is known as an inhomogeneous diffusion if the diffusivity is space-dependent, i.e., g=g(x).
- It is referred to as a nonlinear diffusion if the diffusivity depends on u, i.e. g = g(u), since the equation is then no longer linear in u.
- An anisotropic diffusion process is one where the diffusivity g is matrix-valued. A matrix-valued diffusivity results in processes where the diffusivity is different in various directions.
- Edge-Preserving diffusion
The simple idea behind edge-preserving diffusion is to apply less diffusion at locations with strong edge information.
- The gradient norm |∇u| = √(u_x² + u_y²) serves as an edge indicator in the diffusion process
- The diffusivity should decrease with increasing |∇u| in order to preserve the edges
The Perona-Malik model (anisotropic diffusion), with a diffusivity such as g(|∇u|) = 1 / (1 + |∇u|²/λ²), had a huge impact on image processing because it allowed much better edge detection than classical edge detectors.
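The scheme below is a hedged sketch of Perona-Malik diffusion using the common explicit discretization with one-sided neighbour differences; the contrast parameter λ, step size τ, iteration count, and the synthetic step-edge test image are illustrative assumptions, not values from the text:

```python
import numpy as np

def perona_malik(u, n_steps=20, tau=0.2, lam=0.1):
    """Perona-Malik diffusion: the diffusivity g decays with the
    local gradient, so smoothing is suppressed at strong edges."""
    # Perona-Malik diffusivity g(s) = 1 / (1 + (s / lam)^2).
    g = lambda d: 1.0 / (1.0 + (d / lam) ** 2)
    u = u.astype(float).copy()
    for _ in range(n_steps):
        # One-sided differences to the four neighbours,
        # with reflecting (Neumann) boundaries.
        p = np.pad(u, 1, mode="edge")
        north = p[:-2, 1:-1] - u
        south = p[2:, 1:-1] - u
        west = p[1:-1, :-2] - u
        east = p[1:-1, 2:] - u
        u += tau * (g(north) * north + g(south) * south
                    + g(west) * west + g(east) * east)
    return u

# A step edge plus noise: diffusion removes the noise, keeps the edge.
rng = np.random.default_rng(0)
img = np.zeros((32, 32))
img[:, 16:] = 1.0
noisy = img + 0.05 * rng.standard_normal(img.shape)
denoised = perona_malik(noisy)
```

Within the flat regions the neighbour differences are small, so g is close to 1 and smoothing proceeds; across the step edge the difference is large, g is nearly 0, and almost no brightness leaks across.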
- Applications of Diffusion-based Image filtering
- Digital photos with noise can be cleaned up using anisotropic diffusion without introducing edge blur.
- Solving the anisotropic diffusion equations with a constant diffusion coefficient yields the heat equation, which is equivalent to Gaussian blurring. This is excellent for eliminating noise but blurs edges indiscriminately. When the diffusion coefficient is instead chosen as an edge-avoiding function, as in Perona-Malik, the resulting equations encourage diffusion (and therefore smoothing) inside regions of smooth image intensity and suppress it across strong edges. As a result, the image's edges are kept while noise is eliminated.
- Anisotropic diffusion can be employed in edge detection algorithms in a manner similar to noise removal. Using an edge-seeking diffusion coefficient for a predetermined number of iterations, the image converges towards a piecewise constant image, with the boundaries between the constant regions identified as edges.
In this blog, we explored diffusion-based image filtering through linear filtering, non-linear (anisotropic) smoothing, the inhomogeneous diffusion process, and edge-preserving diffusion, along with the equation of the general diffusion process.
We also learned some of the applications of diffusion-based image filtering in real-world scenarios.