Deblurring Gaussian blur

Part of the Computational Imaging and Vision book series (CIVI, volume 27)

To discuss an application where very high-order Gaussian derivatives are applied, we study the deblurring of Gaussian blur by inverting the action of the diffusion equation, as originally described by Florack et al. [Florack et al. 1994b, ter Haar Romeny 1994a].
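The idea can be sketched briefly. A Gaussian-blurred image satisfies the diffusion equation ∂L/∂t = ΔL with t = σ²/2, so deblurring amounts to running the diffusion backwards in time. A truncated Taylor expansion of the inverse gives L(0) ≈ Σₙ (−t)ⁿ/n! Δⁿ L(t), i.e. a sum of high-order Laplacians of the blurred image. The following is a minimal conceptual sketch of that series, not the chapter's actual implementation: it uses a discrete 5-point Laplacian (`scipy.ndimage.laplace`) where the chapter computes Δⁿ with Gaussian derivative kernels, and the function name `deblur_taylor` and the default truncation order are my own choices.

```python
import math

import numpy as np
from scipy.ndimage import gaussian_filter, laplace

def deblur_taylor(blurred, sigma, order=4):
    """Approximately invert Gaussian blur of scale `sigma` by a
    truncated Taylor series of the inverse diffusion operator:
        L(0) ~ sum_n (-t)^n / n!  *  Laplacian^n L(t),   t = sigma^2 / 2.
    Truncating the series acts as a crude regularizer for this
    otherwise ill-posed backward-diffusion problem."""
    t = sigma**2 / 2.0
    result = blurred.astype(float).copy()
    term = blurred.astype(float)
    for n in range(1, order + 1):
        term = laplace(term)          # term now holds Laplacian^n of the input
        result += (-t) ** n / math.factorial(n) * term
    return result

# Demo on a smooth synthetic image: a Gaussian bump, blurred, then deblurred.
y, x = np.mgrid[0:64, 0:64]
original = np.exp(-((x - 32.0) ** 2 + (y - 32.0) ** 2) / (2 * 3.0**2))
blurred = gaussian_filter(original, sigma=1.0)
deblurred = deblur_taylor(blurred, sigma=1.0)
```

In practice the repeated discrete Laplacians amplify high-frequency noise, which is why the chapter's approach of taking well-posed Gaussian derivatives at a small observation scale matters for real images.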




Copyright information

© Springer Science + Business Media B.V. 2003