Prepare The Image


Deblur the Image with Conjugate Gradient While Collecting Data


Negative Energy Norm -> This Should Not Work at All

The energy norm of the error is not monotonically decreasing, meaning that the matrix is not positive definite. Conjugate gradient doesn't guarantee convergence in this case.

In this case it happened to work anyway, but in general it should not: if the energy norm of the error goes negative, the matrix cannot be positive semidefinite, which means the quadratic objective can be a saddle with no minimum. I am confident, though, that the matrix is symmetric as long as the kernel size is odd.
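To see what that looks like, here is a minimal sketch using a hypothetical 1D circulant box blur (not the notebook's actual 2D operator): plain CG, recording the energy norm $e^TAe$ of the error at every step. Since this $A$ is symmetric but indefinite, the recorded values are not guaranteed to decrease and can dip below zero.

```python
# Hypothetical 1D stand-in: a symmetric circulant box blur (not the notebook's
# exact 2D operator). Plain CG, recording the energy norm e^T A e of the error.
import numpy as np
from scipy.linalg import circulant

n = 64
c = np.zeros(n)
c[[0, 1, -1]] = 1.0 / 3.0                  # box blur, circular boundary
A = circulant(c)                           # symmetric but NOT positive definite
print("smallest eigenvalue:", np.linalg.eigvalsh(A).min())   # negative

x_true = np.random.default_rng(0).standard_normal(n)
b = A @ x_true

x = np.zeros(n)
r = b - A @ x
p = r.copy()
energy = []
for _ in range(n):
    Ap = A @ p
    alpha = (r @ r) / (p @ Ap)
    x = x + alpha * p
    r_new = r - alpha * Ap
    e = x_true - x
    energy.append(e @ (A @ e))             # may go negative: A is indefinite
    if np.linalg.norm(r_new) < 1e-12:
        break
    p = r_new + ((r_new @ r_new) / (r @ r)) * p
    r = r_new

print("min energy norm seen:", min(energy))
```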

Successive Blurring

Blur the same image, but twice this time.

This time around, $A^2 = A^TA$, and $A^TA$ is always positive semi-definite (positive definite if the boundary conditions are Dirichlet for the 2D convolution, e.g. "fill" with zeros). Whenever we blur the image an even number of times, the CG solver is able to minimize the energy norm of the error vector properly.
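As a concrete sketch of the even-number-of-blurs case (with a hypothetical 16x16 random image and a deliberately mild symmetric 3x3 kernel standing in for the real data, and zero-fill boundaries): blur twice and hand the SPD system $A^2x = b_2$ to a matrix-free CG solver.

```python
# Sketch: blur twice, then solve A^2 x = b2 with matrix-free CG.
# The image and the (deliberately mild) kernel are hypothetical stand-ins.
import numpy as np
from scipy.ndimage import convolve
from scipy.sparse.linalg import LinearOperator, cg

img = np.random.default_rng(0).random((16, 16))       # stand-in "original" image
k1 = np.array([0.1, 0.8, 0.1])
k = np.outer(k1, k1)                                   # odd, symmetric 3x3 kernel

def blur(v):
    """Apply A once: 2D convolution with zero-fill (Dirichlet) boundaries."""
    return convolve(v.reshape(img.shape), k, mode="constant", cval=0.0).ravel()

b2 = blur(blur(img.ravel()))                           # the image blurred twice
A2 = LinearOperator((img.size, img.size), matvec=lambda v: blur(blur(v)))

x, info = cg(A2, b2, maxiter=1000)                     # info == 0: tolerance reached
print("cg exit code:", info)
print("max reconstruction error:", np.abs(x - img.ravel()).max())
```

With such a mild kernel the reconstruction error stays small; more aggressive kernels make $A^2$ far worse conditioned, which is exactly the failure mode discussed at the end of this section.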

Yes, if you try N = 3, we get a negative energy norm again.

Observe the smooth decay of the energy norm.


When Does It Fail?

Symmetric Matrix Representation and Double Blurring Exploit

$$\begin{aligned} Ax &= b \\ A^TAx &= A^Tb \\ A^2x &= Ab \end{aligned}$$

Instead of solving the first equation directly, we make use of the fact that $A$ is symmetric (odd-sized kernel), and then interpret $A^2$ as blurring the image twice.

Now we have a deblurring algorithm that works for any odd kernel size.
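Here is a minimal sketch of that pipeline, again with a hypothetical mild 3x3 kernel, a random stand-in image, and zero-fill boundaries: blur the observed image one more time and solve $A^2x = Ab$ with matrix-free CG.

```python
# Sketch of the exploit: given a once-blurred observation b = A x, solve
# A^2 x = A b with matrix-free CG. Image and kernel are hypothetical stand-ins.
import numpy as np
from scipy.ndimage import convolve
from scipy.sparse.linalg import LinearOperator, cg

img = np.random.default_rng(1).random((16, 16))        # stand-in "original" image
k1 = np.array([0.15, 0.7, 0.15])
k = np.outer(k1, k1)                                    # odd, symmetric 3x3 kernel

def blur(v):
    """Apply A once: 2D convolution with zero-fill (Dirichlet) boundaries."""
    return convolve(v.reshape(img.shape), k, mode="constant", cval=0.0).ravel()

b = blur(img.ravel())                                   # the observed blurred image
A2 = LinearOperator((img.size, img.size), matvec=lambda v: blur(blur(v)))

x, info = cg(A2, blur(b), maxiter=1000)                 # solve A^2 x = A b
print("cg exit code:", info)
print("max reconstruction error:", np.abs(x - img.ravel()).max())
```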

We have another problem: the matrix is just too poorly conditioned at this point. The recovered image satisfies the equation, but it's impossible to get back the original because the blurring is too aggressive.
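A quick way to see the conditioning problem on a toy case (a hypothetical 1D blur matrix, since forming the full 2D matrix is expensive): for a symmetric $A$, squaring the operator squares the condition number, so the $A^2$ trick pays for its guaranteed positive semi-definiteness with a much harder system.

```python
# Hypothetical 1D illustration: a symmetric blur matrix with zero-fill
# boundaries, kernel [0.25, 0.5, 0.25]. Squaring A squares its condition number.
import numpy as np
from scipy.linalg import toeplitz

n = 128
col = np.zeros(n)
col[0], col[1] = 0.5, 0.25
A = toeplitz(col)                             # symmetric 1D blur matrix

print("cond(A)  :", np.linalg.cond(A))
print("cond(A^2):", np.linalg.cond(A @ A))    # roughly cond(A)**2
```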