Date: Mon, 27 Nov 2006 14:41:31 -0500
From: Greg Hammett
To: Jin Chen
Subject: [Fwd: [Fwd: Re: status of modified incomplete factorization investigation]]
Jin, Thanks for informing me of the various methods you've tried and are trying. Here's a pointer to
the "Modified ILU" (MILU) algorithm, which is essential for ILU to be useful for the Poisson equation.
From later emails with Keyes and a PETSc developer, I confirmed that PETSc does not implement MILU at
present. For problems where multigrid works well, the absence of MILU isn't important. But for
problems where multigrid doesn't work for some reason, MILU might be a significant improvement over ILU.
(For "easy" diagonally dominant systems, MILU and ILU are likely to be very similar. It's only for hard,
non-diagonally-dominant systems that MILU can be a significant improvement over ILU.) If you can now
use multigrid on all of your hard problems and are no longer doing ILU anywhere, then this doesn't matter
for you. --Greg
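For concreteness, the modification MILU makes to plain ILU(0) (lumping each dropped fill entry onto the diagonal, so that the row sums of L*U match those of A) can be sketched in Python. This is an illustrative dense implementation with a hand-rolled preconditioned CG, not PETSc's or MATLAB's code; the function names `poisson2d`, `ilu0`, and `pcg_iters` are made up for the example:

```python
import numpy as np
from scipy.linalg import solve_triangular


def poisson2d(nx):
    """Dense 5-point Laplacian on an nx-by-nx interior grid (Dirichlet BCs)."""
    T = 2.0 * np.eye(nx) - np.eye(nx, k=1) - np.eye(nx, k=-1)
    return np.kron(np.eye(nx), T) + np.kron(T, np.eye(nx))


def ilu0(A, modified):
    """ILU(0) restricted to the pattern of A; if modified, lump each dropped
    fill entry onto the diagonal so the row sums of L*U match those of A."""
    n = A.shape[0]
    LU = A.copy()
    pattern = A != 0
    for i in range(1, n):
        for k in range(i):
            if not pattern[i, k]:
                continue
            LU[i, k] /= LU[k, k]
            m = LU[i, k]
            for j in range(k + 1, n):
                upd = m * LU[k, j]
                if upd == 0.0:
                    continue
                if pattern[i, j]:
                    LU[i, j] -= upd
                elif modified:
                    LU[i, i] -= upd  # MILU: dropped fill goes to the diagonal
    return LU


def pcg_iters(A, LU, tol=1e-8, maxiter=500):
    """Preconditioned CG on A x = 1; returns iterations to reach tol."""
    n = A.shape[0]
    L = np.tril(LU, -1) + np.eye(n)
    U = np.triu(LU)
    prec = lambda r: solve_triangular(U, solve_triangular(L, r, lower=True))
    b = np.ones(n)
    x = np.zeros(n)
    r = b.copy()
    z = prec(r)
    p = z.copy()
    rz = r @ z
    for it in range(1, maxiter + 1):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) <= tol * np.linalg.norm(b):
            return it
        z = prec(r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return maxiter


A = poisson2d(16)
ilu_iters = pcg_iters(A, ilu0(A, modified=False))
milu_iters = pcg_iters(A, ilu0(A, modified=True))
print("ILU(0):", ilu_iters, "MILU(0):", milu_iters)
```

On a 16x16 grid the modified factorization should take noticeably fewer CG iterations than plain ILU(0), and the gap widens under grid refinement, consistent with the O(N^{0.25}) vs O(N^{0.5}) scalings Gustafsson derives.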
-------- Original Message --------
Subject: [Fwd: Re: status of modified incomplete factorization investigation]
Date: Tue, 21 Nov 2006 12:08:09 -0500
From: Greg Hammett
To: Stephen C. Jardin
Steve, FYI, here's one of the email exchanges with Keyes on how simple ILU is
an ineffective preconditioner for Poisson problems, how Gustafsson's Modified ILU
(MILU) is better, and how MATLAB doesn't properly implement MILU. Gustafsson's
paper is posted at:
http://w3.pppl.gov/~hammett/hid/2006/Gustaffson78.pdf
I'll forward you another email or two on this topic.
--Greg
-------- Original Message --------
Subject: Re: status of modified incomplete factorization investigation
Date: Tue, 11 Apr 2006 07:14:36 -0400 (EDT)
From: David Keyes
To: Greg Hammett
CC: Mark Adams, David Keyes
References: <6D660B0F-A740-461D-9327-247E362E185C@pppl.gov>
<442DABB4.9090903@pppl.gov> <937971EF-7DB8-43C0-A8F1-CF8749183DB1@pppl.gov>
<44344442.3070509@pppl.gov>
<4435620E.6060305@pppl.gov>
<443B1FC6.1090608@pppl.gov>
Dear Greg,
As always, during this process, I am impressed with your thoroughness and
your dedication to getting to the bottom of your problem's numerics. If
the world of physicists had a higher percentage of you, we numerical
analysts would be on our toes and inspired to please all of the time.
Thanks for this latest installment of your literature review and
considerations.
I am grateful to Mark for filling the gaps in my own attention to this
issue while you were pursuing it hotly and I was bedecked with less
interesting things to do. It is interesting that MATLAB's MICCG does not
seem to deliver, and you have pinpointed for future reference an issue to
be investigated.
This does not close the book, only a chapter. We may return to MIC or we
may make you a convert of one of the many coefficient-adaptive forms of MG
that will, I hope, work for you. I sense a paper lurking...
All the best until our next meeting, which is never far ahead these days,
David
On Mon, 10 Apr 2006, Greg Hammett wrote:
> Mark and David,
>
> I am about to put this problem aside for a while to work on other things, but here is a status report.
>
> To clarify, I agree that Multigrid is the clear winner on the simple Poisson problem, as it converges
> in O(N) operations with O(1) iterations. But I am interested ultimately in more complicated problems
> for which I think Multigrid might not work, which is why I was looking for other alternatives. As
> you've told me, Multigrid is like a highly tuned sports car and works extremely well on some problems,
> but not so well on some others. One of the types of problems I'm interested in involves
> highly-anisotropic heat conduction (where the heat conduction is rapid along field lines). If I could
> coarsen the grid only along magnetic field lines, then I think multigrid would work. But the grid is
> not aligned to field lines, and coarsening the grid will average over directions in which the residual
> is not smooth and lose information about the directionality of the small-scale magnetic field; it
> seems to me that multigrid is not designed to work well in such a case. I'm also interested in cases
> where the matrices are not symmetric. This is why I was investigating preconditioned Newton-Krylov
> solvers (some of my problems are nonlinear as well, which is why I'm interested in the Newton part).
>
> Without preconditioning, GMRES didn't work well on simple Poisson and required O(N) iterations (which
> makes it no better than trying to solve a diffusion equation implicitly with a tiny time step). This
> problem was fixed by going to Allison Baker's Loose GMRES algorithm, which was able to reduce the
> number of iterations from O(N) down to O(N^{0.5}) (where N=N_x^2 for 2-D Poisson). This is consistent
> with the expected scaling that the number of iterations scales as O((lambda_max/lambda_min)^{0.5}),
> where lambda_max/lambda_min is the ratio of the max to min eigenvalues of the matrix. This
> sqrt(lambda_max/lambda_min) scaling is what one expects from theory for Loose GMRES or Dynamic
> Relaxation and is also the optimal scaling for Conjugate Gradients or for Chebyshev iteration. (But I
> like using Loose GMRES instead of CG because it can work for non-symmetric matrices as well,
> while working as well as CG on the symmetric test cases I did.)
>
> As the Gustafsson papers that you and David have pointed to me clearly show, a "modified incomplete
> factorization" preconditioner is able to further reduce the condition number of the preconditioned
> matrix from lambda_max/lambda_min to (lambda_max/lambda_min)^{0.5}, so that the number of iterations for
> preconditioned CG or preconditioned Loose-GMRES will drop to (lambda_max/lambda_min)^{0.25}, or
> O(N^{0.25}) for 2-D Poisson (this is the 2-D equivalent to the 3-D results David cited in his talk).
>
> I was hoping to get similarly good performance for the anisotropic conduction case as for the simple
> isotropic/Poisson problem. For problem sizes of interest to me (N_x ~ 400), I was hoping that
> preconditioning would reduce the number of iterations from O(400-800) to O(20-30) without increasing
> the amount of work per iteration too much, which would be good enough for me.
>
> As Gustafsson's papers make clear, the "modified preconditioner" is necessary to reduce the number of
> iterations from O(N^{0.5}) to O(N^{0.25}). But when I try the "modification" switch in MATLAB's incomplete
> Cholesky preconditioner, I don't see the expected scaling; instead it still scales as O(N^{0.5}). The
> conclusion is that either there is a bug in the MATLAB coding, or there is no "bug" per se, but the
> version of the "modified incomplete factorization" algorithm that MATLAB implements is not quite the same
> as Gustafsson's "modified incomplete factorization" algorithm and isn't actually able to get the same
> speedup.
>
> I haven't had time to fully decipher Gustafsson's papers, but his algorithm has a parameter "xsi"
> (given in Eq. 3.1 on p. 147 of the last Gustafsson paper you emailed, BIT 18, 142 (1978)). He says
> that xsi > 0 (it is presumably an order unity coefficient), and on p. 153 he says that numerical
> experiments show "that the number of iterations is almost independent of the parameter xsi in a fairly
> wide range". But the way Saad and MATLAB describe "modified incomplete factorization", there is no
> mention of a xsi parameter. So perhaps there is some subtle difference between the two versions of
> modified incomplete factorization that accounts for these differences?
>
> I have to deal with other things for a couple of months, so I am putting this problem aside for now.
> May return to it some time... Thanks for your suggestions and pointers to info in the literature.
>
> --Greg
>
>
>
> Mark Adams wrote:
> > >
> > > Given that this modification is essential to obtain the favorable sqrt(N_x) scaling, I'm somewhat
> > > surprised PETSc doesn't seem to have it. Am I missing something?
> >
> >
> > Multigrid has even better scaling properties and the MIC theory is silent on problems that multigrid
> > theory is applicable to, from what I understand. If someone shows that MIC is a big win on a class
> > of problems that no other PETSc solver addresses effectively then PETSc will get motivated to add
> > it. But they are human and can only do so much.
> >
> > Mark
> > **********************************************************************
> > Mark Adams Ph.D. Columbia University
> > 289 Engineering Terrace MC 4701
> > New York NY 10027
> > adams@pppl.gov www.columbia.edu/~ma2325
> > voice: 212.854.4485 fax: 212.854.8257
> > **********************************************************************
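As a sanity check on the sqrt(lambda_max/lambda_min) iteration scaling discussed in the thread above: the 2-D 5-point Laplacian with Dirichlet boundaries has known eigenvalues 4 sin^2(j pi h/2) + 4 sin^2(k pi h/2) with h = 1/(N_x+1), so doubling N_x roughly quadruples the condition number and doubles the sqrt(kappa) iteration estimate, i.e. iterations ~ O(N_x) = O(N^{0.5}) for N = N_x^2. A minimal sketch (the function name is illustrative):

```python
import numpy as np


def poisson2d_condition(nx):
    """Condition number of the 2-D 5-point Laplacian on an nx-by-nx grid,
    from its analytic eigenvalues (Dirichlet boundary conditions)."""
    k = np.arange(1, nx + 1)
    s = 4.0 * np.sin(k * np.pi / (2.0 * (nx + 1))) ** 2
    lam = s[:, None] + s[None, :]  # all sums lambda_j + lambda_k
    return lam.max() / lam.min()


ratio = poisson2d_condition(32) / poisson2d_condition(16)
print("kappa(32)/kappa(16) =", ratio)  # approaches 4 as the grid is refined
```

With Gustafsson's modified factorization the preconditioned condition number drops to about kappa^{0.5}, so the same estimate gives the O(N^{0.25}) iteration count quoted in the thread.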