Things PETSc can and cannot do


1. Matrices created with MatCreateMPISBAIJ won't work with PCBJACOBI
   (its default sub-preconditioner is ILU, which SBAIJ does not support):

[1]PETSC ERROR: MatILUFactorSymbolic() line 3397 in src/mat/interface/matrix.c
[1]PETSC ERROR:   No support for this operation for this object type!
[1]PETSC ERROR:   Matrix type seqsbaij  symbolic ILU!
[1]PETSC ERROR: PCSetUp_ILU() line 624 in src/sles/pc/impls/ilu/ilu.c
[1]PETSC ERROR: PCSetUp() line 783 in src/sles/pc/interface/precon.c
[1]PETSC ERROR: SLESSetUp() line 382 in src/sles/interface/sles.c
[1]PETSC ERROR: PCSetUpOnBlocks_BJacobi_Singleblock() line 609 in src/sles/pc/impls/bjacobi/bjacobi.c
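
A likely workaround (my sketch, assuming the incomplete Cholesky support
of SBAIJ matrices): switch the blocks from the default ILU to ICC,

   -pc_type bjacobi -sub_pc_type icc

since SBAIJ stores only the upper triangle and supports Cholesky-type
factorizations but not ILU.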


2. PCBJacobiGetSubSLES won't work with PCJACOBI (it requires PCBJACOBI):
[1]PETSC ERROR:   Cannot get subsolvers for this preconditioner!
[1]PETSC ERROR: main() line 347 in src/sles/examples/tutorials/ex75.c
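
A minimal sketch of the supported path (PETSc 2.x SLES interface; the
sles, b, x objects are assumed to exist): the subsolvers can only be
extracted after the PC is set to PCBJACOBI and the solver is set up.

   SLES *subsles;
   PC   pc;
   int  nlocal, first, ierr;
   ierr = SLESGetPC(sles,&pc); CHKERRQ(ierr);
   ierr = PCSetType(pc,PCBJACOBI); CHKERRQ(ierr);
   ierr = SLESSetUp(sles,b,x); CHKERRQ(ierr);  /* set up before extracting */
   ierr = PCBJacobiGetSubSLES(pc,&nlocal,&first,&subsles); CHKERRQ(ierr);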
3. on signal processor

petsc-maint@mcs.anl.gov

1) run a particular method with the -help option to print all the options
   available for that method. 
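   For example (the program name is just a placeholder):

      ./ex2 -ksp_type gmres -pc_type ilu -help

   prints, among the generic options, those specific to GMRES and ILU
   (restart length, fill levels, ...).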
   

2) -pc_type lu -ksp_type preonly 
   a direct solver based on dense LU factorization

   > -ksp_type preonly
     Sets the KSP to be one which only applies the preconditioner.

   > -pc_type lu
     Sets the preconditioner to be one which upon setup performs an LU
     factorization of the preconditioning matrix and upon application
     performs triangular solves.
 
   > The combination of these two creates a dense direct solver via LU
     factorization and triangular solves.
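
   The same combination set in code (a sketch using the 2.x SLES
   interface, as in the fragments later in these notes):

      KSP ksp;
      PC  pc;
      SLESGetKSP(sles,&ksp);
      KSPSetType(ksp,KSPPREONLY);  /* apply the preconditioner exactly once */
      SLESGetPC(sles,&pc);
      PCSetType(pc,PCLU);          /* LU factorization at setup, triangular
                                      solves at application */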


3) it is impossible for PETSc to solve AX=B in a single call when B is a
   multi-column matrix: A is nxn, B is nxr, r>1

   when solving multiple linear systems of the same size with the same 
   method, several options are available:

   >To solve successive linear systems having the same preconditioner
      matrix but different right-hand sides, call SLESSolve multiple times

   >To solve successive linear systems having different preconditioner
      matrices, call SLESSetOperators and then SLESSolve
      (see the sketch below)
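
   A sketch of both cases (the matrices, vectors, and nonzero-pattern
   flag are placeholders):

      int i, its, ierr;

      /* same preconditioner matrix, r different right-hand sides */
      for (i=0; i<r; i++) {
        ierr = SLESSolve(sles,b[i],x[i],&its); CHKERRQ(ierr);
      }

      /* operators change between solves */
      ierr = SLESSetOperators(sles,Anew,Anew,SAME_NONZERO_PATTERN); CHKERRQ(ierr);
      ierr = SLESSolve(sles,b,x,&its); CHKERRQ(ierr);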


4) condition number

   -ksp_singmonitor

   or in the code call KSPSetComputeSingularValues(ksp,PETSC_TRUE);
   before the solve and KSPComputeExtremeSingularValues(ksp,&emax,&emin);
   after the solve; emax/emin approximates the condition number of the
   preconditioned operator. (KSPComputeEigenvalues(ksp,n,r,c,&neig)
   instead returns approximations to the extreme eigenvalues; see the
   sketch below.)
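
   A condensed sketch (sles, ksp, b, x as in the other fragments here):

      PetscReal emax, emin;
      int       its, ierr;
      ierr = KSPSetComputeSingularValues(ksp,PETSC_TRUE); CHKERRQ(ierr);
      ierr = SLESSolve(sles,b,x,&its); CHKERRQ(ierr);
      ierr = KSPComputeExtremeSingularValues(ksp,&emax,&emin); CHKERRQ(ierr);
      /* emax/emin estimates the condition number of the
         preconditioned operator */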


5) "none" indicates that no preconiditioning is used: identity


6) eigenvalues of A (poissc.c)

   LANCZOS ====
   -ksp_compute_eigenvalues for preconditioned A
   -ksp_compute_eigenvalues -pc_type none for A
   -ksp_plot_eigenvalues for preconditioned A
   -ksp_plot_eigenvalues -pc_type none for A


   SLES sles;  /* assumed created elsewhere */
   PC   pc;
   KSP  ksp;
   int  MAXIT = 200; /* increase this number for more accuracy; the more
                        ill-conditioned the matrix, the larger it must be */
   SLESGetKSP(sles,&ksp);
   KSPSetType(ksp,.....);
   KSPSetTolerances(ksp,1.e-12,1.e-50,1000.,MAXIT);
   KSPGMRESSetRestart(ksp,MAXIT);  /* add for gmres */
   KSPSetComputeEigenvalues(ksp,PETSC_TRUE);
   SLESGetPC(sles,&pc);
   PCSetType(pc,PCNONE); /* otherwise you'll get the eigenvalues of the
                            preconditioned operator */
   SLESSolve(sles,....);
   KSPComputeEigenvalues(.....);  /* the routine is Compute, not Get */


   LAPACK ====
   -ksp_compute_eigenvalues_explicitly for preconditioned A
   -ksp_compute_eigenvalues_explicitly -pc_type none for A
   -ksp_plot_eigenvalues_explicitly for preconditioned A
   -ksp_plot_eigenvalues_explicitly -pc_type none for A


   KSP        ksp;
   int        nrow, ncolumn, ierr;
   PetscReal  *eigenreal, *eigencomplex;
   ierr = SLESGetKSP(userSolver.psiInverter[whichPhiSection], &ksp);
   CHKERRQ(ierr);
   ierr = MatGetSize(Msolver[whichPhiSection],&nrow,&ncolumn); CHKERRQ(ierr);
   /* allocate space for the real and imaginary parts of the eigenvalues */
   ierr = PetscMalloc(nrow*sizeof(PetscReal),&eigenreal); CHKERRQ(ierr);
   ierr = PetscMalloc(nrow*sizeof(PetscReal),&eigencomplex); CHKERRQ(ierr);
   ierr = KSPComputeEigenvaluesExplicitly(ksp,nrow,eigenreal,eigencomplex);
   CHKERRQ(ierr);
   DrawSP*();   /* plot the eigenvalues with the DrawSP routines */


7) KSPSetType(ksp,KSPCG) for a symmetric matrix,
   KSPSetType(ksp,KSPGMRES) for a nonsymmetric one.

   KSPCGSetType(KSP ksp, KSPCGType type)
     Sets the variant of the conjugate gradient method to use for solving
     a linear system with a complex coefficient matrix. This option is
     irrelevant when solving a real system.

       ksp  - the iterative context
       type - the variant of CG to use, one of
                KSP_CG_HERMITIAN - complex, Hermitian matrix (default)
                KSP_CG_SYMMETRIC - complex, symmetric matrix
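
   A sketch of the choice (the symmetry flag is a placeholder;
   KSPCGSetType only matters when PETSc is built with complex scalars):

      if (matrix_is_symmetric) {
        KSPSetType(ksp,KSPCG);
      } else {
        KSPSetType(ksp,KSPGMRES);
      }
   #if defined(PETSC_USE_COMPLEX)
      KSPCGSetType(ksp,KSP_CG_SYMMETRIC);  /* complex but non-Hermitian */
   #endif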


8) PETSc 2.1.3 supports the Hypre preconditioners 
   which include Euclid (a parallel ILU(k)),
                 PILUT (a parallel drop tolerance ILU) and 
                 BoomerAMG (a parallel algebraic multigrid). 

   Install PETSc with hypre and use -pc_type hypre -help to
   find out the options (example runs below).

   All of the KSP methods are available in parallel. Except for SOR and
   Eisenstat, there are variants of all the preconditioners that work in
   parallel.
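
   For example (option names assumed from the hypre interface; verify
   with -pc_type hypre -help on your install):

      mpirun -np 4 ./ex2 -pc_type hypre -pc_hypre_type boomeramg
      mpirun -np 4 ./ex2 -pc_type hypre -pc_hypre_type euclid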

9) example PETSc options (an m3dp.x run):
poe ./m3dp.x -retry 20 -retrycount 100 
-vmecfile vmec4.dat
-configFile config.dat.1 
-ksp_gmres_restart 120
-ksp_gmres_modifiedgramschmidt 
-sub_pc_ilu_levels 3