Creates a sparse parallel matrix in MPIBDiag format.
Synopsis

PetscErrorCode PETSCMAT_DLLEXPORT MatCreateMPIBDiag(MPI_Comm comm,PetscInt m,PetscInt M,PetscInt N,PetscInt nd,PetscInt bs,const PetscInt diag[],PetscScalar *diagv[],Mat *A)

Collective on MPI_Comm

Input Parameters

comm - MPI communicator
m - number of local rows (or PETSC_DECIDE to have it calculated if M is given)
M - number of global rows (or PETSC_DETERMINE to have it calculated if m is given)
N - number of columns (local and global)
nd - number of block diagonals (global) (optional)
bs - each element of a diagonal is a bs x bs dense matrix
diag - optional array of block diagonal numbers (length nd). For a matrix element A[i,j], where i=row and j=column, the diagonal number is
    diag = i/bs - j/bs  (integer division)
Set diag=PETSC_NULL on input for PETSc to dynamically allocate memory as needed (expensive).
diagv - pointer to actual diagonals (in same order as diag array), if allocated by the user. Otherwise, set diagv=PETSC_NULL on input for PETSc to control memory allocation.

Output Parameter

A - the matrix

Options Database Keys

-mat_block_size <bs> - Sets block size
-mat_bdiag_diags <s1,s2,s3,...> - Sets diagonal numbers


If PETSC_DECIDE or PETSC_DETERMINE is used for a particular argument on one processor, then it must be used on all processors that share the object for that argument.

The parallel matrix is partitioned across the processors by rows, where each local rectangular matrix is stored in the uniprocessor block diagonal format. See the users manual for further details.

The user MUST specify either the local or global numbers of rows (possibly both).

The case bs=1 (conventional diagonal storage) is implemented as a special case.
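As a sketch of the bs=1 case, the following creates a tridiagonal MPIBDiag matrix with PETSc allocating the diagonal storage. The global dimension 100 is an arbitrary illustrative choice, and the call pattern (PetscInitialize, CHKERRQ, MatDestroy taking the Mat directly) follows the PETSc 2.x-era API this page documents; details may differ in other versions:

```c
#include "petscmat.h"

int main(int argc, char **argv)
{
    Mat            A;
    PetscInt       diag[3] = {-1, 0, 1};  /* super-, main, and subdiagonal */
    PetscErrorCode ierr;

    ierr = PetscInitialize(&argc, &argv, PETSC_NULL, PETSC_NULL);CHKERRQ(ierr);
    ierr = MatCreateMPIBDiag(PETSC_COMM_WORLD,
                             PETSC_DECIDE,  /* m: let PETSc compute local rows */
                             100, 100,      /* M, N: global dimensions */
                             3, 1,          /* nd diagonals, block size bs */
                             diag,
                             PETSC_NULL,    /* diagv: PETSc allocates storage */
                             &A);CHKERRQ(ierr);
    /* ... fill with MatSetValues(), then MatAssemblyBegin()/MatAssemblyEnd() ... */
    ierr = MatDestroy(A);CHKERRQ(ierr);
    ierr = PetscFinalize();CHKERRQ(ierr);
    return 0;
}
```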

Fortran Notes

Fortran programmers cannot set diagv; this variable is ignored.

Keywords

matrix, block, diagonal, parallel, sparse

See Also

MatCreate(), MatCreateSeqBDiag(), MatSetValues()
