Creates a sparse parallel matrix in the MATMPIROWBS format. This format is intended primarily as an interface for BlockSolve95.


#include "petscmat.h"
PetscErrorCode PETSCMAT_DLLEXPORT MatCreateMPIRowbs(MPI_Comm comm,int m,int M,int nz,const int nnz[],Mat *newA)
Collective on MPI_Comm

Input Parameters

comm - MPI communicator
m - number of local rows (or PETSC_DECIDE to have it calculated if M is given)
M - number of global rows (or PETSC_DECIDE to have it calculated if m is given)
nz - number of nonzeros per row (same for all local rows)
nnz - number of nonzeros per row (possibly different for each row)

Output Parameter

newA - the matrix


If PETSC_DECIDE or PETSC_DETERMINE is used for a particular argument on one processor, then it must be used on all processors that share the object for that argument.

The user MUST specify either the local or global matrix dimensions (possibly both).

Specify the preallocated storage with either nz or nnz (not both). Set nz=PETSC_DEFAULT and nnz=PETSC_NULL for PETSc to control dynamic memory allocation.
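The two preallocation styles above can be sketched as follows. This is an untested illustration, not verbatim library code: it assumes a PETSc build configured with BlockSolve95 support, and the local row count and nnz values are made-up examples.

```c
#include "petscmat.h"

/* Sketch: creating MATMPIROWBS matrices with the two preallocation styles.
   comm is an MPI communicator; A and B receive the new matrices. */
PetscErrorCode CreateRowbsExamples(MPI_Comm comm,Mat *A,Mat *B)
{
  PetscErrorCode ierr;
  const int      m      = 4;                /* local rows; global size computed */
  const int      nnz[4] = {3,5,5,3};        /* hypothetical per-row counts */

  /* Uniform preallocation: 5 nonzeros in every local row, nnz left null */
  ierr = MatCreateMPIRowbs(comm,m,PETSC_DECIDE,5,PETSC_NULL,A);CHKERRQ(ierr);

  /* Per-row preallocation: nnz supplied, so nz is not used */
  ierr = MatCreateMPIRowbs(comm,m,PETSC_DECIDE,0,nnz,B);CHKERRQ(ierr);
  return 0;
}
```

Per-row preallocation (nnz) is worth the extra bookkeeping when row lengths vary widely, since a uniform nz must be sized for the densest row.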


By default, the matrix is assumed to be nonsymmetric; the user can take advantage of special optimizations for symmetric matrices by calling MatSetOption() with MAT_SYMMETRIC BEFORE calling MatAssemblyBegin().

Internally, the MATMPIROWBS format inserts zero elements to the matrix if necessary, so that nonsymmetric matrices are considered to be symmetric in terms of their sparsity structure; this format is required for use of the parallel communication routines within BlockSolve95. In particular, if the matrix element A[i,j] exists, then PETSc will internally allocate a 0 value for the element A[j,i] during MatAssemblyEnd() if the user has not already set a value for the matrix element A[j,i].
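A minimal assembly sequence illustrating the structural symmetry described above might look like the sketch below. It is an assumption-laden illustration: the two-argument MatSetOption() call and the MAT_SYMMETRIC option follow the PETSc API of this era, and the index and value variables are hypothetical.

```c
#include "petscmat.h"

/* Sketch: insert one off-diagonal entry A[i][j]; during MatAssemblyEnd()
   the format allocates a zero for A[j][i] if the user never set it. */
PetscErrorCode AssembleRowbsExample(MPI_Comm comm,int M,Mat *A)
{
  PetscErrorCode ierr;
  int            i = 0, j = 1;              /* hypothetical indices */
  PetscScalar    v = 2.0;                   /* hypothetical value   */

  ierr = MatCreateMPIRowbs(comm,PETSC_DECIDE,M,5,PETSC_NULL,A);CHKERRQ(ierr);

  /* Symmetric optimizations must be requested BEFORE MatAssemblyBegin() */
  ierr = MatSetOption(*A,MAT_SYMMETRIC);CHKERRQ(ierr);

  ierr = MatSetValues(*A,1,&i,1,&j,&v,INSERT_VALUES);CHKERRQ(ierr);
  ierr = MatAssemblyBegin(*A,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatAssemblyEnd(*A,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  return 0;
}
```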

Options Database Keys

-mat_rowbs_no_inode - Do not use inodes.


Keywords

matrix, row, symmetric, sparse, parallel, BlockSolve

See Also

MatCreate(), MatSetValues()
