PETSc is a software package that provides a wide range of functionality for linear algebra, among other things. For example, it includes implementations of a variety of linear solvers, as well as many different sparse and dense matrix and vector formats. Of particular interest to deal.II is its ability to provide this functionality on both sequential and parallel (using MPI) computers.
deal.II has wrapper classes to the linear algebra parts of PETSc that provide almost the same interfaces as the built-in deal.II linear algebra classes. We use these interfaces for parallel computations based on MPI since the native deal.II linear algebra classes lack this ability. They are used, among other programs, in step-17, step-18 and step-40.
Note: The most recent version of PETSc that has been reported to be compatible with deal.II is version 3.17.1. If you use a later version than this and encounter problems, let us know. deal.II does not support versions of PETSc prior to 3.7.0.
When you compile and install PETSc, you need to set the environment variables PETSC_DIR and PETSC_ARCH to the path of your PETSc installation and a string denoting the architecture for which PETSc is compiled. PETSC_ARCH is in reality just a name you give to your installation; it is a string you can choose however you like. The point of it is that it allows you to have multiple, possibly different, PETSc installations side by side. A consequence of this is that you need to let deal.II's cmake scripts know which one of these installations you want it to use, i.e., you need to set the PETSC_ARCH variable to the same value you used when you installed PETSc. The same is true for PETSC_DIR. You can do this via environment variables. cmake will then also recognize that PETSc shall be used, and enable the wrapper classes, without you having to explicitly say that you want to use PETSc.
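For illustration, the environment could be set up as follows before running cmake (the installation path here is a placeholder; use the directory and architecture name from your own PETSc build):

```shell
# Placeholder path: adjust to where you unpacked and built PETSc.
export PETSC_DIR=/path/to/petsc-x-y-z
# The name you chose when configuring PETSc (see below).
export PETSC_ARCH=x86_64
```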
Alternatively, the -DPETSC_DIR=DIR and -DPETSC_ARCH=ARCH options for cmake can be used to override the values of PETSC_DIR and PETSC_ARCH, or to specify them if these environment variables are not set at all. If you do have a PETSc installation and have set the PETSC_DIR and PETSC_ARCH environment variables but do not wish deal.II to be configured for PETSc use, you should specify -DDEAL_II_WITH_PETSC=OFF as a flag during configuration.
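As a sketch, a cmake invocation that points deal.II at a specific PETSc installation via these command line options (the source and installation paths are placeholders) might look like:

```shell
# -DPETSC_DIR/-DPETSC_ARCH override whatever the environment variables say.
cmake -DPETSC_DIR=/path/to/petsc-x-y-z -DPETSC_ARCH=x86_64 /path/to/dealii/source
```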
Note: deal.II can be installed with both PETSc and Trilinos and they do not usually get in each other's way. There are, however, occasions where this is not true, and this fundamentally comes from the fact that both of these packages are built from sub-packages that are developed by independent groups. Unfortunately, some of these sub-packages can be configured to be part of both PETSc and Trilinos, and if you try to use deal.II with versions of PETSc and Trilinos that both contain a particular sub-package, little good will come of it in general. In particular, we have experienced this with the ML package, which can serve as an algebraic multigrid method for both PETSc and Trilinos. If both of these packages are configured to use ML, then difficult-to-understand error messages at compile or link time are almost inevitable, and there is little the deal.II build system can do to prevent this. Thus, don't try to do that!
Installing PETSc correctly can be a bit of a challenge. To start, take a look at the PETSc installation instructions. We have found that the following steps generally appear to work where we simply unpack and build PETSc in its final location (i.e., we do not first build and then install it into a separate directory):
    tar xvzf petsc-x-y-z.tar.gz
    cd petsc-x-y-z
    export PETSC_DIR=`pwd`
    export PETSC_ARCH=x86_64   # or any other identifying text for your machine
    ./config/configure.py --with-shared=1 --with-x=0 --with-mpi=1 --download-hypre=1
    make
This automatically builds PETSc with both MPI and the algebraic
multigrid preconditioner package Hypre (which we use in step-40). If you
would like to use PETSc with MPI then we recommend that you install MPI
through your package manager instead of letting PETSc install it: put
another way, installing PETSc with a flag such as --download-mpich
often causes problems (such as linking errors or poor performance)
that may be avoided by using whatever your system provides instead.
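As a sketch, a configure call that relies on a system-provided MPI could pass the MPI compiler wrappers explicitly (this assumes mpicc/mpicxx are in your PATH; the exact set of options depends on your PETSc version):

```shell
# Use the system MPI compiler wrappers instead of letting PETSc download MPI.
./config/configure.py --with-shared=1 --with-x=0 \
  --with-cc=mpicc --with-cxx=mpicxx --download-hypre=1
```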
Now let PETSc check its own sanity:

    make test

will self-check the serial (and MPI) implementation of PETSc.

Finally, you may want to put the export commands above into your ~/.bashrc or ~/.cshrc files, with the first one replaced by something of the kind

    export PETSC_DIR=/path/to/petsc-x-y-z
By default, PETSc is compiled in "debug mode". You can switch this to "optimized mode" by adding the command line parameter
--with-debugging=0 to the call of ./config/configure.py above. In some cases, this has made linear solvers run up to 30% faster. As with choosing between deal.II's debug and optimized modes, you should only use optimized PETSc builds once you have tested that your program runs well in debug mode.
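Putting this together, an optimized-mode build uses the same configure call as above with the extra flag appended:

```shell
# Same configure call as above, but building PETSc in optimized mode.
./config/configure.py --with-shared=1 --with-x=0 --with-mpi=1 \
  --download-hypre=1 --with-debugging=0
make
```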