Test solvers on 3D Poisson equation
This test is located in Tests/solver_test002.
## Brief
Test solvers and tune parameters on matrices generated on the fly for the 3D Poisson equation.
## Description
This test runs solvers in both serial and parallel modes with NP
processes for a model problem of the 3D Poisson equation.
For a parallel run, the coefficient matrix and vectors are generated
directly on the target processor, so no external reordering tool needs
to be installed.
The artificial right-hand side rhs=(1,1,...,1) is used.
The specific solver is selected by the user, who may also provide an options file to alter the default solver options.

The main purpose of this test is to assess the robustness of internal and external solvers during development. Another purpose is to check the behaviour of the linear solver on large and extremely large test problems without taking disk memory requirements into account.
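The matrix in question is the standard 7-point finite-difference discretization of the 3D Laplacian. A minimal sketch of its assembly, written in Python with SciPy for brevity (the actual test generates rows in C++ directly on the owning processor), assuming homogeneous Dirichlet boundaries and using the right-hand side rhs=(1,1,...,1) described above:

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

def poisson3d(n):
    """7-point 3D Laplacian on an n x n x n grid (Dirichlet boundaries)."""
    # 1D second-difference matrix: tridiagonal (-1, 2, -1)
    d = sp.diags([-1, 2, -1], [-1, 0, 1], shape=(n, n))
    I = sp.identity(n)
    # Kronecker sum of three 1D operators gives the 3D 7-point stencil
    A = (sp.kron(sp.kron(d, I), I)
         + sp.kron(sp.kron(I, d), I)
         + sp.kron(sp.kron(I, I), d))
    return A.tocsr()

n = 8                       # small N for illustration; the test uses N up to 100+
A = poisson3d(n)
b = np.ones(n**3)           # artificial right-hand side rhs = (1, 1, ..., 1)
x = spla.spsolve(A, b)      # direct solve here; the test applies Krylov solvers
print(np.linalg.norm(A @ x - b))   # residual is near machine precision
```

For an NxNxN grid the matrix has N^3 rows with at most 7 nonzeros each, which is why memory, rather than assembly, becomes the limiting factor for extremely large N.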
## Arguments
Usage: ./solver_test002 method_number<0:INNER_ILU2,1:INNER_MLILUC,2:PETSc,3:Trilinos_Aztec,4:Trilinos_Belos,5:Trilinos_Ifpack,6:Trilinos_ML,7:ANI> N<for NxNxN problem> [solver_options.txt]
- First parameter is the Solver type:
  - 0 – INNER_ILU2, inner Solver based on the BiCGStab(L) solver with second-order ILU factorization as preconditioner;
  - 1 – INNER_MLILUC, inner Solver based on the BiCGStab(L) solver with second-order Crout-ILU with inverse-based condition estimation and unsymmetric reordering for diagonal dominance as preconditioner;
  - 2 – PETSc, external Solver PETSc;
  - 3 – Trilinos_Aztec, external Solver AztecOO from the Trilinos package;
  - 4 – Trilinos_Belos, external Solver Belos from the Trilinos package, currently without preconditioner;
  - 5 – Trilinos_Ifpack, external Solver AztecOO with Ifpack preconditioner;
  - 6 – Trilinos_ML, external Solver AztecOO with ML preconditioner;
  - 7 – ANI, external Solver from ANI3D based on ILU2 (sequential Fortran version).
- Second parameter is the dimension N of the 3D Poisson problem for an NxNxN mesh.
- Third optional parameter is a file with solver parameters; see examples/MatSolve/database.txt as an example.
## Running test
You can run the test directly from the command line. For example, to run the 100x100x100 test case and solve it with the internal ILU2-based solver using the default parameters on 4 processes:
$ cd tests/solver_test002
$ mpirun -np 4 ./solver_test002 0 100
## CMake tests
This test generates solver_test002_serial_*
tests for all solvers activated in CMake.
These tests check that each solver correctly reads the input and correctly outputs the solution for a 20x20x20 mesh.
If USE_MPI
is activated and the CMake variable ${MPIEXEC}
is set up correctly, the test will also generate
solver_test002_parallel_*
tests. These tests check that the solvers read the input in parallel and correctly
output the solution for a 20x20x20 mesh using 4 processes.
## Source
Source code is adapted from examples/MatSolve.