With this record you can set the Krylov subspace method for the PETSc solver, that is, the way in which the iterative solver searches for the solution of the equations. Possibilities are: -richardson, -chebychev, -cg, -bicg, -gmres, -bcgs, -tfqmr, -tcqmr, -cr, -lsqr, -preonly. See control_options_solver_petsc_pctype on how to set the preconditioning type.
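As a sketch of how such a record might look in the input file (a hedged example: it assumes this record is named control_options_solver_petsc_ksptype, that the index value 10 is arbitrary, and that the record follows the usual name-index-value layout of Tochnog records; check the record description for the exact syntax):

```
control_options_solver_petsc_ksptype 10 -gmres
```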
The PETSc solver needs to be installed separately on your computer. See the links on the Tochnog homepage for where to get PETSc. After installing PETSc, compile Tochnog and link it to the PETSc library; follow the instructions in tochnog/src/makefile.
Running on single-processor computers.
On a single-processor computer, run Tochnog directly, for example tochnog beam.dat.
Running on distributed-processor systems (clusters of workstations).
With the PETSc solver you can also run on distributed-memory systems. MPI is used for message passing. We recommend the MPICH implementation of MPI; see the links on the homepage for where to get it.
On a distributed system, run Tochnog like mpirun -v -np 20 tochnog if all machines share the same filesystem. Otherwise you need a procgroup file, which tells MPI where on each machine the executable and the data file reside; examples of usage are on page 9 of the mpich guide.ps file. If mpich is used for MPI, the file $MPIHOME/util/machines/machines.$ARCH contains the hostnames; if a procgroup file is given, it takes precedence over machines.$ARCH.
A procgroup file is like:
hostname1 0 /home/xxx/tochnog
hostname2 1 /users/yyy/tochnog
Also notice that when Tochnog is started this way, no arguments are specified, so the default input file tn.dat is always used.
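The two launch styles above can be sketched as follows (a hedged sketch: the exact launcher flags depend on your MPICH installation, and -p4pg for naming a procgroup file is an assumption that holds for MPICH's ch_p4 device; consult the mpich guide for your version):

```
# Shared filesystem: start 20 Tochnog processes directly (input file tn.dat).
mpirun -v -np 20 tochnog

# No shared filesystem: pass a procgroup file; -p4pg is the MPICH ch_p4
# option for this (an assumption; check your mpich documentation).
mpirun -p4pg procgroup tochnog
```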