Merge pull request #5 from ArneSpang/master
Updated Petsc version in documentation
boriskaus authored Sep 18, 2023
2 parents f98bcf5 + 187bbe8 commit 795026c
Showing 2 changed files with 20 additions and 20 deletions.
8 changes: 4 additions & 4 deletions docs/src/man/Debugging.md
@@ -10,7 +10,7 @@ There is quite some discussion within the development team about what the best enviro
Here I will assume that PETSc, MPICH and LaMEM are installed in
```
$ /opt/mpich3/include
- $ /opt/petsc/petsc-3.16.4-deb/include/
+ $ /opt/petsc/petsc-3.18.6-deb/include/
$ /local/home/boris/LaMEM/
```
which you obviously have to update for your system.
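As a convenience, those three locations can be captured in shell variables up front, so the later config files all use the same paths. This is a sketch using the example paths above; adjust them for your system:

```shell
# Example locations from this guide -- adjust for your own system.
MPICH_INC=/opt/mpich3/include
PETSC_DEB_INC=/opt/petsc/petsc-3.18.6-deb/include/
LAMEM_DIR=/local/home/boris/LaMEM/

# Print them so a quick glance confirms what the later config files will use.
echo "MPICH includes:  $MPICH_INC"
echo "PETSc includes:  $PETSC_DEB_INC"
echo "LaMEM directory: $LAMEM_DIR"
```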
@@ -27,7 +27,7 @@ The first thing to do is to add a file called `c_cpp_properties.json` inside the
"includePath": [
"${workspaceFolder}/**",
"/opt/homebrew/include",
- "/opt/petsc/petsc-3.16.4-deb/include/"
+ "/opt/petsc/petsc-3.18.6-deb/include/"
]
}
],
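Pieced together, a complete minimal `c_cpp_properties.json` built around the fragment above might look like this. This is a sketch: the configuration name `"Mac"` and the schema `"version": 4` are assumptions for a typical macOS setup, not part of the original snippet:

```json
{
  "configurations": [
    {
      "name": "Mac",
      "includePath": [
        "${workspaceFolder}/**",
        "/opt/homebrew/include",
        "/opt/petsc/petsc-3.18.6-deb/include/"
      ]
    }
  ],
  "version": 4
}
```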
@@ -54,7 +54,7 @@ Once that is the case, you need to create a file called ``launch.json`` in the s
"args": ["-ParamFile","FallingBlock_IterativeSolver.dat","-nstep_max","2"],
"stopOnEntry": false,
"cwd": "/Users/kausb/WORK/LaMEM/LaMEM/input_models/BuildInSetups/",
- "env": {"PETSC_DEB": "/opt/petsc/petsc-3.16.4-deb",
+ "env": {"PETSC_DEB": "/opt/petsc/petsc-3.18.6-deb",
"PATH": "/opt/homebrew/bin:${env:PATH}"},
"preLaunchTask": "C/C++: build LaMEM deb file",
},
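For orientation, a complete minimal `launch.json` entry around the fragment above could look as follows. This is a sketch assuming the CodeLLDB extension (hence `"type": "lldb"`, which uses the `stopOnEntry` key seen above); the `program` path is a placeholder for wherever your LaMEM debug binary lives:

```json
{
  "version": "0.2.0",
  "configurations": [
    {
      "name": "Debug LaMEM",
      "type": "lldb",
      "request": "launch",
      "program": "${workspaceFolder}/bin/deb/LaMEM",
      "args": ["-ParamFile", "FallingBlock_IterativeSolver.dat", "-nstep_max", "2"],
      "stopOnEntry": false,
      "cwd": "/Users/kausb/WORK/LaMEM/LaMEM/input_models/BuildInSetups/",
      "env": {
        "PETSC_DEB": "/opt/petsc/petsc-3.18.6-deb",
        "PATH": "/opt/homebrew/bin:${env:PATH}"
      },
      "preLaunchTask": "C/C++: build LaMEM deb file"
    }
  ]
}
```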
@@ -76,7 +76,7 @@ For this, you need to create a file `tasks.json` in `.vscode`:
],
"options": {
"cwd": "/Users/kausb/WORK/LaMEM/LaMEM/src",
- "env": {"PETSC_DEB": "/opt/petsc/petsc-3.16.4-deb/",
+ "env": {"PETSC_DEB": "/opt/petsc/petsc-3.18.6-deb/",
"PATH": "/opt/homebrew/bin:${env:PATH}"}
},
"problemMatcher": [
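Likewise, a minimal `tasks.json` consistent with the fragment above might be the following sketch. The `command` line is an assumption (the actual make invocation is not shown in this diff), and the task `label` matches the `preLaunchTask` used in `launch.json`:

```json
{
  "version": "2.0.0",
  "tasks": [
    {
      "label": "C/C++: build LaMEM deb file",
      "type": "shell",
      "command": "make mode=deb all",
      "options": {
        "cwd": "/Users/kausb/WORK/LaMEM/LaMEM/src",
        "env": {
          "PETSC_DEB": "/opt/petsc/petsc-3.18.6-deb/",
          "PATH": "/opt/homebrew/bin:${env:PATH}"
        }
      },
      "problemMatcher": ["$gcc"]
    }
  ]
}
```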
32 changes: 16 additions & 16 deletions docs/src/man/Installation.md
@@ -53,7 +53,7 @@ $ spack info petsc
```
Install PETSc with the required packages and leave out those we don't need. An optimized build of PETSc is installed with
```
- $ spack install petsc@3.16.4 +mumps +suite-sparse -hypre -hdf5 -shared -debug
+ $ spack install petsc@3.18.6 +mumps +suite-sparse -hypre -hdf5 -shared -debug
```
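Once Spack has finished, one way to locate the resulting install prefix and store it in `PETSC_OPT` is the sketch below. `spack location -i <package>` prints the install directory of an installed package; the fallback path is just the example prefix used elsewhere in this guide:

```shell
# Resolve the PETSc install prefix via Spack when available;
# otherwise fall back to the example path from this guide.
if command -v spack >/dev/null 2>&1; then
  PETSC_OPT=$(spack location -i petsc)
else
  PETSC_OPT=/opt/petsc/petsc-3.18.6-opt
fi
export PETSC_OPT
echo "PETSC_OPT=$PETSC_OPT"
```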
If that works out, you'll have to update your environmental variables and create the ``PETSC_OPT`` variable
```
@@ -133,16 +133,16 @@ $ sudo port install gfortran gcc git cmake
The most important package for LaMEM is PETSc. If you just want to give LaMEM a try, the most basic installation is sufficient. Once you do production runs, it is worthwhile to experiment a bit with more optimized solver options. Installing PETSc with those does not always work, but PETSc has a very responsive user list which is searchable, and where you can post your questions if needed.
As PETSc regularly changes its syntax, LaMEM is only compatible with a particular version of PETSc. This is typically updated once per year.

- The current version of LaMEM is compatible with **PETSc 3.16.4**
+ The current version of LaMEM is compatible with **PETSc 3.18.6**

You can download the PETSc version you need [here](http://www.mcs.anl.gov/petsc/download/index.html). Do that and unzip it with
```
- $ tar -xvf petsc-3.16.4.tar.gz
+ $ tar -xvf petsc-3.18.6.tar.gz
```

Change to the PETSc directory from the command window, for example with:
```
- $ cd ~/Software/PETSc/petsc-3.16.4
+ $ cd ~/Software/PETSc/petsc-3.18.6
```
and specify the PETSC environmental variable:
```
@@ -152,14 +152,14 @@ $ export PETSC_DIR=$PWD
The simplest installation of PETSc can be configured as follows (assuming you are in the PETSc directory). This will automatically download and install the MPI library as well, together with a few other packages we will use.

```
- $ ./config/configure.py --prefix=/opt/petsc/petsc-3.16.4-opt --download-mpich=1 --download-superlu_dist=1 --download-mumps=1 --download-scalapack=1 --download-fblaslapack=1 --with-debugging=0 --FOPTFLAGS=-O3 --CXXOPTFLAGS=-O3 --COPTFLAGS=-O3 --with-shared-libraries=0 --download-cmake
+ $ ./config/configure.py --prefix=/opt/petsc/petsc-3.18.6-opt --download-mpich=1 --download-superlu_dist=1 --download-mumps=1 --download-scalapack=1 --download-fblaslapack=1 --with-debugging=0 --FOPTFLAGS=-O3 --CXXOPTFLAGS=-O3 --COPTFLAGS=-O3 --with-shared-libraries=0 --download-cmake
```
- This will install an optimized (fast) version of PETSc on your system in the directory `opt/petsc/petsc-3.16.4-opt`. You can change this directory, obviously, but in that case please remember where you put it as we need it later.
+ This will install an optimized (fast) version of PETSc on your system in the directory `/opt/petsc/petsc-3.18.6-opt`. You can change this directory, but in that case please remember where you put it, as we will need it later.

If you want to have more control over PETSc and use the MPI version that you installed earlier on your system with the package manager (see above), you can install it as:

```
- $ ./config/configure.py --prefix=/opt/petsc/petsc-3.16.4-opt --download-superlu_dist=1 --doCleanup=1 --download-mumps=1 --download-suitesparse=1 --download-scalapack=1 --download-fblaslapack=1 --FOPTFLAGS=-O3 --CXXOPTFLAGS=-O3 --COPTFLAGS=-O3 --with-shared-libraries=0 --download-cmake --with-debugging=0 --with-mpi-include=/opt/local/include/mpich-gcc7/ --with-cc=/opt/local/bin/mpicc --with-cxx=/opt/local/bin/mpicxx --with-fc=/opt/local/bin/mpif90 --with-mpi-lib=/opt/local/lib/mpich-gcc7/libmpi.a
+ $ ./config/configure.py --prefix=/opt/petsc/petsc-3.18.6-opt --download-superlu_dist=1 --doCleanup=1 --download-mumps=1 --download-suitesparse=1 --download-scalapack=1 --download-fblaslapack=1 --FOPTFLAGS=-O3 --CXXOPTFLAGS=-O3 --COPTFLAGS=-O3 --with-shared-libraries=0 --download-cmake --with-debugging=0 --with-mpi-include=/opt/local/include/mpich-gcc7/ --with-cc=/opt/local/bin/mpicc --with-cxx=/opt/local/bin/mpicxx --with-fc=/opt/local/bin/mpif90 --with-mpi-lib=/opt/local/lib/mpich-gcc7/libmpi.a
```
Note that the above lines assume that MPI is installed under the directory `/opt/local/bin/`.
You can check whether this is the case on your system by typing
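A quick check along those lines is the sketch below; it only reports where the MPI compiler wrappers resolve on your `PATH`, which may differ from the `/opt/local/bin` assumed above:

```shell
# Report where the MPI compiler wrappers resolve; the configure line above
# assumed /opt/local/bin, but any consistent MPI install will do.
checked=0
for tool in mpicc mpicxx mpif90; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "$tool -> $(command -v "$tool")"
  else
    echo "$tool not found on PATH"
  fi
  checked=$((checked + 1))
done
```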
@@ -174,34 +174,34 @@ After the configuration step has finished successfully (which will take some time

Next, make PETSc with:
```
- $ make PETSC_DIR=/Users/kausb/Software/PETSC/petsc-3.16.4 PETSC_ARCH=arch-darwin-c-opt all
+ $ make PETSC_DIR=/Users/kausb/Software/PETSC/petsc-3.18.6 PETSC_ARCH=arch-darwin-c-opt all
```
After that, you will be asked to install PETSc with:

```
- sudo make PETSC_DIR=/Users/kausb/Software/PETSC/petsc-3.16.4 PETSC_ARCH=arch-darwin-c-opt install
+ sudo make PETSC_DIR=/Users/kausb/Software/PETSC/petsc-3.18.6 PETSC_ARCH=arch-darwin-c-opt install
```

and test whether the installation works with
```
- $ make PETSC_DIR=/opt/petsc/petsc-3.16.4-opt PETSC_ARCH="" check
+ $ make PETSC_DIR=/opt/petsc/petsc-3.18.6-opt PETSC_ARCH="" check
```
This will run a few test cases and if all is well, will tell you so.

If you only run simulations with LaMEM, the optimized version of PETSc described above will be sufficient. However, if you also develop routines and need to debug them, it is a good idea to also install the debug version:

```
- $ ./config/configure.py --prefix=/opt/petsc/petsc-3.16.4-deb --download-superlu_dist=1 --doCleanup=1 --download-mumps=1 --download-suitesparse=1 --download-scalapack=1 --download-fblaslapack=1 --FOPTFLAGS="-O0 -g" --CXXOPTFLAGS="-O0 -g" --COPTFLAGS="-O0 -g" --with-shared-libraries=0 --download-cmake --with-debugging=1 --with-mpi-include=/opt/local/include/mpich-gcc7/ --with-cc=/opt/local/bin/mpicc --with-cxx=/opt/local/bin/mpicxx --with-fc=/opt/local/bin/mpif90 --with-mpi-lib=/opt/local/lib/mpich-gcc7/libmpi.a
+ $ ./config/configure.py --prefix=/opt/petsc/petsc-3.18.6-deb --download-superlu_dist=1 --doCleanup=1 --download-mumps=1 --download-suitesparse=1 --download-scalapack=1 --download-fblaslapack=1 --FOPTFLAGS="-O0 -g" --CXXOPTFLAGS="-O0 -g" --COPTFLAGS="-O0 -g" --with-shared-libraries=0 --download-cmake --with-debugging=1 --with-mpi-include=/opt/local/include/mpich-gcc7/ --with-cc=/opt/local/bin/mpicc --with-cxx=/opt/local/bin/mpicxx --with-fc=/opt/local/bin/mpif90 --with-mpi-lib=/opt/local/lib/mpich-gcc7/libmpi.a
```
Compared to before, we have three changes, namely:

- 1) That the prefix (or the directory where PETSc will be put) is changed to `--prefix=/opt/petsc/petsc-3.16.4-deb`
+ 1) The prefix (the directory where PETSc will be installed) is changed to `--prefix=/opt/petsc/petsc-3.18.6-deb`
2) We tell it to compile a debug version of PETSc with `--with-debugging=1`
3) We change the optimization flags to `--FOPTFLAGS="-O0 -g" --CXXOPTFLAGS="-O0 -g" --COPTFLAGS="-O0 -g"`

With this you can repeat the procedure above. For completeness, the simple configure command from above in debug mode would be:
```
- $ ./config/configure.py --prefix=/opt/petsc/petsc-3.16.4-deb --download-mpich=1 --download-superlu_dist=1 --download-mumps=1 --download-scalapack=1 --download-fblaslapack=1 --download-cmake --with-debugging=1 --FOPTFLAGS="-O0 -g" --CXXOPTFLAGS="-O0 -g" --COPTFLAGS="-O0 -g" --with-shared-libraries=0
+ $ ./config/configure.py --prefix=/opt/petsc/petsc-3.18.6-deb --download-mpich=1 --download-superlu_dist=1 --download-mumps=1 --download-scalapack=1 --download-fblaslapack=1 --download-cmake --with-debugging=1 --FOPTFLAGS="-O0 -g" --CXXOPTFLAGS="-O0 -g" --COPTFLAGS="-O0 -g" --with-shared-libraries=0
```

## 1.1.4 Installing PETSc on a cluster
@@ -214,7 +214,7 @@ If you are lucky, a previous version of PETSc exists already on the cluster and
2) Run it, while adding the command-line option ```-log_view```
3) At the end of the simulation, it will show you the command-line options that were used to compile PETSc. These can be long; for us it was:
```
- Configure options: --prefix=/cluster/easybuild/broadwell/software/numlib/PETSc/3.16.4-intel-2018.02-downloaded-deps --with-mkl_pardiso=1 --with-mkl_pardiso-dir=/cluster/easybuild/broadwell/software/numlib/imkl/2018.2.199-iimpi-2018.02-GCC-6.3.0/mkl --with-hdf5=1 --with-hdf5-dir=/cluster/easybuild/broadwell/software/data/HDF5/1.8.20-intel-2018.02 --with-large-io=1 --with-c++-support=1 --with-debugging=0 --download-hypre=1 --download-triangle=1 --download-ptscotch=1 --download-pastix=1 --download-ml=1 --download-superlu=1 --download-metis=1 --download-superlu_dist=1 --download-prometheus=1 --download-mumps=1 --download-parmetis=1 --download-suitesparse=1 --download-hypre-shared=0 --download-metis-shared=0 --download-ml-shared=0 --download-mumps-shared=0 --download-parmetis-shared=0 --download-pastix-shared=0 --download-prometheus-shared=0 --download-ptscotch-shared=0 --download-suitesparse-shared=0 --download-superlu-shared=0 --download-superlu_dist-shared=0 --with-cc=mpiicc --with-cxx=mpiicpc --with-c++-support --with-fc=mpiifort --CFLAGS="-O3 -xCORE-AVX2 -ftz -fp-speculation=safe -fp-model source -fPIC" --CXXFLAGS="-O3 -xCORE-AVX2 -ftz -fp-speculation=safe -fp-model source -fPIC" --FFLAGS="-O2 -xCORE-AVX2 -ftz -fp-speculation=safe -fp-model source -fPIC" --with-gnu-compilers=0 --with-mpi=1 --with-build-step-np=4 --with-shared-libraries=1 --with-debugging=0 --with-pic=1 --with-x=0 --with-windows-graphics=0 --with-fftw=1 --with-fftw-include=/cluster/easybuild/broadwell/software/numlib/imkl/2018.2.199-iimpi-2018.02-GCC-6.3.0/mkl/include/fftw --with-fftw-lib="[/cluster/easybuild/broadwell/software/numlib/imkl/2018.2.199-iimpi-2018.02-GCC-6.3.0/mkl/lib/intel64/libfftw3xc_intel_pic.a,libfftw3x_cdft_lp64_pic.a,libmkl_cdft_core.a,libmkl_blacs_intelmpi_lp64.a,libmkl_intel_lp64.a,libmkl_sequential.a,libmkl_core.a]" --with-scalapack=1 --with-scalapack-include=/cluster/easybuild/broadwell/software/numlib/imkl/2018.2.199-iimpi-2018.02-GCC-6.3.0/mkl/include
- --with-scalapack-lib="[/cluster/easybuild/broadwell/software/numlib/imkl/2018.2.199-iimpi-2018.02-GCC-6.3.0/mkl/lib/intel64/libmkl_scalapack_lp64.a,libmkl_blacs_intelmpi_lp64.a,libmkl_intel_lp64.a,libmkl_sequential.a,libmkl_core.a]" --with-blaslapack-lib="[/cluster/easybuild/broadwell/software/numlib/imkl/2018.2.199-iimpi-2018.02-GCC-6.3.0/mkl/lib/intel64/libmkl_intel_lp64.a,libmkl_sequential.a,libmkl_core.a]" --with-hdf5=1 --with-hdf5-dir=/cluster/easybuild/broadwell/software/data/HDF5/1.8.20-intel-2018.02
+ Configure options: --prefix=/cluster/easybuild/broadwell/software/numlib/PETSc/3.18.6-intel-2018.02-downloaded-deps --with-mkl_pardiso=1 --with-mkl_pardiso-dir=/cluster/easybuild/broadwell/software/numlib/imkl/2018.2.199-iimpi-2018.02-GCC-6.3.0/mkl --with-hdf5=1 --with-hdf5-dir=/cluster/easybuild/broadwell/software/data/HDF5/1.8.20-intel-2018.02 --with-large-io=1 --with-c++-support=1 --with-debugging=0 --download-hypre=1 --download-triangle=1 --download-ptscotch=1 --download-pastix=1 --download-ml=1 --download-superlu=1 --download-metis=1 --download-superlu_dist=1 --download-prometheus=1 --download-mumps=1 --download-parmetis=1 --download-suitesparse=1 --download-hypre-shared=0 --download-metis-shared=0 --download-ml-shared=0 --download-mumps-shared=0 --download-parmetis-shared=0 --download-pastix-shared=0 --download-prometheus-shared=0 --download-ptscotch-shared=0 --download-suitesparse-shared=0 --download-superlu-shared=0 --download-superlu_dist-shared=0 --with-cc=mpiicc --with-cxx=mpiicpc --with-c++-support --with-fc=mpiifort --CFLAGS="-O3 -xCORE-AVX2 -ftz -fp-speculation=safe -fp-model source -fPIC" --CXXFLAGS="-O3 -xCORE-AVX2 -ftz -fp-speculation=safe -fp-model source -fPIC" --FFLAGS="-O2 -xCORE-AVX2 -ftz -fp-speculation=safe -fp-model source -fPIC" --with-gnu-compilers=0 --with-mpi=1 --with-build-step-np=4 --with-shared-libraries=1 --with-debugging=0 --with-pic=1 --with-x=0 --with-windows-graphics=0 --with-fftw=1 --with-fftw-include=/cluster/easybuild/broadwell/software/numlib/imkl/2018.2.199-iimpi-2018.02-GCC-6.3.0/mkl/include/fftw --with-fftw-lib="[/cluster/easybuild/broadwell/software/numlib/imkl/2018.2.199-iimpi-2018.02-GCC-6.3.0/mkl/lib/intel64/libfftw3xc_intel_pic.a,libfftw3x_cdft_lp64_pic.a,libmkl_cdft_core.a,libmkl_blacs_intelmpi_lp64.a,libmkl_intel_lp64.a,libmkl_sequential.a,libmkl_core.a]" --with-scalapack=1 --with-scalapack-include=/cluster/easybuild/broadwell/software/numlib/imkl/2018.2.199-iimpi-2018.02-GCC-6.3.0/mkl/include
+ --with-scalapack-lib="[/cluster/easybuild/broadwell/software/numlib/imkl/2018.2.199-iimpi-2018.02-GCC-6.3.0/mkl/lib/intel64/libmkl_scalapack_lp64.a,libmkl_blacs_intelmpi_lp64.a,libmkl_intel_lp64.a,libmkl_sequential.a,libmkl_core.a]" --with-blaslapack-lib="[/cluster/easybuild/broadwell/software/numlib/imkl/2018.2.199-iimpi-2018.02-GCC-6.3.0/mkl/lib/intel64/libmkl_intel_lp64.a,libmkl_sequential.a,libmkl_core.a]" --with-hdf5=1 --with-hdf5-dir=/cluster/easybuild/broadwell/software/data/HDF5/1.8.20-intel-2018.02
```
4) Use the same options for your latest installation, while adding config options you may need.

@@ -228,8 +228,8 @@ git clone https://bitbucket.org/bkaus/lamem.git ./LaMEM
```
Next you need to specify the environmental variables ```PETSC_OPT``` and ```PETSC_DEB```:
```
- export PETSC_OPT=/opt/petsc/petsc-3.16.4-opt
- export PETSC_DEB=/opt/petsc/petsc-3.16.4-deb
+ export PETSC_OPT=/opt/petsc/petsc-3.18.6-opt
+ export PETSC_DEB=/opt/petsc/petsc-3.18.6-deb
```
Note that this may need to be adapted, depending on the machine you use.
You may also want to specify this in your ```.bashrc``` files.
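For example, the two export lines could be prepared as a snippet and then appended to `~/.bashrc`. This sketch writes them to a local file first so you can inspect them before touching your real shell profile; the prefixes are the example paths from above:

```shell
# Write the two export lines to a snippet file first, so you can review
# them before appending the snippet to your real ~/.bashrc
# (e.g. with: cat lamem_petsc_env.sh >> ~/.bashrc).
cat > lamem_petsc_env.sh <<'EOF'
export PETSC_OPT=/opt/petsc/petsc-3.18.6-opt
export PETSC_DEB=/opt/petsc/petsc-3.18.6-deb
EOF
cat lamem_petsc_env.sh
```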
