Performance of ElmerSolver on different platforms

Post by raback »

Hi All,

After the introduction of labels in ctest it is rather easy to run a simple performance test on Elmer. The tests with label 'benchmark' may be run with

Code: Select all

ctest -L benchmark 
This should give roughly the following output:

Code: Select all

elmeruser@elmeruser-VM64bit ~/Source/builddir $ ctest  -L benchmark
Test project /home/elmeruser/Source/builddir
      Start  62: LimitTemperature
 1/14 Test  #62: LimitTemperature .................   Passed   19.91 sec
      Start  63: mgdyn_steady
 2/14 Test  #63: mgdyn_steady .....................   Passed    8.31 sec
      Start  86: VectorHelmholtzWaveguide
 3/14 Test  #86: VectorHelmholtzWaveguide .........   Passed   11.06 sec
      Start 106: RichardsDyke2
 4/14 Test #106: RichardsDyke2 ....................   Passed    9.38 sec
      Start 138: diffuser_v2f
 5/14 Test #138: diffuser_v2f .....................   Passed   13.01 sec
      Start 186: FlowResNoslip
 6/14 Test #186: FlowResNoslip ....................   Passed    5.80 sec
      Start 224: ConstantBCDisplacement
 7/14 Test #224: ConstantBCDisplacement ...........   Passed   10.45 sec
      Start 232: structmap3
 8/14 Test #232: structmap3 .......................   Passed   16.44 sec
      Start 268: Step_sst-kw-wf
 9/14 Test #268: Step_sst-kw-wf ...................   Passed   20.11 sec
      Start 273: mgdyn_harmonic
10/14 Test #273: mgdyn_harmonic ...................   Passed   14.05 sec
      Start 275: diffuser_sa
11/14 Test #275: diffuser_sa ......................   Passed    7.41 sec
      Start 288: mgdyn_steady_periodic
12/14 Test #288: mgdyn_steady_periodic ............   Passed   19.65 sec
      Start 298: levelset3b
13/14 Test #298: levelset3b .......................   Passed    8.45 sec
      Start 306: RotatingBCPoisson3DGeneric
14/14 Test #306: RotatingBCPoisson3DGeneric .......   Passed   13.40 sec

100% tests passed, 0 tests failed out of 14

Label Time Summary:
benchmark    = 177.41 sec

Total Test time (real) = 177.77 sec
The tests can be used mainly for two purposes:
1) To rate the performance of different platforms for typical small Elmer cases
2) To ensure that the code does not become slower after new features are introduced (see the sketch below for one way to track this)
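
For purpose 2, one possible approach (just a sketch; the log file naming is my own suggestion) is to save the output of each run into a file named after the current revision and compare the timings between revisions:

Code: Select all

ctest -L benchmark -O benchmark_$(git rev-parse --short HEAD).log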

Because we cannot guarantee exactly the same performance between versions, it may not be very meaningful to compare different platforms running different Elmer versions. Still, we hope this will be of use.

I encourage everybody to run the tests and report their setup and the measured performance.
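
To see which cases carry the 'benchmark' label without running them, the tests can first be listed with

Code: Select all

ctest -L benchmark -N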

-Peter

Re: Performance of ElmerSolver on different platforms

Post by raback »

Performance on my laptop:

Time: 167.18 s (best out of three trials)
Version: 8.0 (Rev: 840293d, Compiled: 2015-05-20)
Hardware: CPU i7-3520, 2.9 GHz, 8 GB RAM, SSD
OS: Linux Mint 17 VM under Windows 7
Compiler: gfortran 4.8.4

-Peter

Re: Performance of ElmerSolver on different platforms

Post by pavel »

Code: Select all

      Start   3: mgdyn_steady
 1/14 Test   #3: mgdyn_steady .....................   Passed    5.30 sec
      Start  19: ConstantBCDisplacement
 2/14 Test  #19: ConstantBCDisplacement ...........   Passed    3.71 sec
      Start  43: diffuser_sa
 3/14 Test  #43: diffuser_sa ......................   Passed    4.73 sec
      Start  64: RotatingBCPoisson3DGeneric
 4/14 Test  #64: RotatingBCPoisson3DGeneric .......   Passed    6.76 sec
      Start  90: levelset3b
 5/14 Test  #90: levelset3b .......................   Passed    3.97 sec
      Start  93: mgdyn_steady_periodic
 6/14 Test  #93: mgdyn_steady_periodic ............   Passed    6.65 sec
      Start  94: mgdyn_harmonic
 7/14 Test  #94: mgdyn_harmonic ...................   Passed    7.54 sec
      Start 117: FlowResNoslip
 8/14 Test #117: FlowResNoslip ....................   Passed    4.45 sec
      Start 129: LimitTemperature
 9/14 Test #129: LimitTemperature .................   Passed    8.71 sec
      Start 136: structmap3
10/14 Test #136: structmap3 .......................   Passed    6.95 sec
      Start 148: diffuser_v2f
11/14 Test #148: diffuser_v2f .....................   Passed    7.57 sec
      Start 191: RichardsDyke2
12/14 Test #191: RichardsDyke2 ....................   Passed    4.33 sec
      Start 211: Step_sst-kw-wf
13/14 Test #211: Step_sst-kw-wf ...................   Passed    8.95 sec
      Start 292: VectorHelmholtzWaveguide
14/14 Test #292: VectorHelmholtzWaveguide .........   Passed    4.16 sec

100% tests passed, 0 tests failed out of 14

Label Time Summary:
benchmark    =  83.77 sec

Total Test time (real) =  83.80 sec
ElmerSolver Version: 8.0 (Rev: 4ed7545, Compiled: 2015-05-20)
Hardware: Intel® Core™ i5-2500 CPU @ 3.30 GHz × 4, 8 GB RAM, HDD
OS: Ubuntu 14.04 LTS 64bit
gcc version 4.8.2 (Ubuntu 4.8.2-19ubuntu1)

- Pavel

Re: Performance of ElmerSolver on different platforms

Post by mika »

benchmark = 81.95 sec

Version: 4ed7545 (May 21, 2015)
CPU Model name: Intel(R) Core(TM) i7-4600U CPU @ 2.10GHz
RHEL 7.1
GNU Fortran (GCC) 4.8.3 20140911 (Red Hat 4.8.3-9)

- Mika

Re: mpi4 - rdw.fi cluster

Post by joni »

Hi,

How do I run the tests with mpirun -n 3, or under Slurm?

MPI4YOU on Ubuntu 18.04 is a work in progress, and I am searching for compatible libraries:
- MPICH 3.3
- AMD libFLAME 1.3
At the moment, from Ubuntu:
- MUMPS
- ParMETIS


Ubuntu Elmer, single AMD 9500 processor, with one core/thread:

Label Time Summary:
benchmark = 186.16 sec*proc (16 tests)
elasticsolve = 2.82 sec*proc (1 test)
matc = 16.48 sec*proc (2 tests)
serial = 186.16 sec*proc (16 tests)
threaded = 50.21 sec*proc (1 test)
whitney = 41.52 sec*proc (4 tests)

Total Test time (real) = 186.93 sec

Code: Select all

joni@mpi1:~$ ldd /usr/bin/ElmerSolver
	linux-vdso.so.1 (0x00007ffd8d3e3000)
	libelmersolver.so => /usr/bin/../lib/elmersolver/libelmersolver.so (0x00007fc60b648000)
	libgfortran.so.4 => /usr/lib/x86_64-linux-gnu/libgfortran.so.4 (0x00007fc60b268000)
	libc.so.6 => /lib/x86_64-linux-gnu/libc.so.6 (0x00007fc60ae70000)
	libmatc.so => /usr/bin/../lib/elmersolver/libmatc.so (0x00007fc60ac48000)
	libfhuti.so => /usr/bin/../lib/elmersolver/libfhuti.so (0x00007fc60aa10000)
	libarpack.so => /usr/bin/../lib/elmersolver/libarpack.so (0x00007fc60a7c0000)
	libopenblas.so.0 => /usr/lib/x86_64-linux-gnu/libopenblas.so.0 (0x00007fc608518000)
	libdl.so.2 => /lib/x86_64-linux-gnu/libdl.so.2 (0x00007fc608310000)
	libdmumps-5.1.2.so => /usr/lib/x86_64-linux-gnu/libdmumps-5.1.2.so (0x00007fc607fa0000)
	libHYPRE_parcsr_ls-2.13.0.so => /usr/lib/x86_64-linux-gnu/libHYPRE_parcsr_ls-2.13.0.so (0x00007fc607c90000)
	libHYPRE_IJ_mv-2.13.0.so => /usr/lib/x86_64-linux-gnu/libHYPRE_IJ_mv-2.13.0.so (0x00007fc607a78000)
	libparpack.so => /usr/bin/../lib/elmersolver/libparpack.so (0x00007fc607830000)
	libmpi_mpifh.so.20 => /usr/lib/x86_64-linux-gnu/libmpi_mpifh.so.20 (0x00007fc6075d8000)
	libmpi.so.20 => /usr/lib/x86_64-linux-gnu/libmpi.so.20 (0x00007fc6072e0000)
	libm.so.6 => /lib/x86_64-linux-gnu/libm.so.6 (0x00007fc606f40000)
	libgcc_s.so.1 => /lib/x86_64-linux-gnu/libgcc_s.so.1 (0x00007fc606d28000)
	libquadmath.so.0 => /usr/lib/x86_64-linux-gnu/libquadmath.so.0 (0x00007fc606ae8000)
	/lib64/ld-linux-x86-64.so.2 (0x00007fc60bf40000)
	libpthread.so.0 => /lib/x86_64-linux-gnu/libpthread.so.0 (0x00007fc6068c8000)
	libmumps_common-5.1.2.so => /usr/lib/x86_64-linux-gnu/libmumps_common-5.1.2.so (0x00007fc606678000)
	libblas.so.3 => /usr/lib/x86_64-linux-gnu/libblas.so.3 (0x00007fc606418000)
	libscalapack-openmpi.so.2.0 => /usr/lib/x86_64-linux-gnu/libscalapack-openmpi.so.2.0 (0x00007fc605c40000)
	liblapack.so.3 => /usr/lib/x86_64-linux-gnu/liblapack.so.3 (0x00007fc6053b8000)
	libHYPRE_utilities-2.13.0.so => /usr/lib/x86_64-linux-gnu/libHYPRE_utilities-2.13.0.so (0x00007fc6051a8000)
	libHYPRE_multivector-2.13.0.so => /usr/lib/x86_64-linux-gnu/libHYPRE_multivector-2.13.0.so (0x00007fc604fa0000)
	libHYPRE_krylov-2.13.0.so => /usr/lib/x86_64-linux-gnu/libHYPRE_krylov-2.13.0.so (0x00007fc604d80000)
	libHYPRE_seq_mv-2.13.0.so => /usr/lib/x86_64-linux-gnu/libHYPRE_seq_mv-2.13.0.so (0x00007fc604b70000)
	libHYPRE_parcsr_mv-2.13.0.so => /usr/lib/x86_64-linux-gnu/libHYPRE_parcsr_mv-2.13.0.so (0x00007fc604940000)
	libHYPRE_parcsr_block_mv-2.13.0.so => /usr/lib/x86_64-linux-gnu/libHYPRE_parcsr_block_mv-2.13.0.so (0x00007fc604710000)
	libHYPRE_DistributedMatrix-2.13.0.so => /usr/lib/x86_64-linux-gnu/libHYPRE_DistributedMatrix-2.13.0.so (0x00007fc604508000)
	libHYPRE_MatrixMatrix-2.13.0.so => /usr/lib/x86_64-linux-gnu/libHYPRE_MatrixMatrix-2.13.0.so (0x00007fc604300000)
	libHYPRE_DistributedMatrixPilutSolver-2.13.0.so => /usr/lib/x86_64-linux-gnu/libHYPRE_DistributedMatrixPilutSolver-2.13.0.so (0x00007fc6040e8000)
	libHYPRE_ParaSails-2.13.0.so => /usr/lib/x86_64-linux-gnu/libHYPRE_ParaSails-2.13.0.so (0x00007fc603ed0000)
	libHYPRE_Euclid-2.13.0.so => /usr/lib/x86_64-linux-gnu/libHYPRE_Euclid-2.13.0.so (0x00007fc603c80000)
	libopen-pal.so.20 => /usr/lib/x86_64-linux-gnu/libopen-pal.so.20 (0x00007fc6039c8000)
	libopen-rte.so.20 => /usr/lib/x86_64-linux-gnu/libopen-rte.so.20 (0x00007fc603740000)
	librt.so.1 => /lib/x86_64-linux-gnu/librt.so.1 (0x00007fc603538000)
	libhwloc.so.5 => /usr/lib/x86_64-linux-gnu/libhwloc.so.5 (0x00007fc6032f8000)
	libpord-5.1.2.so => /usr/lib/x86_64-linux-gnu/libpord-5.1.2.so (0x00007fc6030e0000)
	libutil.so.1 => /lib/x86_64-linux-gnu/libutil.so.1 (0x00007fc602ed8000)
	libnuma.so.1 => /usr/lib/x86_64-linux-gnu/libnuma.so.1 (0x00007fc602cc8000)
	libltdl.so.7 => /usr/lib/x86_64-linux-gnu/libltdl.so.7 (0x00007fc602ab8000)
joni@mpi1:~$ ldd /usr/bin/ElmerSolver_mpi
	linux-vdso.so.1 (0x00007ffe0cd73000)
	libelmersolver.so => /usr/bin/../lib/elmersolver/libelmersolver.so (0x00007f8b9d448000)
	libgfortran.so.4 => /usr/lib/x86_64-linux-gnu/libgfortran.so.4 (0x00007f8b9d068000)
	libc.so.6 => /lib/x86_64-linux-gnu/libc.so.6 (0x00007f8b9cc70000)
	libmatc.so => /usr/bin/../lib/elmersolver/libmatc.so (0x00007f8b9ca48000)
	libfhuti.so => /usr/bin/../lib/elmersolver/libfhuti.so (0x00007f8b9c810000)
	libarpack.so => /usr/bin/../lib/elmersolver/libarpack.so (0x00007f8b9c5c0000)
	libopenblas.so.0 => /usr/lib/x86_64-linux-gnu/libopenblas.so.0 (0x00007f8b9a318000)
	libdl.so.2 => /lib/x86_64-linux-gnu/libdl.so.2 (0x00007f8b9a110000)
	libdmumps-5.1.2.so => /usr/lib/x86_64-linux-gnu/libdmumps-5.1.2.so (0x00007f8b99da0000)
	libHYPRE_parcsr_ls-2.13.0.so => /usr/lib/x86_64-linux-gnu/libHYPRE_parcsr_ls-2.13.0.so (0x00007f8b99a90000)
	libHYPRE_IJ_mv-2.13.0.so => /usr/lib/x86_64-linux-gnu/libHYPRE_IJ_mv-2.13.0.so (0x00007f8b99878000)
	libparpack.so => /usr/bin/../lib/elmersolver/libparpack.so (0x00007f8b99630000)
	libmpi_mpifh.so.20 => /usr/lib/x86_64-linux-gnu/libmpi_mpifh.so.20 (0x00007f8b993d8000)
	libmpi.so.20 => /usr/lib/x86_64-linux-gnu/libmpi.so.20 (0x00007f8b990e0000)
	libm.so.6 => /lib/x86_64-linux-gnu/libm.so.6 (0x00007f8b98d40000)
	libgcc_s.so.1 => /lib/x86_64-linux-gnu/libgcc_s.so.1 (0x00007f8b98b28000)
	libquadmath.so.0 => /usr/lib/x86_64-linux-gnu/libquadmath.so.0 (0x00007f8b988e8000)
	/lib64/ld-linux-x86-64.so.2 (0x00007f8b9dd40000)
	libpthread.so.0 => /lib/x86_64-linux-gnu/libpthread.so.0 (0x00007f8b986c8000)
	libmumps_common-5.1.2.so => /usr/lib/x86_64-linux-gnu/libmumps_common-5.1.2.so (0x00007f8b98478000)
	libblas.so.3 => /usr/lib/x86_64-linux-gnu/libblas.so.3 (0x00007f8b98218000)
	libscalapack-openmpi.so.2.0 => /usr/lib/x86_64-linux-gnu/libscalapack-openmpi.so.2.0 (0x00007f8b97a40000)
	liblapack.so.3 => /usr/lib/x86_64-linux-gnu/liblapack.so.3 (0x00007f8b971b8000)
	libHYPRE_utilities-2.13.0.so => /usr/lib/x86_64-linux-gnu/libHYPRE_utilities-2.13.0.so (0x00007f8b96fa8000)
	libHYPRE_multivector-2.13.0.so => /usr/lib/x86_64-linux-gnu/libHYPRE_multivector-2.13.0.so (0x00007f8b96da0000)
	libHYPRE_krylov-2.13.0.so => /usr/lib/x86_64-linux-gnu/libHYPRE_krylov-2.13.0.so (0x00007f8b96b80000)
	libHYPRE_seq_mv-2.13.0.so => /usr/lib/x86_64-linux-gnu/libHYPRE_seq_mv-2.13.0.so (0x00007f8b96970000)
	libHYPRE_parcsr_mv-2.13.0.so => /usr/lib/x86_64-linux-gnu/libHYPRE_parcsr_mv-2.13.0.so (0x00007f8b96740000)
	libHYPRE_parcsr_block_mv-2.13.0.so => /usr/lib/x86_64-linux-gnu/libHYPRE_parcsr_block_mv-2.13.0.so (0x00007f8b96510000)
	libHYPRE_DistributedMatrix-2.13.0.so => /usr/lib/x86_64-linux-gnu/libHYPRE_DistributedMatrix-2.13.0.so (0x00007f8b96308000)
	libHYPRE_MatrixMatrix-2.13.0.so => /usr/lib/x86_64-linux-gnu/libHYPRE_MatrixMatrix-2.13.0.so (0x00007f8b96100000)
	libHYPRE_DistributedMatrixPilutSolver-2.13.0.so => /usr/lib/x86_64-linux-gnu/libHYPRE_DistributedMatrixPilutSolver-2.13.0.so (0x00007f8b95ee8000)
	libHYPRE_ParaSails-2.13.0.so => /usr/lib/x86_64-linux-gnu/libHYPRE_ParaSails-2.13.0.so (0x00007f8b95cd0000)
	libHYPRE_Euclid-2.13.0.so => /usr/lib/x86_64-linux-gnu/libHYPRE_Euclid-2.13.0.so (0x00007f8b95a80000)
	libopen-pal.so.20 => /usr/lib/x86_64-linux-gnu/libopen-pal.so.20 (0x00007f8b957c8000)
	libopen-rte.so.20 => /usr/lib/x86_64-linux-gnu/libopen-rte.so.20 (0x00007f8b95540000)
	librt.so.1 => /lib/x86_64-linux-gnu/librt.so.1 (0x00007f8b95338000)
	libhwloc.so.5 => /usr/lib/x86_64-linux-gnu/libhwloc.so.5 (0x00007f8b950f8000)
	libpord-5.1.2.so => /usr/lib/x86_64-linux-gnu/libpord-5.1.2.so (0x00007f8b94ee0000)
	libutil.so.1 => /lib/x86_64-linux-gnu/libutil.so.1 (0x00007f8b94cd8000)
	libnuma.so.1 => /usr/lib/x86_64-linux-gnu/libnuma.so.1 (0x00007f8b94ac8000)
	libltdl.so.7 => /usr/lib/x86_64-linux-gnu/libltdl.so.7 (0x00007f8b948b8000)


MPI4YOU Elmer, single AMD 9500 processor, with OMP_NUM_THREADS=6:

joni@mpi1:/mpi/S5/builddir/ELMER-LINUX$ export OMP_NUM_THREADS=6
joni@mpi1:/mpi/S5/builddir/ELMER-LINUX$ ctest -L benchmark
Test project /mpi/S5/builddir/ELMER-LINUX
Start 59: ConstantBCDisplacement
1/16 Test #59: ConstantBCDisplacement ........... Passed 2.73 sec
Start 176: FlowResNoslip
2/16 Test #176: FlowResNoslip .................... Passed 6.92 sec
Start 179: H1BasisEvaluation
3/16 Test #179: H1BasisEvaluation ................ Passed 10.10 sec
Start 231: LimitTemperature
4/16 Test #231: LimitTemperature ................. Passed 14.15 sec
Start 233: LinearFormsAssembly
5/16 Test #233: LinearFormsAssembly .............. Passed 97.08 sec
Start 324: RichardsDyke2
6/16 Test #324: RichardsDyke2 .................... Passed 5.44 sec
Start 332: RotatingBCPoisson3DGeneric
7/16 Test #332: RotatingBCPoisson3DGeneric ....... Passed 9.77 sec
Start 379: Step_sst-kw-wf
8/16 Test #379: Step_sst-kw-wf ................... Passed 11.23 sec
Start 411: VectorHelmholtzWaveguide
9/16 Test #411: VectorHelmholtzWaveguide ......... Passed 4.38 sec
Start 484: diffuser_sa
10/16 Test #484: diffuser_sa ...................... Passed 6.93 sec
Start 486: diffuser_v2f
11/16 Test #486: diffuser_v2f ..................... Passed 12.31 sec
Start 523: levelset3b
12/16 Test #523: levelset3b ....................... Passed 6.33 sec
Start 548: mgdyn_harmonic
13/16 Test #548: mgdyn_harmonic ................... Passed 14.90 sec
Start 555: mgdyn_steady
14/16 Test #555: mgdyn_steady ..................... Passed 9.97 sec
Start 557: mgdyn_steady_periodic
15/16 Test #557: mgdyn_steady_periodic ............ Passed 8.17 sec
Start 611: structmap3
16/16 Test #611: structmap3 ....................... Passed 9.81 sec

100% tests passed, 0 tests failed out of 16

Label Time Summary:
benchmark = 230.22 sec*proc (16 tests)
elasticsolve = 2.73 sec*proc (1 test)
matc = 14.18 sec*proc (2 tests)
serial = 230.22 sec*proc (16 tests)
threaded = 97.08 sec*proc (1 test)
whitney = 37.42 sec*proc (4 tests)

Total Test time (real) = 231.02 sec

Code: Select all

joni@mpi1:/mpi/S5/builddir/ELMER-LINUX$ which ElmerSolver
/mpi/C5/ELMER-LINUX_v2/bin/ElmerSolver
joni@mpi1:/mpi/S5/builddir/ELMER-LINUX$ which ElmerSolver_mpi
/mpi/C5/ELMER-LINUX_v2/bin/ElmerSolver_mpi
joni@mpi1:/mpi/S5/builddir/ELMER-LINUX$ ElmerSolver_mpi
ELMER SOLVER (v 8.4) STARTED AT: 2019/03/05 14:06:12
ParCommInit:  Initialize #PEs:            1
MAIN: 
MAIN: =============================================================
MAIN: ElmerSolver finite element software, Welcome!
MAIN: This program is free software licensed under (L)GPL
MAIN: Copyright 1st April 1995 - , CSC - IT Center for Science Ltd.
MAIN: Webpage http://www.csc.fi/elmer, Email elmeradm@csc.fi
MAIN: Version: 8.4 (Rev: 2b72330d, Compiled: 2019-03-05)
MAIN:  Running one task without MPI parallelization.
MAIN:  Running in parallel with 6 threads per task.
MAIN: =============================================================
ERROR:: ElmerSolver: Unable to find ELMERSOLVER_STARTINFO, can not execute.

joni@mpi1:/mpi/S5/builddir/ELMER-LINUX$ ldd ElmerSolver_mpi
ldd: ./ElmerSolver_mpi: Tiedostoa tai hakemistoa ei ole
joni@mpi1:/mpi/S5/builddir/ELMER-LINUX$ which ElmerSolver_mpi
/mpi/C5/ELMER-LINUX_v2/bin/ElmerSolver_mpi
joni@mpi1:/mpi/S5/builddir/ELMER-LINUX$ ldd /mpi/C5/ELMER-LINUX_v2/bin/ElmerSolver_mpi
	linux-vdso.so.1 (0x00007fff58713000)
	libelmersolver.so => /mpi/C5/ELMER-LINUX_v2/bin/../lib/elmersolver/libelmersolver.so (0x00007f9115740000)
	libgfortran.so.4 => /usr/lib/x86_64-linux-gnu/libgfortran.so.4 (0x00007f9115360000)
	libc.so.6 => /lib/x86_64-linux-gnu/libc.so.6 (0x00007f9114f68000)
	libmatc.so => /mpi/C5/ELMER-LINUX_v2/bin/../lib/elmersolver/libmatc.so (0x00007f9114d18000)
	libfhuti.so => /mpi/C5/ELMER-LINUX_v2/bin/../lib/elmersolver/libfhuti.so (0x00007f9114a08000)
	libarpack.so => /mpi/C5/ELMER-LINUX_v2/bin/../lib/elmersolver/libarpack.so (0x00007f9114790000)
	libblis.so.1 => /mpi3/C5/amd-blis-mt-1.3/lib/libblis.so.1 (0x00007f91142f8000)
	libflame.so => /mpi3/C5/amd-libFLAME-1.3/lib/libflame.so (0x00007f91132f8000)
	libdl.so.2 => /lib/x86_64-linux-gnu/libdl.so.2 (0x00007f91130f0000)
	libparpack.so => /mpi/C5/ELMER-LINUX_v2/bin/../lib/elmersolver/libparpack.so (0x00007f9112e80000)
	libmpifort.so.12 => /usr/local/lib/libmpifort.so.12 (0x00007f9112c48000)
	libm.so.6 => /lib/x86_64-linux-gnu/libm.so.6 (0x00007f91128a8000)
	libgomp.so.1 => /usr/lib/x86_64-linux-gnu/libgomp.so.1 (0x00007f9112678000)
	libgcc_s.so.1 => /lib/x86_64-linux-gnu/libgcc_s.so.1 (0x00007f9112460000)
	libpthread.so.0 => /lib/x86_64-linux-gnu/libpthread.so.0 (0x00007f9112240000)
	/lib64/ld-linux-x86-64.so.2 (0x00007f9116518000)
	libquadmath.so.0 => /usr/lib/x86_64-linux-gnu/libquadmath.so.0 (0x00007f9112000000)
	libmvec.so.1 => /lib/x86_64-linux-gnu/libmvec.so.1 (0x00007f9111dd0000)
	libmpi.so.12 => /usr/local/lib/libmpi.so.12 (0x00007f9111800000)
	libudev.so.1 => /lib/x86_64-linux-gnu/libudev.so.1 (0x00007f91115e0000)
	libxml2.so.2 => /usr/lib/x86_64-linux-gnu/libxml2.so.2 (0x00007f9111218000)
	librt.so.1 => /lib/x86_64-linux-gnu/librt.so.1 (0x00007f9111010000)
	libicuuc.so.60 => /usr/lib/x86_64-linux-gnu/libicuuc.so.60 (0x00007f9110c58000)
	libz.so.1 => /lib/x86_64-linux-gnu/libz.so.1 (0x00007f9110a38000)
	liblzma.so.5 => /lib/x86_64-linux-gnu/liblzma.so.5 (0x00007f9110810000)
	libicudata.so.60 => /usr/lib/x86_64-linux-gnu/libicudata.so.60 (0x00007f910ec60000)
	libstdc++.so.6 => /usr/lib/x86_64-linux-gnu/libstdc++.so.6 (0x00007f910e8d0000)
	

Re: Performance of ElmerSolver on different platforms

Post by raback »

Hi joni,

I don't think any of the benchmark tests benefits from multithreading. However, you could try to run some threaded solvers with

Code: Select all

  ctest -L threaded
and compare that with 1 vs. 6 threads.
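For example, a minimal comparison (assuming an OpenMP-enabled build, where the thread count is controlled through OMP_NUM_THREADS) could look like

Code: Select all

export OMP_NUM_THREADS=1
ctest -L threaded
export OMP_NUM_THREADS=6
ctest -L threaded

and then compare the 'threaded' label times in the two summaries.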

-Peter

Re: Performance of ElmerSolver on different platforms

Post by joni »

Actually, with 6 threads the benchmarks were much, much slower...

It could be a good idea to have one MPI test that uses several processes with threads, to test overall performance, and the same test for a single process with threads. And why not collect the performance data automatically into a database? But there are more important problems.

I think the HPCC benchmark and its results database are a good starting point for system-level comparison, and Elmer would also need some smaller test for comparing the available BLAS, LAPACK, ScaLAPACK and ARPACK libraries.