This section explains how to build Nektar++ from the source-code package.
Nektar++ uses a number of third-party libraries. Some of these are required, others are optional. It is generally more straightforward to use versions of these libraries supplied pre-packaged for your operating system, but if you run into difficulties with compilation errors or failing regression tests, the Nektar++ build system can automatically build tried-and-tested versions of these libraries for you. This requires enabling the relevant options in the CMake configuration.
There are two ways to obtain the source code for Nektar++: download the source-code archive from the Nektar++ downloads page, or clone the git repository hosted at gitlab.nektar.info. If you initially cloned the repository using anonymous (HTTPS) access, you can later switch the existing checkout to authenticated (SSH) access with:

    git remote set-url origin git@gitlab.nektar.info:nektar/nektar.git
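For example, a clone of the repository into a directory named nektar++ might look as follows (a sketch; the HTTPS URL mirrors the SSH address above):

    git clone https://gitlab.nektar.info/nektar/nektar.git nektar++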
Nektar++ uses a number of external programs and libraries for some or all of its functionality. Some of these are required and must be installed prior to compiling Nektar++; most of these are available as pre-built system packages on most Linux distributions, or can be installed manually by a user. Typically, the development packages, with a -dev or -devel suffix, are required to compile code against these libraries. Other dependencies are optional and required only for specific features, or can be downloaded and compiled for use with Nektar++ automatically (but not installed system-wide).
Installation columns: Sys. = available as a system package, User = can be installed manually by the user, Auto. = can be built automatically by the Nektar++ build system.

Package         | Req. | Sys. | User | Auto. | Note
C++ compiler    |  ✓   |  ✓   |      |       | gcc, icc, etc., supporting C++11
CMake > 2.8.11  |  ✓   |  ✓   |  ✓   |       | Ncurses GUI optional
BLAS            |  ✓   |  ✓   |  ✓   |  ✓    | Or MKL, ACML, OpenBLAS
LAPACK          |  ✓   |  ✓   |  ✓   |  ✓    |
Boost >= 1.56   |  ✓   |  ✓   |  ✓   |  ✓    | Compile with iostreams
TinyXML         |  ✓   |  ✓   |  ✓   |  ✓    | For reading XML input files
Scotch          |  ✓   |  ✓   |  ✓   |  ✓    | Required for multi-level static condensation, highly recommended
METIS           |      |  ✓   |  ✓   |  ✓    | Alternative mesh partitioning
FFTW > 3.0      |      |  ✓   |  ✓   |  ✓    | For high-performance FFTs
ARPACK > 2.0    |      |  ✓   |  ✓   |  ✓    | For Arnoldi algorithms
MPI             |      |  ✓   |  ✓   |       | For parallel execution (OpenMPI, MPICH, Intel MPI, etc.)
GSMPI           |      |      |      |  ✓    | For parallel execution
HDF5            |      |  ✓   |  ✓   |  ✓    | For large-scale parallel I/O (requires CMake > 3.1)
OpenCascade CE  |      |  ✓   |  ✓   |  ✓    | For mesh generation and optimisation
PETSc           |      |  ✓   |  ✓   |  ✓    | Alternative linear solvers
PT-Scotch       |      |  ✓   |  ✓   |  ✓    | Required when MPI enabled
Tetgen          |      |  ✓   |  ✓   |  ✓    | For 3D mesh generation
Triangle        |      |  ✓   |  ✓   |  ✓    | For 2D mesh generation
VTK > 5.8       |      |  ✓   |  ✓   |       | Not required to convert field output files to VTK, only mesh files
Open a terminal.
If you have downloaded the tarball, first unpack it:
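A minimal sketch, assuming the 5.0.0 release tarball; adjust the file name to the version you downloaded:

    tar -zxvf nektar++-5.0.0.tar.gz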
Change into the nektar++ source code directory:
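For example, assuming the unpacked 5.0.0 tarball above:

    cd nektar++-5.0.0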
From a terminal:
Create a build subdirectory and enter it:
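For example, run from the top-level source directory:

    mkdir -p build && cd build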
Run the CMake configuration utility and configure the build by pressing c.
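The ncurses front-end to CMake is the usual choice here; a sketch, run from inside the build directory:

    ccmake ../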
Select the components of Nektar++ (prefixed with NEKTAR_BUILD_) you would like to build. Disabling solvers which you do not require will speed up the build process.
Select the optional libraries (prefixed with NEKTAR_USE_) you would like to use for additional functionality.
Select the third-party libraries you wish to have compiled automatically (prefixed with THIRDPARTY_BUILD_). Some of these will be automatically enabled if not found on your system.
Choose the installation location by adjusting CMAKE_INSTALL_PREFIX. By default, this will be a dist subdirectory within the build directory, which is satisfactory for most users initially. A full list of configuration options can be found in Section 1.3.5.
Selecting THIRDPARTY_BUILD_ options will request CMake to automatically download third-party libraries and compile them within the Nektar++ directory. If you have administrative access to your machine, it is recommended to install the libraries system-wide through your package-management system instead. If you have installed additional system packages since running CMake, you may need to wipe your build directory and rerun CMake for them to be detected.
Press c to configure the build. If errors arise relating to missing libraries, review the THIRDPARTY_BUILD_ selections in the configuration step above, or install the missing libraries manually or from system packages.
Press c again until the option g to generate build files appears. Press g to generate the build files and exit CMake.
During the build, missing third-party libraries will be automatically downloaded, configured and built in the Nektar++ build directory.
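The compilation step itself is not shown above; a minimal sketch, run from the build directory (the -j option sets the number of parallel compile jobs and may be omitted):

    make -j4 install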
Nektar++ uses a number of external programs and libraries for some or all of its functionality. Some of these are required and must be installed prior to compiling Nektar++; most of these are available from MacPorts (www.macports.org) or can be installed manually by a user. Other dependencies are optional and required only for specific features, or can be downloaded and compiled for use with Nektar++ automatically (but not installed system-wide).
Installation columns: MacPorts = MacPorts package name (where available), User = can be installed manually by the user, Auto. = can be built automatically by the Nektar++ build system.

Package         | Req. | MacPorts | User | Auto. | Note
Xcode           |  ✓   |          |      |       | Provides developer tools
CMake > 2.8.11  |  ✓   | cmake    |  ✓   |       | Ncurses GUI optional
BLAS            |  ✓   |          |      |       | Part of Xcode
LAPACK          |  ✓   |          |      |       | Part of Xcode
Boost >= 1.56   |  ✓   | boost    |  ✓   |  ✓    | Compile with iostreams
TinyXML         |  ✓   | tinyxml  |  ✓   |  ✓    |
Scotch          |  ✓   | scotch   |  ✓   |  ✓    | Required for multi-level static condensation, highly recommended
METIS           |      | metis    |  ✓   |  ✓    | Alternative mesh partitioning
FFTW > 3.0      |      | fftw-3   |  ✓   |  ✓    | For high-performance FFTs
ARPACK > 2.0    |      | arpack   |  ✓   |       | For Arnoldi algorithms
OpenMPI         |      | openmpi  |      |       | For parallel execution
GSMPI           |      |          |      |  ✓    | For parallel execution
HDF5            |      |          |  ✓   |  ✓    | For large-scale parallel I/O (requires CMake > 3.1)
OpenCascade CE  |      |          |  ✓   |  ✓    | For mesh generation and optimisation
PETSc           |      | petsc    |  ✓   |  ✓    | Alternative linear solvers
PT-Scotch       |      |          |  ✓   |  ✓    | Required when MPI enabled
Tetgen          |      |          |  ✓   |  ✓    | For 3D mesh generation
Triangle        |      |          |  ✓   |  ✓    | For 2D mesh generation
VTK > 5.8       |      | vtk      |  ✓   |       | Not required to convert field output files to VTK, only mesh files
Package names are given in the table above. Similar packages also exist in other package managers such as Homebrew.
Open a terminal (Applications->Utilities->Terminal).
If you have downloaded the tarball, first unpack it:
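A minimal sketch, assuming the 5.0.0 release tarball; adjust the file name to the version you downloaded:

    tar -zxvf nektar++-5.0.0.tar.gz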
Change into the nektar++ source code directory:
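For example, assuming the unpacked 5.0.0 tarball above:

    cd nektar++-5.0.0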
From a terminal (Applications->Utilities->Terminal):
Create a build subdirectory and enter it:
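For example, run from the top-level source directory:

    mkdir -p build && cd build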
Run the CMake configuration utility and configure the build. Use the arrow keys to navigate the options and ENTER to select/edit an option.
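The ncurses front-end to CMake is the usual choice here; a sketch, run from inside the build directory:

    ccmake ../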
Select the components of Nektar++ (prefixed with NEKTAR_BUILD_) you would like to build. Disabling solvers which you do not require will speed up the build process.
Select the optional libraries (prefixed with NEKTAR_USE_) you would like to use for additional functionality.
Select the third-party libraries you wish to have compiled automatically (prefixed with THIRDPARTY_BUILD_).
Choose the installation location by adjusting CMAKE_INSTALL_PREFIX. By default, this will be a dist subdirectory within the build directory, which is satisfactory for most users initially. A full list of configuration options can be found in Section 1.3.5.
Selecting THIRDPARTY_BUILD_ options will request CMake to automatically download third-party libraries and compile them within the Nektar++ directory. If you have administrative access to your machine, it is recommended to install the libraries system-wide through MacPorts.
Press c to configure the build. If errors arise relating to missing libraries (variables set to NOTFOUND), review the THIRDPARTY_BUILD_ selections in the previous step or install the missing libraries manually or through MacPorts.
Press c again until the option g to generate build files appears. Press g to generate the build files and exit CMake.
During the build, missing third-party libraries will be automatically downloaded, configured and built in the Nektar++ build directory.
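As on Linux, the compilation step itself is not shown above; a minimal sketch, run from the build directory:

    make -j4 install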
Windows compilation is supported, but the build process is somewhat convoluted at present. As such, only serial execution is supported, together with a minimal set of additional build packages. These can either be installed by the user, or automatically during the build process.
Installation columns: User = can be installed manually by the user, Auto. = can be built automatically by the Nektar++ build system.

Package           | Req. | User | Auto. | Note
MS Visual Studio  |  ✓   |  ✓   |       | 2012, 2013 and 2015 known working
CMake ≥ 3.0       |  ✓   |  ✓   |       |
BLAS              |  ✓   |  ✓   |  ✓    |
LAPACK            |  ✓   |  ✓   |  ✓    |
Boost ≥ 1.56      |  ✓   |  ✓   |  ✓    | Compile with iostreams
Precompiled Boost binaries are typically installed into C:\local\boost_1_61_0. If you use these libraries, you will need to:
- rename libs-msvc14.0 to lib
- inside the lib directory, create duplicates of boost_zlib.dll and boost_bzip2.dll called zlib.dll and libbz2.dll respectively
- add a BOOST_HOME environment variable. To do so, navigate to Control Panel > System and Security > System, select Advanced System Settings, and in the Advanced tab click Environment Variables. In the System Variables box, click New. In the New System Variable window, type BOOST_HOME next to Variable name and C:\local\boost_1_61_0 next to Variable value.
Unpack nektar++-5.0.0.zip.
Note that some Windows versions do not handle paths containing special characters such as ++ in the name. If you think that your Windows version cannot handle paths containing special characters, you should rename nektar++-5.0.0 to nektar-5.0.0.
Create a builds directory within the nektar++-5.0.0 subdirectory.
Change into the builds directory and run the CMake graphical utility:
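A sketch, assuming CMake's cmake-gui front-end is on your PATH and is pointed at the parent source directory:

    cmake-gui ..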
To build in parallel with, for example, 12 processors, issue:
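A sketch, assuming CMake generated a Visual Studio solution containing an INSTALL project and that the Release configuration is being built:

    msbuild INSTALL.vcxproj /p:Configuration=Release /m:12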
After the build and installation process is complete, the executables will be installed in builds\dist\bin.
To run the executables from a command prompt, you will need to modify your system PATH to include the library directories where DLLs are stored. To do this, navigate to Control Panel > System and Security > System, select Advanced System Settings, and in the Advanced tab click Environment Variables. In the System Variables box, select Path and click Edit. To the end of this list, add the full paths to the directories:
- builds\dist\lib\nektar++-5.0.0
- builds\dist\bin
- C:\local\boost_1_61_0\lib
To run the test suite, open a command prompt, change into the builds directory, and then issue the command:
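A sketch of the test invocation; the Release configuration matches the msbuild example above:

    ctest -C Release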
This section describes the main configuration options which can be set when building Nektar++. The default options should work on almost all systems, but additional features (such as parallelisation and specialist libraries) can be enabled if needed.
The first set of options specify the components of the Nektar++ toolkit to compile. Some options are dependent on others being enabled, so the available options may change.
Components of the Nektar++ package can be selected using the following options:
NEKTAR_BUILD_DEMOS
(Recommended)
Compiles the demonstration programs. These are primarily used by the regression testing suite to verify the Nektar++ library, but also provide an example of the basic usage of the framework.
NEKTAR_BUILD_DOC
Compiles the Doxygen documentation for the code. This will be put in the build directory.
NEKTAR_BUILD_LIBRARY
(Required)
Compiles the Nektar++ framework libraries. This is required for all other options.
NEKTAR_BUILD_PYTHON
Installs the Python wrapper to Nektar++. Requires running the following command after installing Nektar++ in order to install the Python package for the current user:
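A sketch of the per-user installation, assuming the nekpy-install-user make target provided by recent Nektar++ releases (run from the build directory after installation):

    make nekpy-install-user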
Alternatively, the Python package can be installed for all users by running the following command with appropriate privileges:
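Again, a sketch assuming the corresponding system-wide make target exists in your version:

    make nekpy-install-system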
NEKTAR_BUILD_SOLVERS
(Recommended)
Compiles the solvers distributed with the Nektar++ framework.
If enabling NEKTAR_BUILD_SOLVERS, individual solvers can be enabled or disabled. See Part III for the list of available solvers. You can disable solvers which are not required to reduce compilation time. See the NEKTAR_SOLVER_X option.
NEKTAR_BUILD_TESTS
(Recommended)
Compiles the testing program used to verify the Nektar++ framework.
NEKTAR_BUILD_TIMINGS
Compiles programs used for timing Nektar++ operations.
NEKTAR_BUILD_UNIT_TESTS
Compiles tests for checking the core library functions.
NEKTAR_BUILD_UTILITIES
Compiles utilities for pre- and post-processing simulation data, including the mesh
conversion and generation tool NekMesh
and the FieldConvert
post-processing
utility.
NEKTAR_SOLVER_X
Enable compilation of the ’X’ solver.
NEKTAR_UTILITY_X
Enable compilation of the ’X’ utility.
A number of ThirdParty libraries are required by Nektar++. There are also optional libraries which provide additional functionality. These can be selected using the following options:
NEKTAR_USE_ARPACK
Build Nektar++ with support for ARPACK. This provides routines used for linear stability analyses. Alternative Arnoldi algorithms are also implemented directly in Nektar++.
NEKTAR_USE_CCM
Use the ccmio library provided with the Star-CCM package for reading ccm files. This option is required as part of NekMesh if you wish to convert a Star-CCM mesh into the Nektar format. It is possible to read a Tecplot plt file from Star-CCM but this output currently needs to be converted to ascii format using the Tecplot package.
NEKTAR_USE_CWIPI
Use the CWIPI library for enabling inter-process communication between two solvers. Solvers may also interface with third-party solvers using this package.
NEKTAR_USE_FFTW
Build Nektar++ with support for FFTW for performing Fast Fourier Transforms (FFTs). This is used only when using domains with homogeneous coordinate directions.
NEKTAR_USE_HDF5
Build Nektar++ with support for HDF5. This enables input/output in the HDF5
parallel file format, which can be very efficient for large numbers of processes.
HDF5 output can be enabled by using a command-line option or in the SOLVERINFO
section of the XML file. This option requires that Nektar++ be built with MPI
support with NEKTAR_USE_MPI
enabled and that HDF5 is compiled with MPI
support.
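For illustration, HDF5 output can be selected at run time with the --io-format command-line option; a sketch using the incompressible Navier-Stokes solver and a hypothetical session file name:

    IncNavierStokesSolver session.xml --io-format Hdf5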
NEKTAR_USE_MESHGEN
Build the NekMesh utility with support for generating meshes from CAD geometries. This enables use of the OpenCascade Community Edition library, as well as Triangle and Tetgen.
NEKTAR_USE_METIS
Build Nektar++ with support for the METIS graph partitioning library. This provides an alternative mesh partitioning algorithm to SCOTCH for parallel simulations.
NEKTAR_USE_MPI
(Recommended)
Build Nektar++ with MPI parallelisation. This allows solvers to be run in serial or parallel.
NEKTAR_USE_PETSC
Build Nektar++ with support for the PETSc package for solving linear systems.
NEKTAR_USE_PYTHON3
(Requires NEKTAR_BUILD_PYTHON)
Enables the generation of Python3 interfaces.
NEKTAR_USE_SCOTCH
(Recommended)
Build Nektar++ with support for the SCOTCH graph partitioning library. This provides both a mesh partitioning algorithm for parallel simulations and support for multi-level static condensation, so it is highly recommended and enabled by default. However, for systems that do not meet the SCOTCH build requirements (e.g. Windows), this can be disabled.
NEKTAR_USE_SYSTEM_BLAS_LAPACK
(Recommended)
On Linux systems, use the default BLAS and LAPACK library on the system. This may not be the implementation offering the highest performance for your architecture, but it is the most likely to work without problem.
NEKTAR_USE_VTK
Build Nektar++ with support for VTK libraries. This is only needed for specialist utilities and is not needed for general use.
The THIRDPARTY_BUILD_X
options select which third-party libraries are automatically built
during the Nektar++ build process. Below are the choices of X:
ARPACK
Library of iterative Arnoldi algorithms.
BLAS_LAPACK
Library of linear algebra routines.
BOOST
The Boost libraries are frequently provided by the operating system, so automatic compilation is not enabled by default. If you do not have Boost on your system, you can enable this to have Boost configured automatically.
CCMIO
I/O library for the Star-CCM+ format.
CWIPI
Library for inter-process exchange of data between different solvers.
FFTW
Fast-Fourier transform library.
GSMPI
(MPI-only) Parallel communication library.
HDF5
Hierarchical Data Format v5 library for structured data storage.
METIS
A graph partitioning library used for mesh partitioning when Nektar++ is run in parallel.
OCE
OpenCascade Community Edition 3D modelling library.
PETSC
A package for the parallel solution of linear algebra systems.
SCOTCH
A graph partitioning library used for mesh partitioning when Nektar++ is run in parallel, and reordering routines that are used in multi-level static condensation.
TETGEN
3D tetrahedral meshing library.
TINYXML
Library for reading and writing XML files.
TRIANGLE
2D triangular meshing library.
There are also a number of additional options to fine-tune the build:
NEKTAR_DISABLE_BACKUPS
By default, Nektar++ solvers and the FieldConvert utility will not overwrite any generated field files or output files if they find an existing file with the same name. Instead, the existing file will either be moved to a backup file or you will be prompted to overwrite it. If you do not want this behaviour, enabling this option will cause all pre-existing output to be overwritten silently.
NEKTAR_TEST_ALL
Enables an extra set of more substantial and long-running tests.
NEKTAR_TEST_USE_HOSTFILE
By default, MPI tests are run directly with the mpiexec
command together with
the number of cores. If your MPI installation requires a hostfile, enabling this
option adds the command line argument -hostfile hostfile
to the command
line arguments when tests are run with ctest
or the Tester
executable.