This section explains how to build Nektar++ from the source-code package.
Nektar++ uses a number of third-party libraries. Some of these are required, others are optional. It is generally more straightforward to use versions of these libraries supplied pre-packaged for your operating system, but if you run into difficulties with compilation errors or failing regression tests, the Nektar++ build system can automatically build tried-and-tested versions of these libraries for you. This requires enabling the relevant options in the CMake configuration.
There are two ways to obtain the source code for Nektar++: download the compressed source-code archive (tarball) from the Nektar++ website, or clone the public git repository hosted at gitlab.nektar.info. If you clone the repository and later wish to push changes over SSH, the remote can be switched with:
git remote set-url origin git@gitlab.nektar.info:nektar/nektar.git
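As an illustration, a minimal sketch of cloning the repository over HTTPS (the host and repository path are taken from the SSH URL above; this creates a directory named nektar):

    git clone https://gitlab.nektar.info/nektar/nektar.git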
Nektar++ uses a number of external programs and libraries for some or all of its functionality. Some of these are required and must be installed prior to compiling Nektar++; most of these are available as pre-built system packages on most Linux distributions, or can be installed manually by a user. Typically, the development packages, with a -dev or -devel suffix, are required to compile code against these libraries. Others are optional and required only for specific features, or can be downloaded and compiled for use with Nektar++ automatically (but not installed system-wide).
The table below lists these dependencies. The Sys., User and Auto. columns under Installation indicate, respectively, whether the package can be installed as a system package, built manually by a user, or built automatically by the Nektar++ build system.

Package | Req. | Sys. | User | Auto. | Note
C++ compiler | ✓ | ✓ | | | gcc, icc, etc., supporting C++11
CMake ≥ 3.5.1 | ✓ | ✓ | ✓ | | Ncurses GUI optional
BLAS | ✓ | ✓ | ✓ | ✓ | Or MKL, ACML, OpenBLAS
LAPACK | ✓ | ✓ | ✓ | ✓ |
Boost ≥ 1.56 | ✓ | ✓ | ✓ | ✓ | Compile with iostreams
TinyXML | ✓ | ✓ | ✓ | ✓ | For reading XML input files
Scotch | ✓ | ✓ | ✓ | ✓ | Required for multi-level static condensation, highly recommended
METIS | | ✓ | ✓ | ✓ | Alternative mesh partitioning
FFTW > 3.0 | | ✓ | ✓ | ✓ | For high-performance FFTs
ARPACK > 2.0 | | ✓ | ✓ | ✓ | For Arnoldi algorithms
MPI | | ✓ | ✓ | | For parallel execution (OpenMPI, MPICH, Intel MPI, etc.)
GSMPI | | | | ✓ | For parallel execution
HDF5 | | ✓ | ✓ | ✓ | For large-scale parallel I/O (requires CMake > 3.1)
OpenCascade CE | | ✓ | ✓ | ✓ | For mesh generation and optimisation
PETSc | | ✓ | ✓ | ✓ | Alternative linear solvers
PT-Scotch | | ✓ | ✓ | ✓ | Required when MPI enabled
Tetgen | | ✓ | ✓ | ✓ | For 3D mesh generation
Triangle | | ✓ | ✓ | ✓ | For 2D mesh generation
VTK > 5.8 | | ✓ | ✓ | | Not required to convert field output files to VTK, only mesh files
Open a terminal.
If you have downloaded the tarball, first unpack it and change into the nektar++ source-code directory; a sketch of the commands is given below.
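Assuming the 5.3.0 release (adjust the version number to match the tarball you downloaded), the commands would look similar to:

    tar -zxvf nektar++-5.3.0.tar.gz
    cd nektar++-5.3.0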
From a terminal:
Create a build subdirectory within the source tree, enter it, and launch the curses CMake front-end (ccmake), as sketched below.
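A minimal sketch of these commands, assuming you are in the top-level Nektar++ source directory and have the ccmake front-end installed:

    mkdir -p build && cd build
    ccmake ../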
Press c to begin configuration, and use the arrow keys to navigate the options and ENTER to select or edit an option. During configuration:
- Select the components of Nektar++ (prefixed with NEKTAR_BUILD_) you would like to build. Disabling solvers which you do not require will speed up the build process.
- Select the optional libraries you would like to use (prefixed with NEKTAR_USE_) for additional functionality.
- Select the third-party libraries you wish to have compiled automatically (prefixed with THIRDPARTY_BUILD_). Some of these will be automatically enabled if not found on your system.
- Choose the installation location by adjusting CMAKE_INSTALL_PREFIX. By default, this will be a dist subdirectory within the build directory, which is satisfactory for most users initially.
A full list of configuration options can be found in Section 1.3.5.
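If you prefer to set the installation location non-interactively, it can also be passed to CMake on the command line; a sketch, in which the chosen prefix is purely illustrative:

    cmake -DCMAKE_INSTALL_PREFIX=$HOME/nektar++/dist ../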
Selecting THIRDPARTY_BUILD_ options will request CMake to automatically download the corresponding third-party libraries and compile them within the Nektar++ directory. If you have administrative access to your machine, it is instead recommended to install these libraries system-wide through your package-management system. If you have installed additional system packages since running CMake, you may need to wipe your build directory and rerun CMake for them to be detected.
Press c to configure the build. If errors arise relating to missing libraries, review the THIRDPARTY_BUILD_ selections made in the configuration step above, or install the missing libraries manually or from system packages.
When the configuration completes without errors, press c again until the option g to generate build files appears. Press g to generate the build files and exit CMake.
Compile and install the code, then run the test suite to verify the build; a sketch of the commands is given below. During the build, missing third-party libraries will be automatically downloaded, configured and built in the Nektar++ build directory.
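A minimal sketch of the compile, install and test commands, issued from within the build directory (the number of parallel build jobs is illustrative):

    make -j4 install
    ctest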
Nektar++ uses a number of external programs and libraries for some or all of its functionality. Some of these are required and must be installed prior to compiling Nektar++; most of these are available on MacPorts (www.macports.org) or can be installed manually by a user. Others are optional and required only for specific features, or can be downloaded and compiled for use with Nektar++ automatically (but not installed system-wide).
The table below lists these dependencies. The MacPorts, User and Auto. columns under Installation indicate, respectively, the MacPorts package name (where one exists), whether the package can be built manually by a user, and whether it can be built automatically by the Nektar++ build system.

Package | Req. | MacPorts | User | Auto. | Note
Xcode | ✓ | | | | Provides developer tools
CMake ≥ 3.5.1 | ✓ | cmake | ✓ | | Ncurses GUI optional
BLAS | ✓ | | | | Part of Xcode
LAPACK | ✓ | | | | Part of Xcode
Boost ≥ 1.56 | ✓ | boost | ✓ | ✓ | Compile with iostreams
TinyXML | ✓ | tinyxml | ✓ | ✓ |
Scotch | ✓ | scotch | ✓ | ✓ | Required for multi-level static condensation, highly recommended
METIS | | metis | ✓ | ✓ | Alternative mesh partitioning
FFTW > 3.0 | | fftw-3 | ✓ | ✓ | For high-performance FFTs
ARPACK > 2.0 | | arpack | ✓ | | For Arnoldi algorithms
OpenMPI | | openmpi | | | For parallel execution
GSMPI | | | | ✓ | For parallel execution
HDF5 | | | ✓ | ✓ | For large-scale parallel I/O (requires CMake > 3.1)
OpenCascade CE | | | ✓ | ✓ | For mesh generation and optimisation
PETSc | | petsc | ✓ | ✓ | Alternative linear solvers
PT-Scotch | | | ✓ | ✓ | Required when MPI enabled
Tetgen | | | ✓ | ✓ | For 3D mesh generation
Triangle | | | ✓ | ✓ | For 2D mesh generation
VTK > 5.8 | | vtk | ✓ | | Not required to convert field output files to VTK, only mesh files
Package names are given in the table above. Similar packages also exist in other package managers such as Homebrew.
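As an illustration only (the exact set of packages is a matter of which features you need; package names are taken from the table above), the commonly used dependencies could be installed with something like:

    sudo port install cmake boost tinyxml scotch metis fftw-3 arpack openmpi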
Open a terminal (Applications->Utilities->Terminal).
If you have downloaded the tarball, first unpack it and change into the nektar++ source-code directory, using the same commands as in the Linux instructions above.
From a terminal (Applications->Utilities->Terminal):
Create a build subdirectory within the source tree, enter it, and launch the curses CMake front-end (ccmake), as in the Linux instructions above. Press c to begin configuration, and use the arrow keys to navigate the options and ENTER to select or edit an option. During configuration:
- Select the components of Nektar++ (prefixed with NEKTAR_BUILD_) you would like to build. Disabling solvers which you do not require will speed up the build process.
- Select the optional libraries you would like to use (prefixed with NEKTAR_USE_) for additional functionality.
- Select the third-party libraries you wish to have compiled automatically (prefixed with THIRDPARTY_BUILD_).
- Choose the installation location by adjusting CMAKE_INSTALL_PREFIX. By default, this will be a dist subdirectory within the build directory, which is satisfactory for most users initially.
A full list of configuration options can be found in Section 1.3.5.
Selecting THIRDPARTY_BUILD_ options will request CMake to automatically download the corresponding third-party libraries and compile them within the Nektar++ directory. If you have administrative access to your machine, it is instead recommended to install these libraries system-wide through MacPorts.
Press c to configure the build. If errors arise relating to missing libraries (variables set to NOTFOUND), review the THIRDPARTY_BUILD_ selections in the previous step, or install the missing libraries manually or through MacPorts.
When the configuration completes without errors, press c again until the option g to generate build files appears. Press g to generate the build files and exit CMake.
Compile and install the code, then run the test suite to verify the build, using the same make and ctest commands as in the Linux instructions above. During the build, missing third-party libraries will be automatically downloaded, configured and built in the Nektar++ build directory.
Windows compilation is supported, but there are some complexities with building additional features on this platform at present. As such, only builds with a minimal set of additional packages are currently supported. These can either be installed by the user, or automatically as part of the build process. Support has recently been added for building with MPI on Windows, which enables parallel computations to be carried out with Nektar++ on Windows where previously only sequential computations were supported.
The table below lists these dependencies. The User and Auto. columns under Installation indicate, respectively, whether the package can be installed manually by the user or automatically as part of the build process.

Package | Req. | User | Auto. | Note
MS Visual Studio | ✓ | ✓ | | 2015, 2017 and 2019 known working
CMake ≥ 3.5.1 | ✓ | ✓ | | 3.16+ recommended, see info below
BLAS | ✓ | ✓ | ✓ |
LAPACK | ✓ | ✓ | ✓ |
Boost ≥ 1.61 | ✓ | ✓ | ✓ | Recommend installing from binaries
Microsoft MPI ≥ 10.1.2 | | ✓ | | Required for parallel execution. Install both runtime and SDK
If installing Boost from the pre-built Windows binaries, these install into C:\local\boost_<version> by default. We recommend installing a specific version of the binaries depending on the version of Visual Studio you are using; the following are known to work with the Nektar++ build:
- boost_1_61_0-msvc-14.0-64.exe (Visual Studio 2015) from http://sourceforge.net/projects/boost/files/boost-binaries/1.61.0/
- boost_1_68_0-msvc-14.1-64.exe (Visual Studio 2017) from http://sourceforge.net/projects/boost/files/boost-binaries/1.68.0/
- boost_1_72_0-msvc-14.2-64.exe (Visual Studio 2019) from http://sourceforge.net/projects/boost/files/boost-binaries/1.72.0/
If you use these libraries, you will need to:
Add a BOOST_HOME environment variable. To do so, click the Start menu and type ‘env’; you should be presented with an “Edit the system environment variables” option. Alternatively, from the Start menu, navigate to Settings > System > About > System info (under Related Settings on the right-hand panel), select Advanced System Settings, and in the Advanced tab click the Environment Variables button. In the System variables box, click New. In the New System Variable window, type BOOST_HOME next to Variable name and C:\local\<boost_dir> next to Variable value, where <boost_dir> corresponds to the directory that Boost has been installed to, based on the Boost version you have installed (e.g. boost_1_61_0, boost_1_68_0 or boost_1_72_0).
Unpack nektar++-5.3.0.zip. Note that some Windows versions do not recognise the path of a folder which includes special characters, such as the ++ in the name. If you are not using Windows 10 and think that your Windows version cannot handle paths containing special characters, you should rename nektar++-5.3.0 to nektar-5.3.0.
Create a build directory within the nektar++-5.3.0 subdirectory. If you cloned the source code from the git repository, your Nektar++ subdirectory will be called nektar rather than nektar++-5.3.0.
From a Visual Studio command prompt, change into the build directory and run CMake to generate the build files.
You need to set the generator to the correct Visual Studio version using the -G switch
on the command line, e.g. for VS2019:
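A sketch of this command, assuming a VS2019 build and that it is issued from within the build directory:

    cmake -G "Visual Studio 16 2019" ..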
You can see the list of available generators using cmake --help. For VS2017 use "Visual Studio 15 2017 Win64" and for VS2015 use "Visual Studio 14 2015 Win64".
If you want to build a parallel version of Nektar++ with MPI support, you need to add the -DNEKTAR_USE_MPI=ON switch to the cmake command, e.g.:
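For example, again assuming VS2019 (the generator should match your Visual Studio version as described above):

    cmake -G "Visual Studio 16 2019" -DNEKTAR_USE_MPI=ON ..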
If you experience any issues with CMake finding pre-installed Boost binaries, ensure that you are working in a Visual Studio command prompt that was opened after you installed Boost and set up the BOOST_HOME environment variable.
To build in parallel with, for example, 12 processors, issue:
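A sketch of the build command, assuming the Visual Studio generator and a Release configuration; INSTALL.vcxproj is the install target generated by CMake, and /m sets the number of parallel processes:

    msbuild INSTALL.vcxproj /p:Configuration=Release /m:12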
After the build and installation process is complete, the executables will be available in build\dist\bin.
To use these executables, you need to modify your system PATH to include the bin directory and the library directories where DLLs are stored. To do this, click the Start menu and type ‘env’; you should be presented with an “Edit the system environment variables” option. Alternatively, navigate to Settings > System > About > System info (under Related Settings on the right-hand panel), select Advanced System Settings, and in the Advanced tab click the Environment Variables button. In the System Variables box, select Path and click Edit. Add the full paths to the following directories to the end of the list of paths shown in the “Edit environment variable” window:
nektar++-5.3.0\build\dist\lib\nektar++-5.3.0
nektar++-5.3.0\build\dist\bin
nektar++-5.3.0\ThirdParty
C:\local\boost_<boost_version>\<boost_lib_dir>
where boost_<boost_version> is the directory where the Boost binaries were installed and <boost_lib_dir> is the name of the library directory within this location, e.g. lib64-msvc-14.2 or similar, depending on the version of the Boost binaries you installed.
To run the test suite, open a command-line window, change into the build directory, and then issue the command sketched below.
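A minimal sketch of the test command on Windows, assuming a Release build as configured above:

    ctest -C Release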
This section describes the main configuration options which can be set when building Nektar++. The default options should work on almost all systems, but additional features (such as parallelisation and specialist libraries) can be enabled if needed.
The first set of options specify the components of the Nektar++ toolkit to compile. Some options are dependent on others being enabled, so the available options may change.
Components of the Nektar++ package can be selected using the following options:
NEKTAR_BUILD_DEMOS
(Recommended)
Compiles the demonstration programs. These are primarily used by the regression testing suite to verify the Nektar++ library, but also provide an example of the basic usage of the framework.
NEKTAR_BUILD_DOC
Compiles the Doxygen documentation for the code. This will be generated within the build directory.
NEKTAR_BUILD_LIBRARY
(Required)
Compiles the Nektar++ framework libraries. This is required for all other options.
NEKTAR_BUILD_PYTHON
Installs the Python wrapper to Nektar++. After installing Nektar++, an additional command must be run to install the Python package, either for the current user or, with appropriate privileges, for all users; a sketch of the relevant make targets is given below.
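A sketch of these commands, run from the build directory after installation; the target names below follow recent Nektar++ versions and should be checked against the targets available in your build:

    make nekpy-install-user     # install the Python package for the current user
    make nekpy-install-system   # install for all users (requires appropriate privileges)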
NEKTAR_BUILD_SOLVERS
(Recommended)
Compiles the solvers distributed with the Nektar++ framework.
If enabling NEKTAR_BUILD_SOLVERS
, individual solvers can be enabled or disabled. See
Part III for the list of available solvers. You can disable solvers which are not required to
reduce compilation time. See the NEKTAR_SOLVER_X
option.
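As an illustration, solver selection can also be made non-interactively on the cmake command line; here <SOLVERNAME> is a placeholder for any of the NEKTAR_SOLVER_X option names visible in the CMake GUI:

    cmake -DNEKTAR_BUILD_SOLVERS=ON -DNEKTAR_SOLVER_<SOLVERNAME>=OFF ../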
NEKTAR_BUILD_TESTS
(Recommended)
Compiles the testing program used to verify the Nektar++ framework.
NEKTAR_BUILD_TIMINGS
Compiles programs used for timing Nektar++ operations.
NEKTAR_BUILD_UNIT_TESTS
Compiles tests for checking the core library functions.
NEKTAR_BUILD_UTILITIES
Compiles utilities for pre- and post-processing simulation data, including the mesh
conversion and generation tool NekMesh
and the FieldConvert
post-processing
utility.
NEKTAR_SOLVER_X
Enable compilation of the ’X’ solver.
NEKTAR_UTILITY_X
Enable compilation of the ’X’ utility.
A number of ThirdParty libraries are required by Nektar++. There are also optional libraries which provide additional functionality. These can be selected using the following options:
NEKTAR_USE_ARPACK
Build Nektar++ with support for ARPACK. This provides routines used for linear stability analyses. Alternative Arnoldi algorithms are also implemented directly in Nektar++.
NEKTAR_USE_CCM
Use the ccmio library provided with the Star-CCM package for reading ccm files. This option is required by NekMesh if you wish to convert a Star-CCM mesh into the Nektar++ format. It is possible to read a Tecplot plt file from Star-CCM, but this output currently needs to be converted to ASCII format using the Tecplot package.
NEKTAR_USE_CWIPI
Use the CWIPI library for enabling inter-process communication between two solvers. Solvers may also interface with third-party solvers using this package.
NEKTAR_USE_FFTW
Build Nektar++ with support for FFTW for performing Fast Fourier Transforms (FFTs). This is used only when using domains with homogeneous coordinate directions.
NEKTAR_USE_HDF5
Build Nektar++ with support for HDF5. This enables input/output in the HDF5
parallel file format, which can be very efficient for large numbers of processes.
HDF5 output can be enabled by using a command-line option or in the SOLVERINFO
section of the XML file. This option requires that Nektar++ be built with MPI
support with NEKTAR_USE_MPI
enabled and that HDF5 is compiled with MPI
support.
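As an assumed illustration (the exact option name should be checked against your version's command-line help), HDF5 output for a solver run might be requested with something like:

    IncNavierStokesSolver --io-format Hdf5 session.xml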
NEKTAR_USE_MESHGEN
Build the NekMesh utility with support for generating meshes from CAD geometries. This enables use of the OpenCascade Community Edition library, as well as Triangle and Tetgen.
NEKTAR_USE_METIS
Build Nektar++ with support for the METIS graph partitioning library. This provides an alternative mesh partitioning algorithm to SCOTCH for parallel simulations.
NEKTAR_USE_MPI
(Recommended)
Build Nektar++ with MPI parallelisation. This allows solvers to be run in serial or parallel.
NEKTAR_USE_PETSC
Build Nektar++ with support for the PETSc package for solving linear systems.
NEKTAR_USE_PYTHON3
(Requires NEKTAR_BUILD_PYTHON
)
Enables the generation of Python3 interfaces.
NEKTAR_USE_SCOTCH
(Recommended)
Build Nektar++ with support for the SCOTCH graph partitioning library. This provides both a mesh partitioning algorithm for parallel simulations and support for multi-level static condensation, so it is highly recommended and enabled by default. However, for systems that do not meet the SCOTCH build requirements (e.g. Windows), it can be disabled.
NEKTAR_USE_SYSTEM_BLAS_LAPACK
(Recommended)
On Linux systems, use the default BLAS and LAPACK library on the system. This may not be the implementation offering the highest performance for your architecture, but it is the most likely to work without problem.
NEKTAR_USE_VTK
Build Nektar++ with support for VTK libraries. This is only needed for specialist utilities and is not needed for general use.
The THIRDPARTY_BUILD_X
options select which third-party libraries are automatically built
during the Nektar++ build process. Below are the choices of X:
ARPACK
Library of iterative Arnoldi algorithms.
BLAS_LAPACK
Library of linear algebra routines.
BOOST
The Boost libraries are frequently provided by the operating system, so automatic compilation is not enabled by default. If you do not have Boost on your system, you can enable this to have Boost configured automatically.
CCMIO
I/O library for the Star-CCM+ format.
CWIPI
Library for inter-process exchange of data between different solvers.
FFTW
Fast-Fourier transform library.
GSMPI
(MPI-only) Parallel communication library.
HDF5
Hierarchical Data Format v5 library for structured data storage.
METIS
A graph partitioning library used for mesh partitioning when Nektar++ is run in parallel.
OCE
OpenCascade Community Edition 3D modelling library.
PETSC
A package for the parallel solution of linear algebra systems.
SCOTCH
A graph partitioning library used for mesh partitioning when Nektar++ is run in parallel, and reordering routines that are used in multi-level static condensation.
TETGEN
3D tetrahedral meshing library.
TINYXML
Library for reading and writing XML files.
TRIANGLE
2D triangular meshing library.
There are also a number of additional options to fine-tune the build:
NEKTAR_TEST_ALL
Enables an extra set of more substantial and long-running tests.
NEKTAR_TEST_USE_HOSTFILE
By default, MPI tests are run directly with the mpiexec
command together with
the number of cores. If your MPI installation requires a hostfile, enabling this
option adds the command line argument -hostfile hostfile
to the command
line arguments when tests are run with ctest
or the Tester
executable.
We have recently added explicit support for SIMD (Single Instruction Multiple Data) x86 instruction-set extensions (i.e. AVX2, AVX512). Selected operators (the matrix-free operators) utilise the SIMD types; if none of the extensions is enabled, these operators default to scalar types. The various extensions available are marked as advanced options (to see them in the CMake curses GUI you need to press t):
NEKTAR_ENABLE_SIMD_AVX2
Enables 256-bit-wide vector types and sets the appropriate compiler flags (gcc only).
NEKTAR_ENABLE_SIMD_AVX512
Enables 512-bit-wide vector types and sets the appropriate compiler flags (gcc only).
Note that if this is not the first time you are configuring CMake for this build directory, you need to delete the cached variable CMAKE_CXX_FLAGS in order for the appropriate flags to be set. Alternatively, you can set the flag manually to target the appropriate architecture.
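As an illustration, AVX2 support could be enabled when configuring a fresh build directory as follows (the option name is listed above; whether it is usable depends on your CPU and, currently, on a gcc toolchain):

    cmake -DNEKTAR_ENABLE_SIMD_AVX2=ON ../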