To run the incompressible solver in serial:
IncNavierStokesSolver mesh.xml session.xml -v
where IncNavierStokesSolver is the name of the executable, mesh.xml is the name of the file that includes all the high-order mesh information, and session.xml is the name of the file that describes the polynomial expansions for the pressure and velocity fields, the boundary conditions and the numerical configuration of the problem. It is recommended to use the -v, or --verbose, flag, which causes additional information to be printed during execution. It is possible to have the information from mesh.xml and session.xml in a single file, for example meshAndSession.xml. This compact format for the problem set-up is useful for small problems, and the previous command then becomes:
IncNavierStokesSolver meshAndSession.xml -v
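For orientation, a combined mesh-and-session file follows the usual Nektar++ XML layout. The sketch below is illustrative only: the composite ID, number of modes and parameter values are placeholders, and the geometry entities are elided.

```xml
<?xml version="1.0" encoding="utf-8"?>
<NEKTAR>
  <GEOMETRY DIM="2" SPACE="2">
    <!-- vertices, edges, elements, composites and domain go here -->
  </GEOMETRY>
  <EXPANSIONS>
    <!-- placeholder composite and polynomial order -->
    <E COMPOSITE="C[0]" NUMMODES="5" TYPE="MODIFIED" FIELDS="u,v,p" />
  </EXPANSIONS>
  <CONDITIONS>
    <SOLVERINFO>
      <I PROPERTY="SolverType" VALUE="VelocityCorrectionScheme" />
      <I PROPERTY="EQTYPE"     VALUE="UnsteadyNavierStokes" />
    </SOLVERINFO>
    <PARAMETERS>
      <!-- placeholder values -->
      <P> TimeStep = 0.001 </P>
      <P> NumSteps = 1000  </P>
      <P> Kinvis   = 0.025 </P>
    </PARAMETERS>
    <!-- variables, boundary regions, boundary conditions, functions -->
  </CONDITIONS>
</NEKTAR>
```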
In Nektar++, it is possible to restart a simulation from an existing pressure and velocity field. In this scenario, the simulation will resume from the latest available time-step in the restart field and the numbering of the output checkpoint files will continue from the latest index. To avoid this and set the start time of the simulation and/or the index numbering of the checkpoint files one can use:
IncNavierStokesSolver meshAndSession.xml -v --set-start-time 0 --set-start-chknumber 0
where the flag --set-start-time receives a float value that sets the starting time of the simulation and the flag --set-start-chknumber receives an integer value that specifies the starting index for the output checkpoint files, regardless of the settings introduced by the initialization field.
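The restart field itself is supplied through the InitialConditions function in the session file. A minimal sketch, assuming the variables u, v and p and a hypothetical restart file name restart.fld:

```xml
<!-- Inside the CONDITIONS block of the session file:
     initialise all fields from an existing solution file. -->
<FUNCTION NAME="InitialConditions">
  <F VAR="u,v,p" FILE="restart.fld" />
</FUNCTION>
```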
To run the simulation in the background while logging its output to a file:
IncNavierStokesSolver meshAndSession.xml -v > logAndErr.IncNS &
Here, > logAndErr.IncNS redirects all the information printed by the executable into the logAndErr.IncNS file (appending 2>&1 would also capture error messages in the same file), and the & symbol launches the process in the background. This way, the current terminal can be used for other tasks and the job will continue to run even if the user loses the connection from which the process was launched. To monitor the progress of the simulation and how far it is from completion while the process runs in the background:
tail -f logAndErr.IncNS
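To make the background job fully robust to the terminal closing, the launch can also be wrapped in nohup, a standard POSIX utility. This is a sketch using the same file names as above; the PID file name is an arbitrary choice.

```shell
# Launch the solver immune to hangups; 2>&1 sends error messages
# into the same log file as the regular output.
nohup IncNavierStokesSolver meshAndSession.xml -v > logAndErr.IncNS 2>&1 &

# Record the process ID so the job can be checked or stopped later.
echo $! > IncNS.pid

# Follow the log as it grows; Ctrl+C stops tail, not the solver.
tail -f logAndErr.IncNS
```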
If Nektar++ is compiled with HDF5 support, then the output fields can be exported in compressed HDF5 format using the following command:
IncNavierStokesSolver mesh.xml session.xml -v -i Hdf5
where the flag -i, or --io-format, with the argument Hdf5 makes the IncNavierStokesSolver executable write the velocity and pressure fields in HDF5 format. For large-scale problems where the number of elements is above one million, it is recommended to convert the mesh to compressed HDF5 format and use the flag --use-hdf5-node-comm to load the mesh in parallel and avoid running out of memory during serial partitioning, as shown below:
IncNavierStokesSolver mesh.xml session.xml -v -i Hdf5 --use-hdf5-node-comm
If Nektar++ is configured to run in parallel as described in Section 17.1, then the following command is used:
mpirun -np 10 IncNavierStokesSolver mesh.xml session.xml -v
where mpirun can be replaced by mpiexec, depending on the MPI implementation available on the system, and 10 is the number of requested processes. A detailed summary of all the available flags is given in Section 16. For a complete list of the options accepted by the IncNavierStokesSolver executable, run:
IncNavierStokesSolver -h