Nektar++
Nektar::MultiRegions::GlobalLinSysPETSc Class Reference (abstract)

A PETSc global linear system. More...

#include <GlobalLinSysPETSc.h>

Inheritance diagram for Nektar::MultiRegions::GlobalLinSysPETSc:

Classes

struct  ShellCtx
 Internal struct for MatShell and PCShell calls to store current context for callback. More...
 

Public Member Functions

 GlobalLinSysPETSc (const GlobalLinSysKey &pKey, const std::weak_ptr< ExpList > &pExp, const std::shared_ptr< AssemblyMap > &pLocToGloMap)
 Constructor for full direct matrix solve. More...
 
 ~GlobalLinSysPETSc () override
 Clean up PETSc objects. More...
 
- Public Member Functions inherited from Nektar::MultiRegions::GlobalLinSys
 GlobalLinSys (const GlobalLinSysKey &pKey, const std::weak_ptr< ExpList > &pExpList, const std::shared_ptr< AssemblyMap > &pLocToGloMap)
 Constructor for full direct matrix solve. More...
 
virtual ~GlobalLinSys ()
 
const GlobalLinSysKey & GetKey (void) const
 Returns the key associated with the system. More...
 
const std::weak_ptr< ExpList > & GetLocMat (void) const
 
void InitObject ()
 
void Initialise (const std::shared_ptr< AssemblyMap > &pLocToGloMap)
 
void Solve (const Array< OneD, const NekDouble > &in, Array< OneD, NekDouble > &out, const AssemblyMapSharedPtr &locToGloMap, const Array< OneD, const NekDouble > &dirForcing=NullNekDouble1DArray)
 Solve the linear system for given input and output vectors using a specified local to global map. More...
 
std::shared_ptr< GlobalLinSys > GetSharedThisPtr ()
 Returns a shared pointer to the current object. More...
 
int GetNumBlocks ()
 
DNekScalMatSharedPtr GetBlock (unsigned int n)
 
void DropBlock (unsigned int n)
 
DNekScalBlkMatSharedPtr GetStaticCondBlock (unsigned int n)
 
void DropStaticCondBlock (unsigned int n)
 
void SolveLinearSystem (const int pNumRows, const Array< OneD, const NekDouble > &pInput, Array< OneD, NekDouble > &pOutput, const AssemblyMapSharedPtr &locToGloMap, const int pNumDir=0)
 Solve the linear system for given input and output vectors. More...
 

Protected Member Functions

void SetUpScatter ()
 Set up PETSc local (equivalent to Nektar++ global) and global (equivalent to universal) scatter maps. More...
 
void SetUpMatVec (int nGlobal, int nDir)
 Construct PETSc matrix and vector handles. More...
 
void SetUpSolver (NekDouble tolerance)
 Set up KSP solver object. More...
 
void CalculateReordering (const Array< OneD, const int > &glo2uniMap, const Array< OneD, const int > &glo2unique, const AssemblyMapSharedPtr &pLocToGloMap)
 Calculate a reordering of universal IDs for PETSc. More...
 
void v_SolveLinearSystem (const int pNumRows, const Array< OneD, const NekDouble > &pInput, Array< OneD, NekDouble > &pOutput, const AssemblyMapSharedPtr &locToGloMap, const int pNumDir) override
 Solve linear system using PETSc. More...
 
virtual void v_DoMatrixMultiply (const Array< OneD, const NekDouble > &pInput, Array< OneD, NekDouble > &pOutput)=0
 
- Protected Member Functions inherited from Nektar::MultiRegions::GlobalLinSys
virtual void v_Solve (const Array< OneD, const NekDouble > &in, Array< OneD, NekDouble > &out, const AssemblyMapSharedPtr &locToGloMap, const Array< OneD, const NekDouble > &dirForcing=NullNekDouble1DArray)=0
 Solve a linear system based on mapping. More...
 
virtual void v_SolveLinearSystem (const int pNumRows, const Array< OneD, const NekDouble > &pInput, Array< OneD, NekDouble > &pOutput, const AssemblyMapSharedPtr &locToGloMap, const int pNumDir)=0
 Solve a basic matrix system. More...
 
virtual void v_InitObject ()
 
virtual void v_Initialise (const std::shared_ptr< AssemblyMap > &pLocToGloMap)
 
virtual int v_GetNumBlocks ()
 Get the number of blocks in this system. More...
 
virtual DNekScalMatSharedPtr v_GetBlock (unsigned int n)
 Retrieves the block matrix from the n-th expansion using the matrix key provided by the m_linSysKey. More...
 
virtual void v_DropBlock (unsigned int n)
 Releases the local block matrix of the n-th expansion from the NekManager, using the matrix key provided by the m_linSysKey. More...
 
virtual DNekScalBlkMatSharedPtr v_GetStaticCondBlock (unsigned int n)
 Retrieves the static condensation block matrices from the n-th expansion using the matrix key provided by the m_linSysKey. More...
 
virtual void v_DropStaticCondBlock (unsigned int n)
 Releases the static condensation block matrices of the n-th expansion from the NekManager, using the matrix key provided by the m_linSysKey. More...
 
PreconditionerSharedPtr CreatePrecon (AssemblyMapSharedPtr asmMap)
 Create a preconditioner object from the parameters defined in the supplied assembly map. More...
 

Protected Attributes

Mat m_matrix
 PETSc matrix object. More...
 
Vec m_x
 PETSc vector objects used for local storage. More...
 
Vec m_b
 
Vec m_locVec
 
KSP m_ksp
 KSP object that represents solver system. More...
 
PC m_pc
 PCShell for preconditioner. More...
 
PETScMatMult m_matMult
 Enumerator to select matrix multiplication type. More...
 
std::vector< int > m_reorderedMap
 Reordering that takes universal IDs to a unique row in the PETSc matrix. More...
 
VecScatter m_ctx
 PETSc scatter context that takes us between Nektar++ global ordering and PETSc vector ordering. More...
 
int m_nLocal
 Number of unique degrees of freedom on this process. More...
 
PreconditionerSharedPtr m_precon
 
- Protected Attributes inherited from Nektar::MultiRegions::GlobalLinSys
const GlobalLinSysKey m_linSysKey
 Key associated with this linear system. More...
 
const std::weak_ptr< ExpList > m_expList
 Local Matrix System. More...
 
const std::map< int, RobinBCInfoSharedPtr > m_robinBCInfo
 Robin boundary info. More...
 
bool m_verbose
 

Static Private Member Functions

static PetscErrorCode DoMatrixMultiply (Mat M, Vec in, Vec out)
 Perform matrix multiplication using Nektar++ routines. More...
 
static PetscErrorCode DoPreconditioner (PC pc, Vec in, Vec out)
 Apply preconditioning using Nektar++ routines. More...
 
static void DoNekppOperation (Vec &in, Vec &out, ShellCtx *ctx, bool precon)
 Perform either matrix multiplication or preconditioning using Nektar++ routines. More...
 
static PetscErrorCode DoDestroyMatCtx (Mat M)
 Destroy matrix shell context object. More...
 
static PetscErrorCode DoDestroyPCCtx (PC pc)
 Destroy preconditioner context object. More...
 

Static Private Attributes

static std::string matMult
 
static std::string matMultIds []
 

Detailed Description

A PETSc global linear system.

Solves a linear system using PETSc.

Solves a linear system using single- or multi-level static condensation.

Definition at line 56 of file GlobalLinSysPETSc.h.
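
PETSc-backed systems are typically selected through the session file rather than in code. A minimal sketch, assuming the standard Nektar++ SOLVERINFO syntax (the property values here are illustrative, not the only options):

    <CONDITIONS>
      <SOLVERINFO>
        <I PROPERTY="GlobalSysSoln" VALUE="PETScStaticCond" />
        <I PROPERTY="PETScMatMult" VALUE="Shell" />
      </SOLVERINFO>
    </CONDITIONS>

The PETScMatMult property, whose values are registered in matMultIds below, switches between an assembled sparse matrix (Sparse) and a matrix-free shell (Shell).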

Constructor & Destructor Documentation

◆ GlobalLinSysPETSc()

Nektar::MultiRegions::GlobalLinSysPETSc::GlobalLinSysPETSc ( const GlobalLinSysKey &  pKey,
const std::weak_ptr< ExpList > &  pExp,
const std::shared_ptr< AssemblyMap > &  pLocToGloMap 
)

Constructor for full direct matrix solve.

Definition at line 59 of file GlobalLinSysPETSc.cpp.

    : GlobalLinSys(pKey, pExp, pLocToGloMap)
{
    // Determine whether to use standard sparse matrix approach or
    // shell.
    m_matMult = pExp.lock()->GetSession()->GetSolverInfoAsEnum<PETScMatMult>(
        "PETScMatMult");

    // Check PETSc is initialized. For some reason, this is needed on
    // OS X as logging is not initialized properly in the call within
    // CommMpi.
    PetscBool isInitialized;
    PetscInitialized(&isInitialized);
    if (!isInitialized)
    {
#ifdef NEKTAR_USE_MPI
        std::string commType =
            m_expList.lock()->GetSession()->GetComm()->GetType();
        if (commType.find("MPI") != std::string::npos)
        {
            LibUtilities::CommMpiSharedPtr comm =
                std::static_pointer_cast<LibUtilities::CommMpi>(
                    m_expList.lock()->GetSession()->GetComm());
            PETSC_COMM_WORLD = comm->GetComm();
        }
#endif
        PetscInitializeNoArguments();
    }

    // Create matrix
    MatCreate(PETSC_COMM_WORLD, &m_matrix);
}

References Nektar::MultiRegions::GlobalLinSys::m_expList, m_matMult, and m_matrix.

◆ ~GlobalLinSysPETSc()

Nektar::MultiRegions::GlobalLinSysPETSc::~GlobalLinSysPETSc ( )
override

Clean up PETSc objects.

Note that if SessionReader::Finalize is called before the end of the program, PETSc may have been finalized already, at which point we cannot deallocate our objects. If that's the case we do nothing and let the kernel clear up after us.

Definition at line 102 of file GlobalLinSysPETSc.cpp.

{
    PetscBool isFinalized;
    PetscFinalized(&isFinalized);

    // Sometimes, PetscFinalized returns false when (in fact) CommMpi's
    // Finalise routine has been called. We therefore also need to check
    // whether MPI has been finalised. This might arise from the
    // additional call to PetscInitializeNoArguments in the constructor
    // above.
#ifdef NEKTAR_USE_MPI
    int mpiFinal = 0;
    MPI_Finalized(&mpiFinal);
    isFinalized = isFinalized || mpiFinal ? PETSC_TRUE : PETSC_FALSE;
#endif

    if (!isFinalized)
    {
        KSPDestroy(&m_ksp);
        PCDestroy(&m_pc);
        MatDestroy(&m_matrix);
        VecDestroy(&m_x);
        VecDestroy(&m_b);
        VecDestroy(&m_locVec);
    }
}

References m_b, m_ksp, m_locVec, m_matrix, m_pc, and m_x.

Member Function Documentation

◆ CalculateReordering()

void Nektar::MultiRegions::GlobalLinSysPETSc::CalculateReordering ( const Array< OneD, const int > &  glo2uniMap,
const Array< OneD, const int > &  glo2unique,
const AssemblyMapSharedPtr &  pLocToGloMap
)
protected

Calculate a reordering of universal IDs for PETSc.

PETSc requires a unique, contiguous index for each global and universal degree of freedom, which determines its position inside the matrix. Presently the gather-scatter library (Gs) does not guarantee this, so this routine constructs a new universal mapping.

Parameters
    glo2uniMap    Global to universal map
    glo2unique    Global to unique map
    pLocToGloMap  Assembly map for this system

Definition at line 226 of file GlobalLinSysPETSc.cpp.

{
    LibUtilities::CommSharedPtr vComm =
        m_expList.lock()->GetSession()->GetComm();

    const int nDirDofs = pLocToGloMap->GetNumGlobalDirBndCoeffs();
    const int nHomDofs = glo2uniMap.size() - nDirDofs;
    const int nProc = vComm->GetSize();
    const int rank = vComm->GetRank();

    int n, cnt;

    // Count number of unique degrees of freedom on each process.
    m_nLocal = Vmath::Vsum(nHomDofs, glo2unique + nDirDofs, 1);
    m_reorderedMap.resize(nHomDofs);

    // Reduce coefficient counts across all processors.
    Array<OneD, int> localCounts(nProc, 0), localOffset(nProc, 0);
    localCounts[rank] = nHomDofs;
    vComm->AllReduce(localCounts, LibUtilities::ReduceSum);

    for (n = 1; n < nProc; ++n)
    {
        localOffset[n] = localOffset[n - 1] + localCounts[n - 1];
    }

    int totHomDofs = Vmath::Vsum(nProc, localCounts, 1);
    vector<unsigned int> allUniIds(totHomDofs, 0);

    // Assemble list of universal IDs
    for (n = 0; n < nHomDofs; ++n)
    {
        int gid = n + nDirDofs;
        allUniIds[n + localOffset[rank]] = glo2uniMap[gid];
    }

    // Reduce this across processors so that each process has a list of
    // all universal IDs.
    vComm->AllReduce(allUniIds, LibUtilities::ReduceSum);
    std::sort(allUniIds.begin(), allUniIds.end());
    map<int, int> uniIdReorder;

    // Renumber starting from 0.
    for (cnt = n = 0; n < allUniIds.size(); ++n)
    {
        if (uniIdReorder.count(allUniIds[n]) > 0)
        {
            continue;
        }

        uniIdReorder[allUniIds[n]] = cnt++;
    }

    // Populate reordering map.
    for (n = 0; n < nHomDofs; ++n)
    {
        int gid = n + nDirDofs;
        int uniId = glo2uniMap[gid];
        ASSERTL0(uniIdReorder.count(uniId) > 0, "Error in ordering");
        m_reorderedMap[n] = uniIdReorder[uniId];
    }
}

References ASSERTL0, Nektar::MultiRegions::GlobalLinSys::m_expList, m_nLocal, m_reorderedMap, Nektar::LibUtilities::ReduceSum, and Vmath::Vsum().

Referenced by Nektar::MultiRegions::GlobalLinSysPETScFull::GlobalLinSysPETScFull(), and Nektar::MultiRegions::GlobalLinSysPETScStaticCond::v_AssembleSchurComplement().
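
To make the renumbering step concrete, the following standalone sketch (plain C++/STL, not Nektar++ code) reproduces the core idea: gather all universal IDs, sort them, and assign each distinct ID a contiguous index from zero.

    #include <algorithm>
    #include <iostream>
    #include <map>
    #include <vector>

    int main()
    {
        // Universal IDs as the gather-scatter library might supply them:
        // non-contiguous, with duplicates for shared degrees of freedom.
        std::vector<unsigned int> allUniIds = {12, 7, 12, 30, 7, 19};

        std::sort(allUniIds.begin(), allUniIds.end());

        // Renumber distinct IDs contiguously from 0, as the loop in
        // CalculateReordering does.
        std::map<int, int> uniIdReorder;
        int cnt = 0;
        for (unsigned int id : allUniIds)
        {
            if (uniIdReorder.count(id) == 0)
            {
                uniIdReorder[id] = cnt++;
            }
        }

        // Prints: 7 -> 0, 12 -> 1, 19 -> 2, 30 -> 3
        for (const auto &p : uniIdReorder)
        {
            std::cout << p.first << " -> " << p.second << "\n";
        }
        return 0;
    }

In the real routine the sorted list is the AllReduce'd union across all ranks, so every process derives the same numbering independently.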

◆ DoDestroyMatCtx()

PetscErrorCode Nektar::MultiRegions::GlobalLinSysPETSc::DoDestroyMatCtx ( Mat  M)
staticprivate

Destroy matrix shell context object.

Note the matrix shell and preconditioner share a common context so this might have already been deallocated below, in which case we do nothing.

Parameters
    M    Matrix shell object

Definition at line 474 of file GlobalLinSysPETSc.cpp.

{
    void *ptr;
    MatShellGetContext(M, &ptr);
    ShellCtx *ctx = (ShellCtx *)ptr;
    delete ctx;
    return 0;
}

Referenced by SetUpMatVec().

◆ DoDestroyPCCtx()

PetscErrorCode Nektar::MultiRegions::GlobalLinSysPETSc::DoDestroyPCCtx ( PC  pc)
staticprivate

Destroy preconditioner context object.

Note the matrix shell and preconditioner share a common context so this might have already been deallocated above, in which case we do nothing.

Parameters
    pc    Preconditioner object

Definition at line 492 of file GlobalLinSysPETSc.cpp.

{
    void *ptr;
    PCShellGetContext(pc, &ptr);
    ShellCtx *ctx = (ShellCtx *)ptr;
    delete ctx;
    return 0;
}

Referenced by SetUpMatVec().

◆ DoMatrixMultiply()

PetscErrorCode Nektar::MultiRegions::GlobalLinSysPETSc::DoMatrixMultiply ( Mat  M,
Vec  in,
Vec  out 
)
staticprivate

Perform matrix multiplication using Nektar++ routines.

This static function uses Nektar++ routines to calculate the matrix-vector product of M with in, storing the output in out.

Parameters
    M      Original MatShell matrix, which stores the ShellCtx object.
    in     Input vector.
    out    Output vector.

Definition at line 428 of file GlobalLinSysPETSc.cpp.

{
    // Grab our shell context from M.
    void *ptr;
    MatShellGetContext(M, &ptr);
    ShellCtx *ctx = (ShellCtx *)ptr;

    DoNekppOperation(in, out, ctx, false);

    // Must return 0, otherwise PETSc complains.
    return 0;
}

References DoNekppOperation().

Referenced by SetUpMatVec().

◆ DoNekppOperation()

void Nektar::MultiRegions::GlobalLinSysPETSc::DoNekppOperation ( Vec &  in,
Vec &  out,
ShellCtx ctx,
bool  precon 
)
staticprivate

Perform either matrix multiplication or preconditioning using Nektar++ routines.

This static function uses Nektar++ routines to apply either the matrix operator or the preconditioner to in, storing the output in out.

Todo:
There are a lot of scatters and copies here that might possibly be eliminated to make this more efficient.
Parameters
    in        Input vector.
    out       Output vector.
    ctx       ShellCtx object that points to our instance of GlobalLinSysPETSc.
    precon    If true, apply the preconditioner; if false, perform a matrix multiplication.

Definition at line 374 of file GlobalLinSysPETSc.cpp.

{
    const int nGlobal = ctx->nGlobal;
    const int nDir = ctx->nDir;
    const int nHomDofs = nGlobal - nDir;
    GlobalLinSysPETSc *linSys = ctx->linSys;

    // Scatter from PETSc ordering to our local ordering. It's actually
    // unclear whether this step might also do some communication in
    // parallel, which is probably not ideal.
    VecScatterBegin(linSys->m_ctx, in, linSys->m_locVec, INSERT_VALUES,
                    SCATTER_FORWARD);
    VecScatterEnd(linSys->m_ctx, in, linSys->m_locVec, INSERT_VALUES,
                  SCATTER_FORWARD);

    // Temporary storage to pass to Nektar++
    Array<OneD, NekDouble> tmpIn(nHomDofs), tmpOut(nHomDofs);

    // Get values from input vector and copy to tmpIn.
    PetscScalar *tmpLocIn;
    VecGetArray(linSys->m_locVec, &tmpLocIn);
    Vmath::Vcopy(nHomDofs, tmpLocIn, 1, &tmpIn[0], 1);
    VecRestoreArray(linSys->m_locVec, &tmpLocIn);

    // Do matrix multiply in Nektar++, store in tmpOut.
    if (precon)
    {
        linSys->m_precon->DoPreconditioner(tmpIn, tmpOut);
    }
    else
    {
        linSys->v_DoMatrixMultiply(tmpIn, tmpOut);
    }

    // Scatter back to PETSc ordering and put in out.
    VecSetValues(out, nHomDofs, &linSys->m_reorderedMap[0], &tmpOut[0],
                 INSERT_VALUES);
    VecAssemblyBegin(out);
    VecAssemblyEnd(out);
}

References Nektar::MultiRegions::GlobalLinSysPETSc::ShellCtx::linSys, m_ctx, m_locVec, m_precon, m_reorderedMap, Nektar::MultiRegions::GlobalLinSysPETSc::ShellCtx::nDir, Nektar::MultiRegions::GlobalLinSysPETSc::ShellCtx::nGlobal, v_DoMatrixMultiply(), and Vmath::Vcopy().

Referenced by DoMatrixMultiply(), and DoPreconditioner().
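
The ShellCtx object used above is what lets these static callbacks reach instance state: PETSc stores a void* context alongside the shell matrix and preconditioner and hands it back inside the callback. A sketch of the struct's shape, with field names taken from the usage above (the authoritative definition lives in GlobalLinSysPETSc.h):

    // Context handed to MatShell/PCShell: PETSc stores the void* and returns
    // it via MatShellGetContext/PCShellGetContext inside the static callbacks.
    struct ShellCtx
    {
        int nGlobal;               // global degrees of freedom (incl. Dirichlet)
        int nDir;                  // Dirichlet degrees of freedom
        GlobalLinSysPETSc *linSys; // back-pointer to the owning linear system
    };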

◆ DoPreconditioner()

PetscErrorCode Nektar::MultiRegions::GlobalLinSysPETSc::DoPreconditioner ( PC  pc,
Vec  in,
Vec  out 
)
staticprivate

Apply preconditioning using Nektar++ routines.

This static function uses Nektar++ routines to apply the preconditioner stored in GlobalLinSysPETSc::m_precon from the context of pc to the vector in, storing the output in out.

Parameters
    pc     Preconditioner object that stores the ShellCtx.
    in     Input vector.
    out    Output vector.

Definition at line 452 of file GlobalLinSysPETSc.cpp.

{
    // Grab our PCShell context from pc.
    void *ptr;
    PCShellGetContext(pc, &ptr);
    ShellCtx *ctx = (ShellCtx *)ptr;

    DoNekppOperation(in, out, ctx, true);

    // Must return 0, otherwise PETSc complains.
    return 0;
}

References DoNekppOperation().

Referenced by SetUpMatVec().

◆ SetUpMatVec()

void Nektar::MultiRegions::GlobalLinSysPETSc::SetUpMatVec ( int  nGlobal,
int  nDir 
)
protected

Construct PETSc matrix and vector handles.

Todo:
Preallocation should be done at this point, since presently matrix allocation takes a significant amount of time.
Parameters
    nGlobal    Number of global degrees of freedom in the system (on this processor).
    nDir       Number of Dirichlet degrees of freedom (on this processor).

Definition at line 303 of file GlobalLinSysPETSc.cpp.

{
    // CREATE VECTORS
    VecCreate(PETSC_COMM_WORLD, &m_x);
    VecSetSizes(m_x, m_nLocal, PETSC_DECIDE);
    VecSetFromOptions(m_x);
    VecDuplicate(m_x, &m_b);

    // CREATE MATRICES
    if (m_matMult == ePETScMatMultShell)
    {
        // Create ShellCtx context object which will store the matrix
        // size and a pointer to the linear system. We do this so that
        // we can call a member function to the matrix-vector and
        // preconditioning multiplication in a subclass.
        ShellCtx *ctx1 = new ShellCtx(), *ctx2 = new ShellCtx();
        ctx1->nGlobal = ctx2->nGlobal = nGlobal;
        ctx1->nDir = ctx2->nDir = nDir;
        ctx1->linSys = ctx2->linSys = this;

        // Set up MatShell object.
        MatCreateShell(PETSC_COMM_WORLD, m_nLocal, m_nLocal, PETSC_DETERMINE,
                       PETSC_DETERMINE, (void *)ctx1, &m_matrix);
        MatShellSetOperation(m_matrix, MATOP_MULT,
                             (void (*)(void))DoMatrixMultiply);
        MatShellSetOperation(m_matrix, MATOP_DESTROY,
                             (void (*)(void))DoDestroyMatCtx);

        // Create a PCShell to go alongside the MatShell.
        PCCreate(PETSC_COMM_WORLD, &m_pc);
#if PETSC_VERSION_GE(3, 5, 0)
        PCSetOperators(m_pc, m_matrix, m_matrix);
#else
        PCSetOperators(m_pc, m_matrix, m_matrix, SAME_NONZERO_PATTERN);
#endif
        PCSetType(m_pc, PCSHELL);
        PCShellSetApply(m_pc, DoPreconditioner);
        PCShellSetDestroy(m_pc, DoDestroyPCCtx);
        PCShellSetContext(m_pc, ctx2);
    }
    else
    {
        // Otherwise we create a PETSc matrix and use MatSetFromOptions
        // so that we can set various options on the command line.
        MatCreate(PETSC_COMM_WORLD, &m_matrix);
        MatSetType(m_matrix, MATAIJ);
        MatSetSizes(m_matrix, m_nLocal, m_nLocal, PETSC_DETERMINE,
                    PETSC_DETERMINE);
        MatSetFromOptions(m_matrix);
        MatSetUp(m_matrix);
    }
}

References DoDestroyMatCtx(), DoDestroyPCCtx(), DoMatrixMultiply(), DoPreconditioner(), Nektar::MultiRegions::ePETScMatMultShell, Nektar::MultiRegions::GlobalLinSysPETSc::ShellCtx::linSys, m_b, m_matMult, m_matrix, m_nLocal, m_pc, m_x, Nektar::MultiRegions::GlobalLinSysPETSc::ShellCtx::nDir, and Nektar::MultiRegions::GlobalLinSysPETSc::ShellCtx::nGlobal.

Referenced by Nektar::MultiRegions::GlobalLinSysPETScFull::GlobalLinSysPETScFull(), and Nektar::MultiRegions::GlobalLinSysPETScStaticCond::v_AssembleSchurComplement().
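
On the preallocation Todo above: for the non-shell (MATAIJ) branch one would typically preallocate before MatSetUp using the standard PETSc calls shown below. This is a sketch only; nnzPerRow is a hypothetical per-row nonzero bound that Nektar++ does not currently compute here.

    // Hypothetical preallocation for the MATAIJ branch. PETSc ignores a
    // preallocation call that does not match the actual matrix type, so
    // issuing both the sequential and the MPI variant is safe.
    PetscInt nnzPerRow = 50; // assumed upper bound on nonzeros per row
    MatSeqAIJSetPreallocation(m_matrix, nnzPerRow, NULL);
    MatMPIAIJSetPreallocation(m_matrix, nnzPerRow, NULL, nnzPerRow, NULL);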

◆ SetUpScatter()

void Nektar::MultiRegions::GlobalLinSysPETSc::SetUpScatter ( )
protected

Set up PETSc local (equivalent to Nektar++ global) and global (equivalent to universal) scatter maps.

These maps are used in GlobalLinSysPETSc::v_SolveLinearSystem to scatter the solution vector back to each process.

Definition at line 191 of file GlobalLinSysPETSc.cpp.

{
    const int nHomDofs = m_reorderedMap.size();

    // Create local and global numbering systems for vector
    IS isGlobal, isLocal;
    ISCreateGeneral(PETSC_COMM_SELF, nHomDofs, &m_reorderedMap[0],
                    PETSC_COPY_VALUES, &isGlobal);
    ISCreateStride(PETSC_COMM_SELF, nHomDofs, 0, 1, &isLocal);

    // Create local vector for output
    VecCreate(PETSC_COMM_SELF, &m_locVec);
    VecSetSizes(m_locVec, nHomDofs, PETSC_DECIDE);
    VecSetFromOptions(m_locVec);

    // Create scatter context
    VecScatterCreate(m_x, isGlobal, m_locVec, isLocal, &m_ctx);

    // Clean up
    ISDestroy(&isGlobal);
    ISDestroy(&isLocal);
}

References m_ctx, m_locVec, m_reorderedMap, and m_x.

Referenced by Nektar::MultiRegions::GlobalLinSysPETScFull::GlobalLinSysPETScFull(), and Nektar::MultiRegions::GlobalLinSysPETScStaticCond::v_AssembleSchurComplement().
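
For readers unfamiliar with the IS/VecScatter pattern, the following self-contained PETSc program (an illustration, not Nektar++ code) gathers a permuted subset of a distributed vector into a sequential one, exactly as SetUpScatter arranges for the solution vector:

    #include <petscvec.h>

    int main(int argc, char **argv)
    {
        PetscInitialize(&argc, &argv, NULL, NULL);

        Vec x, loc;
        IS isGlobal, isLocal;
        VecScatter ctx;
        PetscInt idx[2] = {3, 1}; // "reordered" global indices to fetch

        // Distributed source vector with entries x[i] = i.
        VecCreate(PETSC_COMM_WORLD, &x);
        VecSetSizes(x, PETSC_DECIDE, 8);
        VecSetFromOptions(x);
        for (PetscInt i = 0; i < 8; ++i)
        {
            PetscScalar v = (PetscScalar)i;
            VecSetValues(x, 1, &i, &v, INSERT_VALUES);
        }
        VecAssemblyBegin(x);
        VecAssemblyEnd(x);

        // Local destination vector and index sets, as in SetUpScatter.
        VecCreateSeq(PETSC_COMM_SELF, 2, &loc);
        ISCreateGeneral(PETSC_COMM_SELF, 2, idx, PETSC_COPY_VALUES, &isGlobal);
        ISCreateStride(PETSC_COMM_SELF, 2, 0, 1, &isLocal);

        VecScatterCreate(x, isGlobal, loc, isLocal, &ctx);
        VecScatterBegin(ctx, x, loc, INSERT_VALUES, SCATTER_FORWARD);
        VecScatterEnd(ctx, x, loc, INSERT_VALUES, SCATTER_FORWARD);
        VecView(loc, PETSC_VIEWER_STDOUT_SELF); // prints 3 then 1

        VecScatterDestroy(&ctx);
        ISDestroy(&isGlobal);
        ISDestroy(&isLocal);
        VecDestroy(&x);
        VecDestroy(&loc);
        PetscFinalize();
        return 0;
    }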

◆ SetUpSolver()

void Nektar::MultiRegions::GlobalLinSysPETSc::SetUpSolver ( NekDouble  tolerance)
protected

Set up KSP solver object.

This is a reasonably generic setup – most solver types can be changed using the .petscrc file.

Parameters
    tolerance    Residual tolerance to converge to.

Definition at line 509 of file GlobalLinSysPETSc.cpp.

{
    KSPCreate(PETSC_COMM_WORLD, &m_ksp);
    KSPSetTolerances(m_ksp, tolerance, PETSC_DEFAULT, PETSC_DEFAULT,
                     PETSC_DEFAULT);
    KSPSetFromOptions(m_ksp);
#if PETSC_VERSION_GE(3, 5, 0)
    KSPSetOperators(m_ksp, m_matrix, m_matrix);
#else
    KSPSetOperators(m_ksp, m_matrix, m_matrix, SAME_NONZERO_PATTERN);
#endif

    if (m_matMult == ePETScMatMultShell)
    {
        KSPSetPC(m_ksp, m_pc);
    }
}

References Nektar::MultiRegions::ePETScMatMultShell, m_ksp, m_matMult, m_matrix, and m_pc.

Referenced by Nektar::MultiRegions::GlobalLinSysPETScFull::GlobalLinSysPETScFull(), and Nektar::MultiRegions::GlobalLinSysPETScStaticCond::v_AssembleSchurComplement().
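
Because KSPSetFromOptions is called, the Krylov method and tolerances can be adjusted at run time without recompiling. A sketch of .petscrc entries using standard PETSc options (the values are illustrative):

    # Krylov method
    -ksp_type gmres
    # relative residual tolerance
    -ksp_rtol 1e-9
    # print the residual at each iteration
    -ksp_monitor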

◆ v_DoMatrixMultiply()

virtual void Nektar::MultiRegions::GlobalLinSysPETSc::v_DoMatrixMultiply ( const Array< OneD, const NekDouble > &  pInput,
Array< OneD, NekDouble > &  pOutput 
)
protectedpure virtual

◆ v_SolveLinearSystem()

void Nektar::MultiRegions::GlobalLinSysPETSc::v_SolveLinearSystem ( const int  pNumRows,
const Array< OneD, const NekDouble > &  pInput,
Array< OneD, NekDouble > &  pOutput,
const AssemblyMapSharedPtr &  locToGloMap,
const int  pNumDir 
)
overrideprotectedvirtual

Solve linear system using PETSc.

The general strategy behind a PETSc solve is to:

  • Copy values into the PETSc vector m_b.
  • Solve the system m_ksp and place the result into m_x.
  • Scatter the results back into m_locVec using the m_ctx scatter object.
  • Copy from m_locVec to the output array pOutput.

Implements Nektar::MultiRegions::GlobalLinSys.

Reimplemented in Nektar::MultiRegions::GlobalLinSysPETScStaticCond.

Definition at line 139 of file GlobalLinSysPETSc.cpp.

{
    const int nHomDofs = pNumRows - pNumDir;

    if (m_matMult == ePETScMatMultShell)
    {
        m_precon = CreatePrecon(locToGloMap);
        m_precon->BuildPreconditioner();
    }

    Array<OneD, NekDouble> Glo(pNumRows);
    locToGloMap->Assemble(pInput, Glo);

    // Populate RHS vector from input
    VecSetValues(m_b, nHomDofs, &m_reorderedMap[0], &Glo[pNumDir],
                 INSERT_VALUES);

    // Assemble RHS vector
    VecAssemblyBegin(m_b);
    VecAssemblyEnd(m_b);

    // Do system solve
    KSPSolve(m_ksp, m_b, m_x);

    KSPConvergedReason reason;
    KSPGetConvergedReason(m_ksp, &reason);
    ASSERTL0(reason > 0, "PETSc solver diverged, reason is: " +
                             std::string(KSPConvergedReasons[reason]));

    // Scatter results to local vector
    VecScatterBegin(m_ctx, m_x, m_locVec, INSERT_VALUES, SCATTER_FORWARD);
    VecScatterEnd(m_ctx, m_x, m_locVec, INSERT_VALUES, SCATTER_FORWARD);

    // Copy results into output vector
    PetscScalar *tmp;
    VecGetArray(m_locVec, &tmp);
    Vmath::Vcopy(nHomDofs, tmp, 1, &Glo[pNumDir], 1);
    Vmath::Zero(pNumDir, Glo, 1);
    locToGloMap->GlobalToLocal(Glo, pOutput);
    VecRestoreArray(m_locVec, &tmp);
}

References ASSERTL0, Nektar::MultiRegions::GlobalLinSys::CreatePrecon(), Nektar::MultiRegions::ePETScMatMultShell, m_b, m_ctx, m_ksp, m_locVec, m_matMult, m_precon, m_reorderedMap, m_x, Vmath::Vcopy(), and Vmath::Zero().

Member Data Documentation

◆ m_b

Vec Nektar::MultiRegions::GlobalLinSysPETSc::m_b
protected

◆ m_ctx

VecScatter Nektar::MultiRegions::GlobalLinSysPETSc::m_ctx
protected

PETSc scatter context that takes us between Nektar++ global ordering and PETSc vector ordering.

Definition at line 82 of file GlobalLinSysPETSc.h.

Referenced by DoNekppOperation(), SetUpScatter(), v_SolveLinearSystem(), and Nektar::MultiRegions::GlobalLinSysPETScStaticCond::v_SolveLinearSystem().

◆ m_ksp

KSP Nektar::MultiRegions::GlobalLinSysPETSc::m_ksp
protected

KSP object that represents solver system.

Definition at line 72 of file GlobalLinSysPETSc.h.

Referenced by SetUpSolver(), v_SolveLinearSystem(), Nektar::MultiRegions::GlobalLinSysPETScStaticCond::v_SolveLinearSystem(), and ~GlobalLinSysPETSc().

◆ m_locVec

Vec Nektar::MultiRegions::GlobalLinSysPETSc::m_locVec
protected

◆ m_matMult

PETScMatMult Nektar::MultiRegions::GlobalLinSysPETSc::m_matMult
protected

◆ m_matrix

Mat Nektar::MultiRegions::GlobalLinSysPETSc::m_matrix
protected

◆ m_nLocal

int Nektar::MultiRegions::GlobalLinSysPETSc::m_nLocal
protected

Number of unique degrees of freedom on this process.

Definition at line 84 of file GlobalLinSysPETSc.h.

Referenced by CalculateReordering(), and SetUpMatVec().

◆ m_pc

PC Nektar::MultiRegions::GlobalLinSysPETSc::m_pc
protected

PCShell for preconditioner.

Definition at line 74 of file GlobalLinSysPETSc.h.

Referenced by SetUpMatVec(), SetUpSolver(), and ~GlobalLinSysPETSc().

◆ m_precon

PreconditionerSharedPtr Nektar::MultiRegions::GlobalLinSysPETSc::m_precon
protected

◆ m_reorderedMap

std::vector<int> Nektar::MultiRegions::GlobalLinSysPETSc::m_reorderedMap
protected

◆ m_x

Vec Nektar::MultiRegions::GlobalLinSysPETSc::m_x
protected

◆ matMult

std::string Nektar::MultiRegions::GlobalLinSysPETSc::matMult
staticprivate
Initial value:
= LibUtilities::SessionReader::RegisterDefaultSolverInfo("PETScMatMult",
                                                         "Sparse")

Definition at line 128 of file GlobalLinSysPETSc.h.

◆ matMultIds

std::string Nektar::MultiRegions::GlobalLinSysPETSc::matMultIds[]
staticprivate
Initial value:
= {LibUtilities::SessionReader::RegisterEnumValue(
       "PETScMatMult", "Sparse", MultiRegions::ePETScMatMultSparse),
   LibUtilities::SessionReader::RegisterEnumValue(
       "PETScMatMult", "Shell", MultiRegions::ePETScMatMultShell)}

Definition at line 129 of file GlobalLinSysPETSc.h.