Discontinuous Galerkin Library
#include "dg/algorithm.h"
dg::MPI_Vector< container > Struct Template Reference

MPI Vector class More...

Public Types

typedef container container_type
 

Public Member Functions

 MPI_Vector ()
 No data is allocated; communicators are MPI_COMM_NULL. More...
 
 MPI_Vector (const container &data, MPI_Comm comm)
 Construct a vector. More...
 
template<class OtherContainer >
 MPI_Vector (const MPI_Vector< OtherContainer > &src)
 Converting constructor. More...
 
const container & data () const
 Get underlying data. More...
 
container & data ()
 Set underlying data. More...
 
MPI_Comm communicator () const
 Get the communicator to which this vector belongs. More...
 
MPI_Comm communicator_mod () const
 Returns a communicator of fixed size 128. More...
 
MPI_Comm communicator_mod_reduce () const
 Returns a communicator consisting of all processes with rank 0 in communicator_mod() More...
 
void set_communicator (MPI_Comm comm, MPI_Comm comm_mod, MPI_Comm comm_mod_reduce)
 Set the communicators with dg::exblas::mpi_reduce_communicator. More...
 
unsigned size () const
 Return the size of the data object. More...
 
void swap (MPI_Vector &src)
 Swap data and communicator. More...
 

Detailed Description

template<class container>
struct dg::MPI_Vector< container >

MPI Vector class

This class is a simple wrapper around a container object and an MPI_Comm. The blas1 and blas2 functionality is available if and only if it is available for the container type. MPI is used for communication (e.g. of boundary points in matrix-vector multiplications), while the existing blas functions perform the local computations. (At the blas level 1, communication is needed only for scalar products.)

Template Parameters
container: local container type. Must have a size() and a swap() member function and a specialization of the TensorTraits class.
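
The following minimal sketch shows the intended use; it assumes the dg::HVec typedef and the dg::blas1 functions of the dg library and is meant as an illustration rather than part of the generated reference.

#include <iostream>
#include <mpi.h>
#include "dg/algorithm.h"

int main( int argc, char* argv[])
{
    MPI_Init( &argc, &argv);
    dg::HVec local( 100, 1.0);                          // local chunk on every process
    dg::MPI_Vector<dg::HVec> x( local, MPI_COMM_WORLD); // collective call
    dg::MPI_Vector<dg::HVec> y( x);
    dg::blas1::axpby( 2., x, 1., y);                    // purely local computation
    double norm2 = dg::blas1::dot( y, y);               // scalar product: needs an MPI reduction
    std::cout << norm2 << "\n";                         // same value on every process
    MPI_Finalize();
    return 0;
}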

Member Typedef Documentation

◆ container_type

template<class container >
typedef container dg::MPI_Vector< container >::container_type

typedef to access the underlying container

Constructor & Destructor Documentation

◆ MPI_Vector() [1/3]

template<class container >
dg::MPI_Vector< container >::MPI_Vector ( )
inline

No data is allocated; communicators are MPI_COMM_NULL.

◆ MPI_Vector() [2/3]

template<class container >
dg::MPI_Vector< container >::MPI_Vector ( const container &  data,
MPI_Comm  comm 
)
inline

Construct a vector.

calls dg::exblas::mpi_reduce_communicator() (collective call)

Parameters
data: internal data copy
comm: MPI communicator (may not be MPI_COMM_NULL)
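
For example (a sketch, assuming the dg::HVec host vector typedef):

dg::HVec local( 100, 1.0);                          // the local chunk held by this process
dg::MPI_Vector<dg::HVec> v( local, MPI_COMM_WORLD); // collective: also generates the internal sub-communicators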

◆ MPI_Vector() [3/3]

template<class container >
template<class OtherContainer >
dg::MPI_Vector< container >::MPI_Vector ( const MPI_Vector< OtherContainer > &  src)
inline

Converting constructor.

uses conversion between compatible containers

Template Parameters
OtherContainer: another container class (container must be copy constructible from OtherContainer)
Parameters
src: the source
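
A sketch of the intended use, assuming the usual dg::HVec (host) and dg::DVec (device) typedefs of the dg library:

dg::MPI_Vector<dg::HVec> host( dg::HVec( 100, 1.0), MPI_COMM_WORLD);
dg::MPI_Vector<dg::DVec> device( host); // copies the data to the device, keeps the communicator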

Member Function Documentation

◆ communicator()

template<class container >
MPI_Comm dg::MPI_Vector< container >::communicator ( ) const
inline

Get the communicator to which this vector belongs.

Returns
read access to MPI communicator

◆ communicator_mod()

template<class container >
MPI_Comm dg::MPI_Vector< container >::communicator_mod ( ) const
inline

Returns a communicator of fixed size 128.

◆ communicator_mod_reduce()

template<class container >
MPI_Comm dg::MPI_Vector< container >::communicator_mod_reduce ( ) const
inline

Returns a communicator consisting of all processes with rank 0 in communicator_mod()

Returns
returns MPI_COMM_NULL to processes not part of that group
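
The sketch below only illustrates the intent behind the two auxiliary communicators, a two-stage reduction (first within each group of communicator_mod(), then among the group leaders via communicator_mod_reduce()); in the library itself this mechanism is used by dg::exblas. Here v is an MPI_Vector constructed as above.

double local = 1., partial = 0., total = 0.;
MPI_Reduce( &local, &partial, 1, MPI_DOUBLE, MPI_SUM, 0, v.communicator_mod());
if( v.communicator_mod_reduce() != MPI_COMM_NULL)  // only the group leaders take part
    MPI_Reduce( &partial, &total, 1, MPI_DOUBLE, MPI_SUM, 0, v.communicator_mod_reduce());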

◆ data() [1/2]

template<class container >
container & dg::MPI_Vector< container >::data ( )
inline

Set underlying data.

Returns
write access to data

◆ data() [2/2]

template<class container >
const container & dg::MPI_Vector< container >::data ( ) const
inline

Get underlying data.

Returns
read access to data
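
A short sketch of the accessors, with v an MPI_Vector constructed as above:

v.data()[0] = 42.;                 // write access to the local chunk
double first = v.data()[0];        // read access
MPI_Comm comm = v.communicator();  // the communicator the vector was constructed with
unsigned n = v.size();             // equals v.data().size()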

◆ set_communicator()

template<class container >
void dg::MPI_Vector< container >::set_communicator ( MPI_Comm  comm,
MPI_Comm  comm_mod,
MPI_Comm  comm_mod_reduce 
)
inline

Set the communicators with dg::exblas::mpi_reduce_communicator.

The reason you cannot simply set comm alone but need all three parameters is that generating the auxiliary communicators involves MPI communication, which you may want to avoid if the communicator is set many times. Generate them once and pass them in:

MPI_Comm comm = MPI_COMM_WORLD, comm_mod, comm_mod_reduce;
dg::exblas::mpi_reduce_communicator( comm, &comm_mod, &comm_mod_reduce); // collective call, done once
mpi_vector.set_communicator( comm, comm_mod, comm_mod_reduce);           // no further communication

◆ size()

template<class container >
unsigned dg::MPI_Vector< container >::size ( ) const
inline

Return the size of the data object.

Returns
data.size()

◆ swap()

template<class container >
void dg::MPI_Vector< container >::swap ( MPI_Vector< container > &  src)
inline

Swap data and communicator.

Parameters
src: communicator and data are swapped

The documentation for this struct was generated from the following file: