In this paper we describe a compiler framework which can identify communication patterns for MPI-based parallel applications. This has the potential of providing significant performance benefits when connections can be established in the network prior to the actual communication operation. Our compiler uses a flexible and powerful communication pattern representation scheme that can capture the properties of communication patterns and allows manipulation of these patterns. In this way, communication phases can be detected and logically separated within the application. Additionally, we extend the classification of static and dynamic communication patterns and operations to include persistent communications. Persistent communications ap...
In this paper we present simulation algorithms that characterize the main sources of communication g...
Massively parallel computers (MPC) are characterized by the distribution of memory among an ensemble...
Abstract. Large-scale parallel data analysis, where global information from a variety of problem dom...
Technical advances have brought circuit switching back to the stage of interconnection network desig...
In this paper, we investigate the communication characteristics of the Message Passing Interface (MP...
Abstract. This paper deals with a technique that can support the re-engineering of parallel programs b...
This work was also published as a Rice University thesis/dissertation: http://hdl.handle.net/1911/19...
Abstract—Message passing is a very popular style of parallel programming, used in a wide variety of ...
MPI is widely used for programming large HPC clusters. MPI also includes persistent operations, whic...
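The persistent operations mentioned in several of the abstracts above follow a set-up-once, start-many-times pattern. As a rough illustration only (none of the cited papers' own code is reproduced here), a minimal C sketch of MPI persistent point-to-point communication, assuming an MPI implementation such as MPICH or Open MPI and compilation with `mpicc`, might look like:

```c
/* Illustrative sketch of MPI persistent communication (hypothetical example,
   not taken from any of the cited papers). Build with: mpicc persistent.c
   Run with at least two ranks, e.g.: mpirun -np 2 ./a.out */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv) {
    MPI_Init(&argc, &argv);
    int rank, size;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    if (size >= 2 && rank < 2) {
        double buf[4];
        MPI_Request req;

        /* Bind the communication arguments to a request once;
           no data is transferred at this point. */
        if (rank == 0)
            MPI_Send_init(buf, 4, MPI_DOUBLE, 1, 0, MPI_COMM_WORLD, &req);
        else
            MPI_Recv_init(buf, 4, MPI_DOUBLE, 0, 0, MPI_COMM_WORLD, &req);

        /* Reuse the same request across iterations -- the repetitive
           pattern that compiler analyses aim to detect and exploit. */
        for (int it = 0; it < 10; ++it) {
            if (rank == 0)
                for (int i = 0; i < 4; ++i) buf[i] = it + i;
            MPI_Start(&req);
            MPI_Wait(&req, MPI_STATUS_IGNORE);
        }
        MPI_Request_free(&req);
        if (rank == 1) printf("last value received: %g\n", buf[3]);
    }
    MPI_Finalize();
    return 0;
}
```

Because the message envelope (buffer, count, datatype, peer, tag) is fixed across iterations, the runtime or network could in principle pre-establish the connection before any `MPI_Start`, which is the optimization opportunity these papers target.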
Data-parallel languages allow programmers to use the familiar machine-independent programming style ...
Communication coalescing is a static optimization that can reduce both communication frequency and r...
Many scientific applications are iterative and specify repetitive communication patterns. This paper...
On most massively parallel architectures, the actual communication performance remains much less tha...