Many parallel applications do not fit completely into the data parallel model. Although these applications contain data parallelism, task parallelism is needed to represent their natural computation structure or to enhance performance. Combining the ease of programming of the data parallel model with the efficiency of the task parallel model allows parallel forms to be nested, giving nested parallelism. In this work, we examine the solutions provided for nested parallelism in two standard parallel programming platforms, HPF and MPI. Both their expressive capacity and their efficiency are compared on a Cray T3E, a distributed memory machine. Finally, the use of the methodology proposed for MPI is further discussed on...
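As an illustration of the MPI side of this comparison, the sketch below shows one common way to express nested parallelism with MPI: MPI_Comm_split partitions MPI_COMM_WORLD into task-parallel groups, and each group then runs a data-parallel collective over its own subcommunicator. This is a minimal sketch of the general group/communicator mechanism, not necessarily the exact methodology proposed in this work; the number of tasks (NUM_TASKS) and the reduction performed are hypothetical choices made for the example.

/* Minimal sketch: nested parallelism in MPI via communicator splitting.
 * Task parallelism: processes are divided into NUM_TASKS groups.
 * Data parallelism: each group runs a collective reduction internally.
 * NUM_TASKS and the summed values are hypothetical illustration choices. */
#include <mpi.h>
#include <stdio.h>

#define NUM_TASKS 2  /* hypothetical number of task-parallel branches */

int main(int argc, char **argv) {
    MPI_Init(&argc, &argv);

    int world_rank, world_size;
    MPI_Comm_rank(MPI_COMM_WORLD, &world_rank);
    MPI_Comm_size(MPI_COMM_WORLD, &world_size);

    /* Task parallelism: assign each process to one of NUM_TASKS groups. */
    int task_id = world_rank % NUM_TASKS;
    MPI_Comm task_comm;
    MPI_Comm_split(MPI_COMM_WORLD, task_id, world_rank, &task_comm);

    int task_rank, task_size;
    MPI_Comm_rank(task_comm, &task_rank);
    MPI_Comm_size(task_comm, &task_size);

    /* Data parallelism inside the task: each member contributes a partial
     * value, combined with a collective over the subcommunicator only. */
    double partial = (double)(task_rank + 1);
    double total = 0.0;
    MPI_Reduce(&partial, &total, 1, MPI_DOUBLE, MPI_SUM, 0, task_comm);

    if (task_rank == 0)
        printf("task %d (%d procs): sum = %g\n", task_id, task_size, total);

    MPI_Comm_free(&task_comm);
    MPI_Finalize();
    return 0;
}

Because collectives in MPI are scoped to a communicator, the reduction above involves only the processes of one task group, which is exactly what lets a data-parallel computation be nested inside a task-parallel decomposition.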
Institute for Computing Systems Architecture. The programming of parallel computers is recognised as b...
Over the last few decades, Message Passing Interface (MPI) has become the parallel-communication sta...
Many parallel applications from scientific computing use MPI collective communication operations to ...
Data parallelism is one of the more successful efforts to introduce explicit parallelism to high le...
High Performance Fortran (HPF) has emerged as a standard dialect of Fortran for data-parallel comput...
The data-parallel language High Performance Fortran (HPF) does not allow efficient expression of mix...
High Performance Fortran (HPF) does not allow efficient expression of mixed task/data-parallel computat...
It has become common knowledge that parallel programming is needed for scientific applications, part...
Data-parallel languages such as High Performance Fortran (HPF) present a simple execution model in w...
Pure data-parallel languages such as High Performance Fortran version 1 (HPF) do not allow efficient...
Distributed Memory Multicomputers (DMMs) such as the IBM SP-2, the Intel Paragon and the Thinking Ma...
Among parallel programming tools, task parallelism and data parallelism are the most common programm...
The majority of current HPC applications are composed of complex and irregular data structures that ...