The objective of this work is to compare the performance of three common environments for supporting parallel computing, by focusing on the impact of communications on this performance. In particular, the studied environments include a shared memory multiprocessor, a distributed memory multiprocessor, and an ATM-based network of workstations. The measurements consist of the execution of real parallel codes, obtained from well-known benchmark suites. Our results show that the characteristics of communications in each parallel code significantly determine its best-suited parallel computing environment. Competitive performance can be achieved on the ATM-based environment provided that some bottlenecks in the network and the hosts are reduced t...
There are several benchmark programs available to measure the performance of MPI on parallel comput...
Parallel computing is essential for solving very large scientific and engineering problems. An effec...
In terms of facilities for communications and synchronization in parallel programs, the descriptive ...
In this paper we investigate some of the important factors which affect the message-passing performa...
A case study was conducted to examine the performance and portability of parallel applications, with...
In this paper, we describe experiments comparing the communication times for a number of different...
Cluster-based computing, which exploits the aggregate power of networked collections of computers, h...
The original publication can be found at www.springerlink.com. This paper gives an overview of two rel...
In this paper, we examine and characterize effects of communication interactions of parallel and se...
We compare two paradigms for parallel programming on networks of workstations: message passing and d...
In distributed memory multicomputers, synchronization and data sharing are achieved by explicit mess...
A benchmark test using the Message Passing Interface (MPI, an emerging standard for writing message ...
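Several of the abstracts above measure communication performance with a ping-pong pattern: one process sends a message, the partner echoes it back, and half the averaged round-trip time estimates one-way latency. As a minimal, runnable sketch of that measurement pattern (not code from any of the cited papers), the example below substitutes a Unix socket pair and a thread for the two MPI ranks, so only the timing methodology is illustrated:

```python
# Illustrative ping-pong latency microbenchmark.
# Real MPI benchmarks time matched MPI_Send/MPI_Recv pairs between two ranks;
# here a socketpair stands in for the transport so the pattern runs anywhere.
import socket
import threading
import time

def echo_partner(conn, msg_size, iters):
    # Play the role of the remote rank: receive each message and echo it back.
    for _ in range(iters):
        buf = b""
        while len(buf) < msg_size:
            buf += conn.recv(msg_size - len(buf))
        conn.sendall(buf)

def pingpong_latency(msg_size=1024, iters=200):
    a, b = socket.socketpair()
    partner = threading.Thread(target=echo_partner, args=(b, msg_size, iters))
    partner.start()

    payload = b"x" * msg_size
    start = time.perf_counter()
    for _ in range(iters):
        a.sendall(payload)
        buf = b""
        while len(buf) < msg_size:
            buf += a.recv(msg_size - len(buf))
    elapsed = time.perf_counter() - start

    partner.join()
    a.close()
    b.close()
    # One-way latency estimate: half the mean round-trip time.
    return elapsed / (2 * iters)

if __name__ == "__main__":
    lat = pingpong_latency()
    print(f"estimated one-way latency: {lat * 1e6:.1f} us")
```

Sweeping `msg_size` in such a loop yields the latency/bandwidth curves these benchmark suites report; the absolute numbers here reflect local IPC, not a real interconnect.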
Clusters of workstations are a popular platform for high-performance computing. For many parallel ap...
The advent of high-speed local and wide area networks has made geographically distributed NOWs an ap...