Discussions about computers often touch on what constitutes a reasonable increase in random access memory (RAM), since it is well known that adding this type of memory affects a machine's speed. Disputes often arise as to whether a half-gigabyte RAM upgrade is large enough to speed a computer up significantly, given the complexity of present-day software applications. In this article, we test statistically whether such an increase yields a significant speed-up, using analysis of covariance as a suitable statistical tool.
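An analysis of covariance of the kind described above can be sketched as follows. This is a minimal illustration on synthetic data, not the study's actual procedure: the effect size, sample size, and noise level are assumptions chosen only to make the example run, and the model regresses post-upgrade runtime on a treatment indicator while adjusting for baseline runtime as the covariate.

```python
# Hedged sketch of an ANCOVA-style test: does a RAM upgrade (treatment)
# significantly change runtime, after adjusting for baseline runtime?
# All data below are synthetic; numbers are illustrative assumptions.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 40
baseline = rng.normal(100.0, 10.0, n)        # covariate: pre-upgrade runtime (s)
upgraded = np.repeat([0.0, 1.0], n // 2)     # 0 = original RAM, 1 = +0.5 GB (assumed)
runtime = baseline - 8.0 * upgraded + rng.normal(0.0, 3.0, n)  # assumed 8 s speed-up

# Design matrix: intercept, covariate, treatment indicator
X = np.column_stack([np.ones(n), baseline, upgraded])
beta, res_ss, _, _ = np.linalg.lstsq(X, runtime, rcond=None)

# t-test on the treatment coefficient (the adjusted group difference)
df_resid = n - X.shape[1]
sigma2 = res_ss[0] / df_resid
cov = sigma2 * np.linalg.inv(X.T @ X)
t_stat = beta[2] / np.sqrt(cov[2, 2])
p_value = 2 * stats.t.sf(abs(t_stat), df_resid)
print(f"adjusted upgrade effect: {beta[2]:.2f} s, p = {p_value:.3g}")
```

A small p-value on the treatment coefficient would indicate that the upgrade changes runtime significantly once baseline speed is accounted for; a dedicated routine such as `statsmodels`' OLS formula interface would give the same result with less boilerplate.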
Many situations call for an estimation of the execution time of applications, e.g., during design or...
12 pages. The community of program optimisation and analysis, code performance evaluation, parallelisa...
If cluster C1 consists of computers with a faster mean speed than the computers in cluster C2, does ...
Measuring performance and quantifying a performance change are core evaluation techniques in program...
The report above discusses the use of statistical methods to predict the efficiency of the parallel a...
Article first published online: 15 OCT 2012. In the area of high performance com...
Most software is constructed on the assumption that the programs and data are stored in random access...
In the area of high performance computing and embedded systems, numerous code optimisation methods e...
Performance testing is a means of evaluating the speed of software projects. In an ideal state a proje...
Characterizing static random access memories (SRAMs) is difficult but necessary to understand its pro...
Although parallel computers have existed for many years, recently there has been a surge of academic...
We construct a two-sample test for comparison of long memory parameters based on ratios of t...
Software that implements the speedup-test protocol is included with the document. Numerou...
The problem of learning parallel computer performance is investigated in the context of multicore pr...