This paper examines whether lossy compression can be used effectively in physics simulations as a possible strategy to combat the expected data-movement bottleneck in future high performance computing architectures. We show that, for the codes and simulations we tested, compression levels of 3–5X can be applied without causing significant changes to important physical quantities. Rather than applying signal processing error metrics, we utilize physics-based metrics appropriate for each code to assess the impact of compression. We evaluate three different simulation codes: a Lagrangian shock-hydrodynamics code, an Eulerian higher-order hydrodynamics turbulence modeling code, and an Eulerian coupled laser-plasma interaction code. We compress ...
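The abstract's core idea can be illustrated with a minimal sketch (not the paper's actual compressor or metric): uniform quantization stands in for a lossy compressor, and the reconstruction is judged by a physics-style metric (conservation of a summed quantity) rather than a signal-processing metric such as PSNR. All names and parameters here are hypothetical.

```python
# Illustrative sketch only: uniform quantization as a stand-in for a lossy
# compressor, checked with a physics-based metric (conservation of total
# "mass") instead of a signal-processing error metric.
import numpy as np

def quantize(field, bits):
    """Lossily compress by rounding each value to 2**bits levels over its range."""
    lo, hi = field.min(), field.max()
    levels = 2 ** bits - 1
    codes = np.round((field - lo) / (hi - lo) * levels)  # integer codes
    return codes * (hi - lo) / levels + lo               # reconstructed field

rng = np.random.default_rng(0)
density = rng.uniform(0.5, 2.0, size=100_000)  # hypothetical state variable

# 10-bit codes vs. 32-bit floats gives roughly the 3X compression the
# abstract cites as tolerable.
decoded = quantize(density, bits=10)

# Physics-based check: relative change in total mass (sum of density).
mass_error = abs(decoded.sum() - density.sum()) / density.sum()
print(f"relative mass error: {mass_error:.2e}")
```

The point of the sketch is the evaluation, not the compressor: a pointwise error metric would report the quantization noise directly, while the conservation-style metric shows that errors largely cancel in the aggregate quantity a physicist cares about.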
For the last five years, Accelogic has pioneered and perfected a radically new theory of numerical computing...
The volume of data and the velocity with which it is being generated by computational experiments o...
In recent years, the gap between bandwidth and computational throughput has become a major challenge...
Computational fluid dynamics simulations involve large state data, leading to performance degradation...
Because of the ever-increasing data being produced by today's high performance computing (HPC) sc...
An effective data compressor is becoming increasingly critical to today's scientific research, an...
Architectural and technological trends of systems used for scientific computing call for a significa...
Extremely large-scale scientific simulation applications have been very important in many scientific...
Today's N-body simulations are producing extremely large amounts of data. The Hardware/Hybrid Acc...
Cosmological simulations may produce extremely large amounts of data, such that its successful run de...
Applications in scientific computing operate with high-volume numerical data and the occupied space ...
Today's scientific simulations require a significant reduction of the data size because of extrem...
The increasing number of cores in parallel computer systems is allowing scientific simulations...
Today's scientific simulations are producing vast volumes of data that cannot be stored and transfer...
Molecular dynamics (MD) has been widely used in today's scientific research across multiple domai...