This project investigated layout and compression techniques for large, unstructured simulation data to reduce bandwidth requirements and latency in simulation I/O and subsequent post-processing, e.g., data analysis and visualization. The main goal was to eliminate the data-transfer bottleneck (for example, from disk to memory and from the central processing unit to the graphics processing unit) through coherent data access and by trading underutilized compute power for effective bandwidth and storage. This was accomplished by (1) designing algorithms that both enforce and exploit compactness and locality in unstructured data, and (2) adapting offline computations to a novel stream processing framework that supports pipelining and low-latency seque...
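Point (1), enforcing and exploiting locality in unstructured data, can be made concrete with a space-filling-curve reordering. The minimal sketch below is illustrative only: the choice of a Morton/Z-order curve, the 10-bit per-axis quantization, and all function names are assumptions, not details taken from the project. It sorts unstructured 3D points along a Z-order curve so that spatially nearby points also land nearby in memory and on disk, which improves access coherence and typically helps downstream compression.

```python
import numpy as np

def morton3d(ix, iy, iz, bits=10):
    """Interleave the bits of three integer coordinates into one Morton code.

    Points whose codes are close tend to be close in 3D space, so sorting
    by code yields a spatially coherent linear order.
    """
    code = np.zeros(ix.shape, dtype=np.uint64)
    for b in range(bits):
        code |= ((ix >> np.uint64(b)) & np.uint64(1)) << np.uint64(3 * b)
        code |= ((iy >> np.uint64(b)) & np.uint64(1)) << np.uint64(3 * b + 1)
        code |= ((iz >> np.uint64(b)) & np.uint64(1)) << np.uint64(3 * b + 2)
    return code

def zorder_permutation(points, bits=10):
    """Return the index permutation that sorts points along a Z-order curve.

    points: (N, 3) float array; coordinates are quantized onto a 2**bits grid
    spanning the bounding box (a simplifying assumption for this sketch).
    """
    lo = points.min(axis=0)
    hi = points.max(axis=0)
    scale = (2**bits - 1) / np.maximum(hi - lo, np.finfo(points.dtype).tiny)
    q = ((points - lo) * scale).astype(np.uint64)
    return np.argsort(morton3d(q[:, 0], q[:, 1], q[:, 2], bits))

# Usage: compute the permutation once, then apply it to the coordinates and
# every attached field, so streaming reads touch contiguous, coherent chunks.
rng = np.random.default_rng(0)
pts = rng.random((100_000, 3), dtype=np.float32)
perm = zorder_permutation(pts)
pts_sorted = pts[perm]  # spatially coherent layout for streaming/compression
```

Reordering is a one-time offline cost; afterwards, both sequential streaming and block-wise compression operate on spatially correlated values, which is where the bandwidth savings come from.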
Large-scale simulations easily produce vast amounts of data that cannot always be evaluated in situ....
The post-processing tool POST has been developed at the DLR Institute of Propulsion Technology speci...
The analysis of large unstructured or meshfree data is challenging due to its sheer size and unorg...
Simulation models can generate tremendous quantities of data. This paper describes data compression ...
Increasing numbers of cores in parallel computer systems are allowing scientific simulations...
Numerical simulations produce large amounts of data. A key challenge is how to cope with the resulti...
Research within the physical sciences is becoming increasingly dependent on the ability to create co...
Recent improvements in computational capability have given scientists increased ability to simulate ...
Great advancements in commodity graphics hardware have favoured graphics processing unit (GPU)-based...
Because of the ever-increasing data being produced by today's high performance computing (HPC) sc...
Large-scale numerical simulations of high-intensity focused ultrasound (HIFU), important for model-b...
The objective of data compression is to avoid redundancy in order to reduce the size of the data to ...
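To make the redundancy argument above concrete, here is a minimal run-length-encoding sketch (a generic textbook scheme chosen for illustration, not the method of the cited work): repeated identical values are stored once with a count, so the encoded size shrinks with the amount of repetition in the data.

```python
def rle_encode(values):
    """Run-length encode a sequence as (value, run_length) pairs,
    removing the redundancy of repeated identical neighbors."""
    runs = []
    for v in values:
        if runs and runs[-1][0] == v:
            runs[-1][1] += 1  # extend the current run
        else:
            runs.append([v, 1])  # start a new run
    return runs

def rle_decode(runs):
    """Invert the encoding exactly (the scheme is lossless)."""
    out = []
    for v, n in runs:
        out.extend([v] * n)
    return out

data = [0, 0, 0, 0, 7, 7, 3, 3, 3, 3, 3]
enc = rle_encode(data)          # [[0, 4], [7, 2], [3, 5]]
assert rle_decode(enc) == data  # round-trip is exact
```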
University of Minnesota Ph.D. dissertation. November 2017. Major: Computer Science. Advisor: George ...
Figure 1: Frames from real-time rendering of animated supernova data set (432³×60, float, 18 GB), com...
High-Performance Computing (HPC) systems provide input/output (IO) performance that grows relatively sl...