With the advent of massively parallel computer systems, scientists are now able to simulate complex phenomena (e.g., explosions of stars). Such scientific simulations typically generate large-scale data sets over the spatio-temporal space. Unfortunately, the sheer size of the generated data sets makes efficient exploration of them impossible. Constructing queriable statistical models is an essential step in helping scientists glean new insight from their computer simulations. We define queriable statistical models to be descriptive statistics that (1) summarize and describe the data within a user-defined modeling error, and (2) are able to answer complex range-based queries over the spatio-temporal dimensions. In this chapter, we describe ...
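The idea of a queriable statistical model can be illustrated with a minimal sketch (hypothetical code, not the chapter's actual system): partition a field into fixed-size blocks, keep only descriptive statistics per block, and answer a range query approximately from those summaries alone. All names (`BlockStats`, `build_model`, `range_mean`) are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class BlockStats:
    start: int      # inclusive start index of the block
    end: int        # exclusive end index of the block
    count: int      # number of values summarized
    mean: float     # descriptive statistics kept in place of raw data
    minimum: float
    maximum: float

def build_model(values, block_size):
    """Partition `values` into blocks, keeping only per-block statistics."""
    blocks = []
    for start in range(0, len(values), block_size):
        chunk = values[start:start + block_size]
        blocks.append(BlockStats(
            start=start,
            end=start + len(chunk),
            count=len(chunk),
            mean=sum(chunk) / len(chunk),
            minimum=min(chunk),
            maximum=max(chunk),
        ))
    return blocks

def range_mean(blocks, lo, hi):
    """Approximate the mean over indices [lo, hi) from summaries alone."""
    total, n = 0.0, 0
    for b in blocks:
        overlap = min(b.end, hi) - max(b.start, lo)
        if overlap > 0:
            # Treat values as uniform within a block; the deviation from
            # this assumption is the modeling error the user bounds.
            total += b.mean * overlap
            n += overlap
    return total / n if n else float("nan")
```

Because the query touches only block summaries, its cost depends on the number of blocks rather than the raw data size, which is what makes the approach viable for terabyte-scale simulation output.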
The combination of high-performance computing towards Exascale power and numer...
Data science uses methods, processes, and algorithms to extract knowledge and insights from structured a...
Better instruments, faster and bigger supercomputers and easier collaboration and sharing of data in...
With the advent of fast computer systems, scientists are now able to generate terabytes of simulatio...
As datasets grow beyond the gigabyte scale, there is an increasing demand to develop techniques for ...
We present our approach to enabling approximate ad hoc queries on terabyte-scale mesh data generated...
In this paper, we describe AQSim, an ongoing effort to design and implement a system to manage terab...
This thesis proposes new analysis tools for simulation models in the presence of data. To achieve a ...
We introduce statistical techniques required to handle complex computer models with potential applic...
This dissertation explores Machine Learning in the context of computationally intensive simulations....
Uncertainty quantification (UQ) is both an old and new concept. The current novelty lies in the ...
Modeling with Data fully explains how to execute computationally intensive analyses on very large da...
Modelling and simulation are widely considered essential for the analysis of complex systems and nat...