Bringing a high-dimensional dataset into science-ready shape is a formidable challenge that often necessitates data compression. Compression has accordingly become a key consideration for contemporary cosmology, affecting public data releases and reanalyses searching for new physics. However, data compression optimized for a particular model can suppress signs of new physics, or even remove them altogether. We therefore provide a solution for exploring new physics \emph{during} data compression. In particular, we store additional agnostic compressed data points, selected to enable precise constraints of non-standard physics at a later date. Our procedure is based on the maximal compression of the MOPED algorithm, which optimally filters th...
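The MOPED construction the abstract above builds on can be sketched in a few lines: each parameter gets one compression vector, built from the derivative of the mean data vector and orthogonalized against the earlier vectors. The function name, the toy covariance, and the assumption of a parameter-independent covariance below are illustrative, not the authors' exact pipeline:

```python
import numpy as np

def moped_vectors(dmu, C):
    """MOPED-style compression vectors.

    dmu : (n_params, n_data) derivatives of the mean data vector,
          d<mu>/d(theta_alpha), one row per parameter.
    C   : (n_data, n_data) data covariance, assumed parameter-independent.
    Returns a (n_params, n_data) array of vectors b_alpha satisfying
    b_alpha . C . b_beta = delta_{alpha beta}.
    """
    Cinv = np.linalg.inv(C)
    B = []
    for d in dmu:
        v = Cinv @ d
        norm2 = d @ Cinv @ d
        for b in B:
            proj = d @ b       # overlap with an earlier compression vector
            v -= proj * b      # Gram-Schmidt orthogonalization
            norm2 -= proj ** 2
        B.append(v / np.sqrt(norm2))
    return np.array(B)

# Toy demo: 100 data points compressed to 3 numbers, one per parameter.
rng = np.random.default_rng(0)
n = 100
A = rng.normal(size=(n, n))
C = A @ A.T / n + np.eye(n)      # a well-conditioned toy covariance
dmu = rng.normal(size=(3, n))    # hypothetical mean derivatives
B = moped_vectors(dmu, C)
y = B @ rng.normal(size=n)       # three compressed data points
```

Storing extra compressed numbers for non-standard parameters then amounts to appending their mean derivatives as additional rows of `dmu`, so the orthogonalization yields one further compressed data point per extension parameter.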
Today's scientific simulations require a significant reduction of the data size because of extrem...
The leading modern theories of cosmological inflation are increasingly multi-dimensional. The “inflato...
Today's N-body simulations are producing extremely large amounts of data. The Hardware/Hybrid Acc...
We apply two compression methods to the galaxy power spectrum monopole/quadrupole and bispectrum mo...
The next generation of research experiments will introduce a huge data surge t...
The work presented in this thesis focuses on developing compression techniques to exploit fully the ...
Future surveys of the Universe face the dual challenge of data size and data statistics. The non-Gau...
Context. Future large scale cosmological surveys will provide huge data sets whose analysi...
Cosmological simulations may produce extremely large amounts of data, such that its successful run de...
We present a method for radical linear compression of datasets where the data are dependent on some ...
Many statistical models in cosmology can be simulated forwards but have intrac...
For the last 5 years, Accelogic has pioneered and perfected a radically new theory of numerical computing...
We show how the massive data compression algorithm MOPED can be used to reduce, by orders of magnitu...
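The orders-of-magnitude reduction referred to above comes from evaluating the likelihood on as many compressed numbers as there are parameters, instead of on the full data vector. A minimal single-parameter sketch (the linear model and white noise are toy assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
n_data = 1000

# Toy linear model mu(theta) = theta * t with unit-variance white noise.
t = np.linspace(0.0, 1.0, n_data)
theta_true = 2.0
x = theta_true * t + rng.normal(size=n_data)

# MOPED vector for one parameter with C = I: b = dmu / sqrt(dmu . dmu).
dmu = t                               # d<mu>/d(theta)
b = dmu / np.sqrt(dmu @ dmu)

# One compressed number replaces the 1000-point data vector.
y = b @ x

# Chi^2 on a theta grid: full data vs the single compressed statistic.
thetas = np.linspace(1.5, 2.5, 201)
chi2_full = np.array([np.sum((x - th * t) ** 2) for th in thetas])
chi2_comp = np.array([(y - th * (b @ dmu)) ** 2 for th in thetas])
```

For this linear-Gaussian toy model the compression is lossless: both chi-square curves are parabolas with the same minimum, so the compressed statistic recovers the same maximum-likelihood estimate at a fraction of the per-evaluation cost.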