Ensuring the long-term reproducibility of data analyses requires results stability tests to verify that analysis results remain within acceptable variation bounds despite inevitable software updates and hardware evolutions. This paper introduces a numerical variability approach for results stability tests, which determines acceptable variation bounds using random rounding of floating-point calculations. By applying the resulting stability test to fMRIPrep, a widely-used neuroimaging tool, we show that the test is sensitive enough to detect subtle updates in image processing methods while remaining specific enough to accept numerical variations within a reference version of the application. This result contributes to enhancing the reliability of neuroimaging results.
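A minimal sketch of the idea described above, not the authors' implementation: a reference version of an analysis is executed several times under random rounding of floating-point operations to sample its numerical variability, acceptable variation bounds are derived from that sample, and a candidate result is accepted only if it stays within those bounds. The function name `run_pipeline`, the noise model standing in for random rounding, and the thresholds `k` and `tolerance` are illustrative assumptions, not values from the paper.

```python
# Sketch of a numerical-variability results stability test (illustrative only).
# `run_pipeline` is a hypothetical stand-in for an analysis executed inside an
# environment that randomly rounds floating-point operations (e.g., Monte Carlo
# arithmetic); here the rounding perturbation is faked with small Gaussian noise.
import numpy as np

def run_pipeline(seed: int) -> np.ndarray:
    """Hypothetical pipeline run under random rounding; returns a result map."""
    rng = np.random.default_rng(seed)
    true_map = np.linspace(0.0, 1.0, 1000)            # stand-in for e.g. a voxel map
    return true_map + 1e-6 * rng.standard_normal(true_map.size)

# 1) Characterise the reference version: repeat the analysis under random
#    rounding to sample its numerical variability.
reference_runs = np.stack([run_pipeline(seed) for seed in range(10)])
mean = reference_runs.mean(axis=0)
std = reference_runs.std(axis=0, ddof=1)

# 2) Acceptable variation bounds, here mean +/- k standard deviations per value
#    (an assumed choice of bound, not the paper's exact criterion).
k = 4.0
lower, upper = mean - k * std, mean + k * std

# 3) Stability test: a candidate result (e.g., from an updated software version)
#    passes if it stays within the bounds almost everywhere.
def is_stable(candidate: np.ndarray, tolerance: float = 0.001) -> bool:
    out_of_bounds = np.mean((candidate < lower) | (candidate > upper))
    return out_of_bounds <= tolerance

print(is_stable(run_pipeline(seed=123)))   # expected: True for the same version
```

A test built this way is specific to the reference version (its own random-rounding runs pass by construction) while an update that changes the processing method is expected to push results outside the sampled bounds and fail.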