Duhem's problem arises especially in scientific contexts where the tools and procedures of measurement and analysis are numerous and complex. Several philosophers of cognitive science have cited its manifestations in fMRI as grounds for skepticism about the epistemic value of neuroimaging. To answer these Duhemian arguments for skepticism, I offer an alternative based on Deborah Mayo's error-statistical account, in which Duhem's problem is more fruitfully framed in terms of error probabilities. I illustrate this approach with examples including the use of probabilistic brain atlases, the comparison of different preprocessing protocols with respect to their error characteristics, and the statistical modeling of fMRI data. These examples demonstrate how error-statistical reasoning can address Duhemian skepticism about neuroimaging.