Hundreds of physicists analyze data collected by the Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider using the CMS Remote Analysis Builder and the CMS global pool to exploit the resources of the Worldwide LHC Computing Grid. Efficient use of such an extensive and expensive resource is crucial. At the same time, the CMS collaboration is committed to minimizing time to insight for every scientist, pushing for as few restrictions as possible on access to the full data sample and supporting the free choice of applications to run on the computing resources. Supporting such a variety of workflows while preserving efficient resource usage poses special challenges. In this paper we report on three complementary approaches adopted in CMS...
The challenges expected for the next era of the Large Hadron Collider (LHC), both in terms of storag...
During the first two years of data taking, the CMS experiment has collected over 20 PetaBytes of dat...
CMS is one of the two general-purpose HEP experiments currently under construction for the Large Had...
CRAB3 is a workload management tool used by CMS physicists to analyze data acquired by the Compact M...
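CRAB3 tasks are driven by a small Python configuration file that the user submits with the `crab` command-line client. The sketch below illustrates the shape of such a `crabConfig.py`; the request name, pset file, dataset, and storage site are placeholders for illustration, not values taken from the text, and the CRABClient environment (e.g. a CMSSW setup) is assumed to be available.

```python
# Minimal CRAB3 configuration sketch (placeholder values throughout;
# assumes the CRABClient package is available in the environment).
from CRABClient.UserUtilities import config

config = config()

# General settings: a label for this task and a local working directory.
config.General.requestName = 'myAnalysis_v1'      # placeholder name
config.General.workArea = 'crab_projects'

# The CMSSW job to run: an 'Analysis' plugin driven by a cmsRun pset.
config.JobType.pluginName = 'Analysis'
config.JobType.psetName = 'pset.py'               # placeholder pset file

# Input data and job splitting.
config.Data.inputDataset = '/SomeDataset/SomeEra/MINIAOD'  # placeholder
config.Data.splitting = 'FileBased'
config.Data.unitsPerJob = 10

# Grid site where the job output is staged out.
config.Site.storageSite = 'T2_XX_Example'         # placeholder site
```

Submission would then proceed with `crab submit -c crabConfig.py`, after which CRAB3 handles job creation, tracking, and output management on the distributed infrastructure.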
Tier-2 computing sites in the Worldwide Large Hadron Collider Computing Grid (WLCG) host CPU-resourc...
At the Large Hadron Collider (LHC), more than 30 petabytes of data are produced from particle collis...
The CMS experiment expects to manage several Pbytes of data each year during the LHC programme, dist...
The CMS collaboration is undertaking a major effort to define the analysis model and to develop softwa...
The execution chain of a typical CMS analysis workflow starts once it has been configured and submitted by the...
The Compact Muon Solenoid Experiment (CMS) at the Large Hadron Collider (LHC) at CERN has brilliant ...
While a majority of CMS data analysis activities rely on the distributed computing infrastructure on...