The data model of the CMS experiment is outlined, and the role that dedicated analysis software tools and the Physics Analysis Toolkit (PAT) play within it is described. These tools support the standardization of common analysis operations, such as the association, combination, or isolation of reconstructed objects, in a user-configurable way. They facilitate event content management and data access for the end user while preserving the full flexibility of the CMS data model.
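To make the user-configurable selection and event content management concrete, the sketch below shows what a minimal PAT configuration can look like in a CMSSW Python configuration file. Module and sequence names such as patDefaultSequence and selectedPatMuons, as well as the file names and cut values, are release-dependent and are assumed here purely for illustration.

# Minimal sketch of a PAT configuration (assumes a CMSSW release in which
# PhysicsTools.PatAlgos provides patDefaultSequence and selectedPat* modules).
import FWCore.ParameterSet.Config as cms

process = cms.Process("PAT")

# Standard framework services and a placeholder input file.
process.load("FWCore.MessageService.MessageLogger_cfi")
process.source = cms.Source("PoolSource",
    fileNames = cms.untracked.vstring("file:reco_sample.root")  # placeholder
)
process.maxEvents = cms.untracked.PSet(input = cms.untracked.int32(100))

# Default PAT production sequence: builds pat::Muons, pat::Electrons,
# pat::Jets, etc. from the RECO/AOD collections.
process.load("PhysicsTools.PatAlgos.patSequences_cff")

# User-configurable selection: tighten the muon selection via a string cut.
process.selectedPatMuons.cut = "pt > 20. && abs(eta) < 2.4"

# Event content management: keep only the selected PAT collections
# in the output file instead of the full event content.
process.out = cms.OutputModule("PoolOutputModule",
    fileName = cms.untracked.string("patTuple.root"),
    outputCommands = cms.untracked.vstring(
        "drop *",
        "keep *_selectedPatMuons_*_*",
        "keep *_selectedPatJets_*_*"
    )
)

process.p = cms.Path(process.patDefaultSequence)
process.outpath = cms.EndPath(process.out)

The design point illustrated here is that the same string-cut and output-command syntax applies to every PAT collection, so a group can standardize its object definitions by sharing a configuration fragment rather than analysis code.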
The CMS Data Analysis School is an official event organized by the CMS Collaboration to teach studen...
The CMS experiment expects to manage several Pbytes of data each year during the LHC programme, dist...
The CMS Physics Analysis Toolkit (PAT) is presented. The PAT is a high-level analysis layer enabling...
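As an illustration of PAT acting as a high-level analysis layer, the following sketch reads pat::Muon objects interactively with the FWLite Python bindings. The collection label selectedPatMuons and the input file name are assumptions chosen for the example, not prescribed by the toolkit.

# Sketch of interactive access to PAT objects via FWLite (assumes a PAT-tuple
# containing a std::vector<pat::Muon> collection labelled "selectedPatMuons").
from DataFormats.FWLite import Events, Handle

events = Events("patTuple.root")            # placeholder PAT-tuple file name
muons = Handle("std::vector<pat::Muon>")

for i, event in enumerate(events):
    event.getByLabel("selectedPatMuons", muons)
    for mu in muons.product():
        # pat::Muon combines reconstruction-level quantities with embedded
        # high-level information such as isolation and MC matching.
        print("event %d: muon pt = %.1f GeV, eta = %.2f" % (i, mu.pt(), mu.eta()))
    if i >= 9:                              # inspect only the first ten events
        break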
The CMS experiment is expected to start data taking during 2008, and large data samples, of the Peta...
Since 2009, the CMS experiment at the LHC has provided intensive training on the use of Physics Analy...
The CMS Analysis Tools model has now been used robustly in a plethora of physics papers. This model ...
The CERN analysis preservation portal (CAP) comprises a set of tools and services aiming to assist r...
Analyzing physics data at LHC experiments is a complicated task involving multiple steps, sharing of...
The use of virtual data for enhancing the collaboration between large groups of scientists is explor...
To impart hands-on training in physics analysis, the CMS experiment initiated the concept of CMS Data ...
The CMS collaboration is undertaking a big effort to define the analysis model and to develop softwa...