Abstract: High data volumes and high data throughput are central features of the CMS detector experiment in the search for new physics. The aim of this project is to develop prototype systems capable of speeding up and improving the quasi-real-time analyses performed by the trigger system during the data-acquisition stage of the experiment. This is important because the high-luminosity upgrade of the LHC is expected to increase the raw data throughput significantly. The options explored for improving trigger-farm performance are the use of GPUs to parallelize the razor-variable analysis, and inference based on distributed machine-learning algorithms.
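The razor-variable analysis mentioned above reduces each event to two "megajet" four-momenta plus missing transverse momentum, from which the mass scale M_R and the dimensionless ratio R² are computed. As a hedged illustration (the project itself targets GPUs; this NumPy sketch shows only the per-event arithmetic, vectorized over a batch of events, using the standard massless-megajet definitions; the function name and array layout are choices of this sketch, not from the source):

```python
import numpy as np

def razor_variables(p1, p2, met):
    """Compute razor variables (M_R, R^2) for a batch of events.

    p1, p2 : (N, 3) arrays of the two megajet momenta (px, py, pz),
             in the massless approximation.
    met    : (N, 2) array of missing transverse momentum (px, py).
    """
    mag1 = np.linalg.norm(p1, axis=1)          # |p1|
    mag2 = np.linalg.norm(p2, axis=1)          # |p2|
    pz_sum = p1[:, 2] + p2[:, 2]
    # Longitudinally boost-invariant mass scale M_R
    mr = np.sqrt((mag1 + mag2) ** 2 - pz_sum ** 2)

    pt1 = np.linalg.norm(p1[:, :2], axis=1)
    pt2 = np.linalg.norm(p2[:, :2], axis=1)
    met_mag = np.linalg.norm(met, axis=1)
    # Dot product of MET with the summed transverse megajet momenta
    met_dot = (met[:, 0] * (p1[:, 0] + p2[:, 0])
               + met[:, 1] * (p1[:, 1] + p2[:, 1]))
    # Transverse mass M_T^R built from the megajets and MET
    mtr = np.sqrt(np.maximum(met_mag * (pt1 + pt2) - met_dot, 0.0) / 2.0)

    r2 = (mtr / mr) ** 2                       # R^2 = (M_T^R / M_R)^2
    return mr, r2
```

Because the computation is identical and independent for every event, it maps naturally onto a GPU: the same array code could run on device memory (e.g. by swapping NumPy for a GPU array library), which is the parallelization opportunity the project exploits.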
Beginning in 2021, the upgraded LHCb experiment will use a triggerless readout system collecting dat...
The LHCb detector at the LHC is a general purpose detector in the forward region with a focus on rec...
The high luminosity expected from the LHC during the Run 3 and, especially, the Phase II of data tak...
We show how an event topology classification based on deep learning could be used to improve the pur...
The data acquisition system (DAQ) of the CMS experiment at the CERN Large Hadron Collider (LHC) asse...
In this note we will discuss the application of new technologies, such as GPU cards, in the current ...
Run 2 of the LHC represents one of the most challenging scientific environments for real time data a...
Machine learning algorithms are gaining ground in high energy physics for applications in particle a...
The Large Hadron Collider at CERN will undergo an upgrade in 2027 to increase the integrated luminos...
The High Luminosity LHC (HL-LHC) is a project to increase the luminosity of the Large Hadron Collide...
The LHCb experiment is designed to search for new physical phenomena in proton-proton collisions at ...
The High-Luminosity LHC (HL-LHC) will open an unprecedented window on the weak-scale nature of the u...
At the upcoming large hadron collider (LHC) at CERN one expects to measure 20,000 particles in a sin...