Increased operational effectiveness and the dynamic integration of only temporarily available compute resources (opportunistic resources) will become increasingly important in the next decade, due to the scarcity of resources for future high energy physics experiments as well as the desired integration of cloud and high performance computing resources. This results in a more heterogeneous compute environment, which poses major challenges for the computing operations teams of the experiments. At the Karlsruhe Institute of Technology (KIT) we design solutions to tackle these challenges. In order to ensure efficient utilization of opportunistic resources and unified access to the entire infrastructure, we developed the Transparent Adaptive Resource Dynamic Integration System (TARDIS).
LHC experiments require significant computational resources for Monte Carlo simulations and real dat...
Over the last few decades, the need for computational power and data storage by collaborative, distr...
A distributed system is built by integrating loosely coupled software components and the ...
Demand for computing resources in high energy physics (HEP) shows highly dynamic behavior, while t...
The inclusion of opportunistic resources, for example from High Performance Computing (HPC) centers ...
The current experiments in high energy physics (HEP) produce data at enormous rates. To convert the measured ...
As a result of the excellent LHC performance in 2016, more data than expected was recorded, leadi...
To satisfy future computing demands of the Worldwide LHC Computing Grid (WLCG), opportunistic usage ...
With ever-greater computing needs and fixed budgets, big scientific experiments are turning to oppor...
Modern High Energy Physics (HEP) requires large-scale processing of extensive amounts of scientific...
With the ever-growing amount of data collected with the experiments at the Large Hadron Collider (LH...
Computing resource needs are expected to increase drastically in the future. The HEP experiments ATL...
The German CMS community (DCMS) as a whole can benefit from the various compute resources available...