The distributed system CHARM [1] for data processing and analysis in particle physics experiments is presented. An effective solution of the twofold task of processing large volumes of information and supporting high-speed computing processes is realized on the basis of an inhomogeneous computer platform. The layout of the platform is shown in Fig. 1 (the structure of the CHARM-2003 system). With some reduction in the level of protection of the resources shown in the top part of the figure, and with the use of GRID infrastructure elements developed at LIT JINR, the configuration of the computer platform becomes significantly simpler (Fig. 2). The created system provides processing of big data sets (terabyte range) due to integrat...
The computer control system for the K9 separated beam is described. This beamline is a multi-energy/...
Computing in the field of high energy physics requires usage of heterogeneous computing resources an...
Rapid increase of data volume from the experiments running at the Large Hadron Collider (LHC) prompt...
The advantages and disadvantages of a stored-program digital computer as a data gathering, analyzing...
We are proposing to develop a fault-tolerant distributed computing and database system for use in Hi...
The main activities of the Laboratory of Information Technologies (LIT) of the Joint Institute for N...
Short descriptions of the U-70 accelerator complex treated as a fast cyclic technological process an...
The computing infrastructures of the modern high energy physics experiments need to address an unpre...
The selection and analysis of detector events of the heavy ion collider experiment ALICE at CERN are...
The control system has been designed for the 1.8 GeV synchrotron radiation source at Tohoku-Universit...
SuperB is an international enterprise aiming at the construction of a very high luminosity asymmetri...
The analysis of data produced by Isabelle experiments will need a large system of computers. Include...
Huge data volumes of Large Hadron Collider experiments require parallel end-user analysis on cluster...
Since the early appearance of commodity hardware, the utilization of computers rose rapidly, and the...
The paper summarizes general information, the latest results and development plans of the BE...