René Brun, Federico Carminati and Giuliana Galli Carminati (eds.), From the Web to the Grid and Beyond: Computing Paradigms Driven by High-Energy Physics, The Frontiers Collection, DOI 10.1007/978-3-642-23157-5_1, © Springer-Verlag Berlin Heidelberg 2011
1. Technologies, Collaborations and Languages: 20 Years of HEP Computing
René Brun
CERN, Geneva, Switzerland
Abstract
Research in HEP cannot be done without computers. The statistical nature of the data analysis process, the sheer amount of data to be processed and the complexity of the algorithms involved, which must be run repeatedly over millions of single-collision events, require large amounts of computing power. However, raw computing power is only one part of the story. The data treatment required to extract the physics results is very specific to the discipline, as are the data formats and algorithms. The consequence is that HEP physicists have to develop most of their code in house, and can only very rarely rely on commercial products. This has led HEP to develop very large and complex software systems for data simulation, reconstruction and analysis.
One additional complication comes from the fact that HEP experiments are one-off endeavours, each different from the others; otherwise there would be little point in building and operating them. This hinders code reuse from one experiment to the next and imposes additional constraints on the frameworks. Another complication comes from the fact that HEP is a computing-resource-hungry activity, where science is in practice limited by the amount of computing that can be bought with the available budget. Physicists must therefore be able to move their code to whatever new hardware and Operating System (OS) offers the best price-performance ratio on the market.
This chapter traces the evolution of the programmes written by HEP physicists for their research over the last 20 years.
1.1 Introduction
The major components of HEP computing are simulation, reconstruction and analysis. It is important to understand their relations and their respective roles in HEP research in order to appreciate the constraints and requirements that have guided their evolution.
Simulation, the third way to scientific knowledge after theory and experiment.
Modern HEP detectors are huge engineering endeavours. The ATLAS [] methods. The response of the sensitive elements to the passage of the particles is reproduced, as well as the planned electronic treatment of the generated electric signals, and finally the coding of these signals in binary form to be stored in computer files. Millions of events are thus simulated to assess the design of the detector and to train the reconstruction and analysis programmes. Once the detector is built, simulation is still essential. The simulation is compared to the actual experimental results and validated, i.e. tuned to provide results similar to those actually found in reality. This allows the simulation to estimate the corrections to be applied to the observed rate with which a given configuration is detected. By agreeing with what is seen, the simulation allows us to estimate how much is lost due to the detector geometry and efficiency. Simulation is mostly a CPU-bound activity, where the input data are limited to small configuration files and the output is similar to the output of the detector.
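To make the acceptance correction concrete, here is a minimal toy Monte Carlo sketch in C++ (not from the book): it generates isotropic particle directions, counts those falling inside a hypothetical barrel detector covering polar angles between 30 and 150 degrees, and derives the factor by which an observed rate would have to be corrected. The generator, angular coverage and event count are all invented for illustration.

```cpp
// Toy Monte Carlo (invented geometry): estimate geometric acceptance
// and the resulting rate-correction factor.
#include <cmath>
#include <cstdio>
#include <random>

int main() {
    std::mt19937 rng(42);
    // Isotropic emission corresponds to a flat distribution in cos(theta).
    std::uniform_real_distribution<double> cosDist(-1.0, 1.0);

    const double pi = std::acos(-1.0);
    // Hypothetical barrel covering polar angles from 30 to 150 degrees.
    const double cosThetaMax = std::cos(30.0 * pi / 180.0);
    const long nGenerated = 1000000;
    long nAccepted = 0;

    for (long i = 0; i < nGenerated; ++i) {
        double cosTheta = cosDist(rng);
        if (std::fabs(cosTheta) < cosThetaMax) // inside the sensitive region
            ++nAccepted;
    }

    // Correct an observed rate for what the detector does not see:
    // trueRate = observedRate / efficiency.
    double efficiency = double(nAccepted) / double(nGenerated);
    std::printf("geometric acceptance: %.4f\n", efficiency);
    std::printf("correction factor   : %.4f\n", 1.0 / efficiency);
    return 0;
}
```

In a real experiment the efficiency also folds in dead channels, thresholds and reconstruction losses, which is why the full detector response, not just the geometry, is simulated.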
Reconstruction, from electric signals to particles.
The output of a detector is a set of electric signals generated by the passage of particles through matter. These are the so-called raw data, or simply raw. The reconstruction process aims at processing these signals to determine the features of the particles that generated them, i.e. their mass, charge, energy and direction. In some sense this process is related to image processing: the signals are localised in space and time, and they are a sort of pixel that has to be recognised as part of a particle trajectory through space-time, or track. This process is also called tracking or track reconstruction. It is usually a very complex procedure where the information from different detectors is combined via sophisticated statistical methods. The results of this process are the tracks, i.e. sets of parameters identifying a trajectory in space, together with their statistical uncertainties and correlations. A very important element in the reconstruction is the calibration and alignment of the detectors. Each sensitive detector element has a specific relation between the intensity of the emitted signal and the physics quantity at the origin of the signal (energy, time, velocity, charge). To obtain optimal precision, all detecting elements have to be calibrated in order to give the same response to the same input. Moreover, the actual position of each detecting element can differ slightly from its ideal position. Both calibration and alignment can be determined from the recorded signals, using recursive procedures which take into account the response to the passing particles. The output of reconstruction is the Event Summary Data (ESD), containing the description of the tracks. Usually raw data are reconstructed several times, as the knowledge and understanding of the detector, as well as the reconstruction algorithms, improve. Depending on the complexity of the algorithms deployed, reconstruction can be an I/O-bound or a CPU-bound activity, where the input is the raw data and the output the ESD files.
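To give a flavour of the simplest form of track reconstruction, the sketch below fits a straight line to a handful of hits in one projection by weighted linear least squares, returning the track parameters together with their uncertainties. This is only a minimal illustration: real HEP tracking uses far more elaborate statistical methods (Kalman filters, combined multi-detector fits), and the hit positions and resolutions here are invented.

```cpp
// Minimal sketch (invented data): fit x(z) = a + b*z to detector hits by
// weighted linear least squares, with parameter uncertainties.
#include <cmath>
#include <cstdio>
#include <vector>

struct Hit { double z, x, sigma; }; // plane position, measured coordinate, resolution

struct LineFit { double a, b, sigmaA, sigmaB; };

LineFit fitLine(const std::vector<Hit>& hits) {
    // Accumulate the weighted sums entering the normal equations.
    double S = 0, Sz = 0, Sx = 0, Szz = 0, Szx = 0;
    for (const Hit& h : hits) {
        double w = 1.0 / (h.sigma * h.sigma);
        S   += w;           Sz  += w * h.z;       Sx  += w * h.x;
        Szz += w * h.z * h.z; Szx += w * h.z * h.x;
    }
    double det = S * Szz - Sz * Sz;
    LineFit f;
    f.a = (Szz * Sx - Sz * Szx) / det; // intercept
    f.b = (S * Szx - Sz * Sx) / det;   // slope (track direction)
    f.sigmaA = std::sqrt(Szz / det);   // uncertainties from the diagonal of
    f.sigmaB = std::sqrt(S / det);     // the inverse normal matrix
    return f;
}

int main() {
    // Five hits smeared around the true track x = 1 + 0.5*z.
    std::vector<Hit> hits = {
        {10, 6.1, 0.2}, {20, 10.9, 0.2}, {30, 16.2, 0.2},
        {40, 20.8, 0.2}, {50, 26.1, 0.2}};
    LineFit f = fitLine(hits);
    std::printf("a = %.3f +- %.3f, b = %.4f +- %.4f\n",
                f.a, f.sigmaA, f.b, f.sigmaB);
    return 0;
}
```

The fitted slope and intercept, with their uncertainties, are exactly the kind of "set of parameters identifying a trajectory" that the ESD stores for each track.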
Analysis, the final step before publication.
Once the tracks are identified, together with the degree of statistical uncertainty of the related information, analysis can start. This is the process by which the experimental results are compared with the theory in order to verify or falsify it. This is also when new phenomena, requiring novel theories, are discovered and described. During the analysis the ESD are read and used to produce mono- or multi-dimensional statistical distributions. The identification of the interesting events is made via selections, called cuts in the HEP jargon. Beyond isolating the interesting classes of events, the analysis aims at determining the frequency with which these events occur. This is a very delicate operation, as the detector efficiency is never 100% and, in particular, differs between event classes. While simulation and reconstruction tend to be centrally organised activities, analysis is both a centrally organised activity and a personal one, performed by each physicist willing to test a hypothesis. Analysis is usually an I/O-bound activity, where the input is the ESD, or a subset of them, and the output is various kinds of statistical distributions.
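The cut-and-histogram loop described above can be sketched in a few lines of C++ (not from the book): a toy "ESD" of events containing tracks, a selection on transverse momentum and pseudorapidity, and a one-dimensional histogram of the surviving tracks. The event content, cut values and binning are invented; a real analysis would read ESD files through a framework such as ROOT.

```cpp
// Toy analysis sketch (invented data): apply cuts to tracks and fill a
// one-dimensional transverse-momentum histogram.
#include <cmath>
#include <cstdio>
#include <vector>

struct Track { double pt, eta; };             // simplified stand-in for ESD track content
struct Event { std::vector<Track> tracks; };

int main() {
    // Toy "ESD": three events with a few tracks each (values invented).
    std::vector<Event> esd = {
        {{{1.2, 0.3}, {0.4, 1.9}}},
        {{{3.5, -0.7}}},
        {{{0.8, 2.6}, {2.1, 0.1}, {4.6, -1.2}}}};

    // Histogram: 10 bins from 0 to 5 GeV/c.
    const int nBins = 10;
    const double ptMax = 5.0;
    int hist[nBins] = {0};

    for (const Event& ev : esd)
        for (const Track& t : ev.tracks)
            // The "cuts": keep central tracks above a momentum threshold.
            if (t.pt > 1.0 && std::fabs(t.eta) < 2.0) {
                int bin = static_cast<int>(t.pt / ptMax * nBins);
                if (bin >= 0 && bin < nBins)
                    ++hist[bin];
            }

    for (int i = 0; i < nBins; ++i)
        std::printf("[%3.1f, %3.1f) GeV/c : %d\n",
                    i * ptMax / nBins, (i + 1) * ptMax / nBins, hist[i]);
    return 0;
}
```

The raw counts in such a histogram would then be divided by the efficiencies obtained from simulation, closing the loop between the three activities described in this section.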
All these activities require heavy usage of computing resources (see Fig. ) and the development of programmes and frameworks specific to HEP. In the rest of this chapter we will describe the story of these programmes over the last 20 years.