Abstract
The Large Hadron Collider (LHC) at CERN, Geneva, is about to deliver its first collision data to the CMS experiment. The CMS computing, software and analysis projects must meet the expected performance in archiving of raw detector data, prompt calibration and primary reconstruction at the host laboratory, data distribution and handling at Tier-1 centres, and data access and distributed physics analysis at Tier-2 centres. Hundreds of physicists at nearly 200 institutions around the world will then expect to find the infrastructure needed to easily access and process experimental data, spanning a broad range of activities from low-level detector performance evaluation to complex discovery physics analyses. Over the past two years, CMS has conducted computing, software and analysis challenges to demonstrate the functionality, scalability and usability of its computing and software components. These challenges were designed to validate the CMS distributed computing model by exercising many components simultaneously. We present the major advances demonstrated by CMS in event processing, data transfers and analysis processing during the CSA07 and ongoing CCRC08 data challenges. In particular, we describe the relevant functional tests and the scale achieved by each CMS component and externally provided component. We also summarize the main physics analysis lessons drawn from these challenges and the ongoing tuning in preparation for an optimal start of the experiment. Note that the present report contains results updated during 2008 with respect to those presented at the conference.
© 2010 Springer-Verlag US
Kreuzer, P. (2010). Distributed Computing and Data Analysis for CMS in View of the LHC Startup. In: Lin, S., Yen, E. (eds) Production Grids in Asia. Springer, Boston, MA. https://doi.org/10.1007/978-1-4419-0046-3_7
Print ISBN: 978-1-4419-0102-6
Online ISBN: 978-1-4419-0046-3