LHC Experiments’ Software
Lucia Silvestris, INFN-Bari
LISHEP 2006 - International School on High Energy Physics
Rio de Janeiro, Brazil
Large Hadron Collider & Experiments
The startup
Large Hadron Collider: 27 km around
Trigger challenge: a demanding task for LHC!!
LHC Detector Requirements
Very good electromagnetic calorimetry for electron and photon identification (e.g. H -> gamma gamma)
Good hadronic calorimetry for jet reconstruction and missing transverse energy measurement
Efficient, high-resolution tracking for particle momentum measurement, b-quark tagging and vertexing (primary and secondary vertices)
Excellent muon identification with precise momentum reconstruction
A Generic Multipurpose LHC Detector
About 10 interaction lengths (λ) of material are needed to shield the muon system from hadrons produced in the p-p collisions
Experiments at LHC
CMS - Compact Muon Solenoid
ATLAS - A Toroidal LHC ApparatuS
LHCb - Study of CP violation in B-meson decays at the LHC
ALICE - A Large Ion Collider Experiment
LHC startup plan
Stage 1 - Initial commissioning: 43x43 to 156x156 bunches, N = 3x10^10, zero to partial squeeze, L = 3x10^28 - 2x10^31
Stage 2 - 75 ns operation: 936x936 bunches, N = 3-4x10^10, partial squeeze, L = 10^32 - 4x10^32
Stage 3 - 25 ns operation: 2808x2808 bunches, N = 3-5x10^10, partial to near full squeeze, L = 7x10^32 - 2x10^33
Pilot Run
Pilot run luminosity: about 30 days, maybe less (?); 43x43 bunches, then 156x156 bunches.
[Plot: pilot-run luminosity (10^30 cm^-2 s^-1), integrated luminosity (pb^-1) and events/crossing over days 1-29, with luminosity climbing from ~10^28 to ~10^31 cm^-2 s^-1; assumes LHC efficiency = 20% (optimistic!).]
Startup plan and Software
Turn-on is fast:
- Pile-up increasing rapidly
- Timing evolution (43x43, to 75 ns, to 25 ns)
- LOTS of physics
For all detectors:
- Commission detector and readout
- Commission trigger systems
- Calibrate/align detector(s)
- Commission computing and software systems
- Rediscover the Standard Model
Software domains: Simulation; Reconstruction; Trigger; Monitoring; Calibration/Alignment (calculation and application); user-level data objects (selection); Analysis; Visualization; SW development tools.
LHC startup: CMS/ATLAS
Integrated luminosity with the current LHC plans
[Plot: 2008 run - luminosity (10^30 cm^-2 s^-1), integrated luminosity (pb^-1) and events/crossing over weeks 1-20, with luminosity rising from ~10^31 to ~10^33 cm^-2 s^-1 and milestones for top re-discovery, Higgs (?), Z' -> muons and SUSY; assumes LHC efficiency = 30% (optimistic!), reaching ~1.9 fb^-1, i.e. ~1 fb^-1 (optimistic?).]
Physics Startup plans
- ALICE: minimum-bias proton-proton interactions - the standard candle for the heavy-ion runs
- LHCb: B_s mixing, sin 2β repeat (if the Tevatron has not done it already)
- ATLAS/CMS: measure jet, W and Z production; in 15 pb^-1 there will be 30K W's and 4K Z's into leptons. Measure cross sections and the W and Z charge asymmetry (PDFs; top!)
- Luminosity?
Multiplicity paper outline:
- Introduction
- Detector system: Pixel (& TPC)
- Analysis method
- Presentation of data: dN/dη and multiplicity distribution (√s dependence)
- Theoretical interpretation: ln²(s) scaling?, saturation, multi-parton inter…
- Summary
pT paper outline:
- Introduction
- Detector system: TPC, ITS
- Analysis method
- Presentation of data: pT spectra and pT-multiplicity correlation
- Theoretical interpretation: soft vs. hard, mini-jet production…
- Summary
Startup physics (ALICE)
Can publish two papers 1-2 weeks after LHC startup
Where are we?
Common Software
LCG Application Area
Deliver the common physics applications software for the LHC experiments (http://lcgapp.cern.ch/)
Organized to ensure focus on real experiment needs:
- Experiment-driven requirements and monitoring
- Architects in management and execution
- Open information flow and decision making
- Participation of experiment developers
- Frequent releases enabling iterative feedback
Success is defined by adoption and validation of the products by the experiments: integration, evaluation, successful deployment.
Software Domain Decomposition
[Diagram: software domain decomposition. Core: foundation utilities, plugin manager, dictionary, math libs, I/O, interpreter, OS binding, GUI, 2D/3D graphics, geometry, histograms, fitters, NTuple, physics. Simulation: engines, generators. Data Management: persistency, file catalog, database, conditions, collections. Grid services: batch, interactive. On top sit the experiment frameworks, running the simulation, reconstruction and analysis programs with their event, detector and calibration algorithms.]
Simplified Software Decomposition
[Diagram: layered stack - Applications on top, built on the Exp. Framework, which rests on Core Libraries, the Simulation, Data Mgt. and Distrib. Analysis domains, and non-HEP-specific software packages.]
Every experiment has a framework for basic services and various specialized frameworks: event model, detector description, visualization, persistency, interactivity, simulation, etc.
Many non-HEP libraries are widely used.
Applications are built on top of the frameworks and implement the required algorithms.
Core libraries and services are widely used and provide basic functionality.
Specialized domains are common among the experiments.
(Common SW vs. Experiment SW)
Application Area Projects
ROOT - Core Libraries and Services: foundation class libraries, math libraries, framework services, dictionaries, scripting, GUI, graphics, SEAL libraries, etc.
POOL - Persistency Framework: storage manager, file catalogs, event collections, relational access layer, conditions database, etc.
SIMU - Simulation project: simulation framework, physics validation studies, MC event generators, Garfield, participation in Geant4 and Fluka.
SPI - Software Process Infrastructure: software and development services - external libraries, Savannah, software distribution, support for build, test, QA, etc.
ROOT: Core Library and Services
The ROOT activity at CERN is fully integrated in the LCG organization (planning, milestones, reviews, resources, etc.).
The main change during the last year has been the merger of the SEAL and ROOT projects:
- Single development team
- Adiabatic migration of the software products into a single set of core software libraries
- 50% of the SEAL functionality has been migrated into ROOT (math libs, reflection, Python scripting, etc.)
ROOT is now at the “root” of the software for all the LHC experiments.
Web Page: http://root.cern.ch/
ROOT: Core Library and Services
Current work packages (SW components):
- BASE: foundation and system classes, documentation and releases
- DICT: reflection system, meta classes, CINT and Python interpreters
- I/O: basic I/O, Trees, queries
- PROOF: parallel ROOT facility, xrootd
- MATH: mathematical libraries, histogramming, fitting
- GUI: graphical user interfaces and object editors
- GRAPHICS: 2-D and 3-D graphics
- GEOM: geometry system
- SEAL: maintenance of the existing SEAL packages
ROOT: I/O
Recent developments of ROOT I/O and Trees.
General I/O:
- STL collections
- Data compression using reduced precision (save space) and options to increase precision
- Alternatives to default constructors
- Other I/O improvements
ROOT: I/O
TTree extensions - new features:
- Fast merging
- Indexing of TChains
- TTree interface enhancements
- TRef and pool::Reference
- Browsing
Browse extensions: split objects, unsplit objects, collections; can now also see simple member functions, transient members and persistent members.
Main focus: consolidation (thread safety) and generic object-reference support.
Important features requested by the experiments are implemented.
ROOT: Math
New developments of the ROOT mathematical libraries:
- New Vector package (3D and 4D)
- SMatrix (for small matrices with fixed size)
- Fitting and minimization: Minuit2 (C++), linear fitter, robust fitter, SPlot (unfolding)
ROOT: Graphics - Detector Geometries
[Screenshots: ALICE, LHCb, ATLAS and CMS detector geometries rendered with ROOT graphics.]
ROOT: Graphics - Events
Data Management
FILES - based on ROOT I/O:
- Targeted at complex data structures: event data, analysis data
- Based on Reflex object dictionaries
- Management of object relationships: file catalogues
- Interface to Grid file catalogs and Grid file access
Relational databases - Oracle, MySQL, SQLite:
- Suitable for conditions, calibration, alignment and detector description data, possibly produced by online systems
- Complex use cases and requirements, multiple 'environments' - difficult to satisfy with a single solution
- Isolating applications from the database implementations with a standardized relational database interface: facilitates the life of the application developers, requires no change in the application to run in different environments, and encodes "good practices" once for all
- Focus moving to deployment and experiment support
Persistency framework
The AA/POOL project is delivering a number of "products":
- POOL - object and references persistency framework
- CORAL - generic database access interface
- ORA - mapping C++ objects into a relational database
[Diagram: user code calls the POOL API (storage manager, collections, file catalog) and the COOL API; both sit on CORAL and are implemented over ROOT I/O and RDBMS back-ends (Oracle, SQLite, MySQL).]
- COOL - detector conditions database
Object storage and references are successfully used in large-scale production in ATLAS, CMS and LHCb. The focus now needs to move to database access and deployment on the Grid - basically starting now.
http://pool.cern.ch/
CORAL: Generic database access interface
[Diagram: client software uses the CORAL interfaces (C++ abstract classes, the user-level API) and CORAL C++ types (row buffers, Blob, Date, TimeStamp, ...). A common implementation exposes developer-level interfaces to plug-in libraries loaded at run time - RDBMS implementations (Oracle, SQLite, Frontier, MySQL), authentication services (XML, environment), lookup services (XML, LFC) and the relational, monitoring and connection service implementations - interacting only through the interfaces.]
http://pool.cern.ch/coral/
CORAL: Generic database access interface
A software system for vendor-neutral access to relational databases: a C++, SQL-free API.
CORAL is integrated in the software of the LHC experiments (CMS, ATLAS and LHCb) directly (e.g. online applications) and indirectly (COOL, POOL).
Example 1: table creation

  coral::ISchema& schema = session.nominalSchema();
  coral::TableDescription tableDescription;
  tableDescription.setName( "T_t" );
  tableDescription.insertColumn( "I", "long long" );
  tableDescription.insertColumn( "X", "double" );
  schema.createTable( tableDescription );

Generated SQL - Oracle:

  CREATE TABLE "T_t" ( I NUMBER(20), X BINARY_DOUBLE )

Generated SQL - MySQL:

  CREATE TABLE T_t ( I BIGINT, X DOUBLE PRECISION )
Conditions Database
Databases to store time-varying data.
COOL:
- Holds condition data for reconstruction and analysis
- Accesses data from PVSS, the local file catalog (LFC) and bookkeeping
- Implementations on Oracle, MySQL and SQLite
[Diagram: experiment frameworks and sub-detectors access time-varying multi-version (~offline) and single-version (~online) conditions data through the Conditions DB access layer (COOL) and experiment-common software and conventions; COOL is built on C++ relational access (CORAL), whose OracleAccess, MySQLAccess and SQLiteAccess plug-ins use the native Oracle OCI, MyODBC and SQLite APIs; deployment and replication are handled by the Relational Database Deployment and Data Distribution (3D) project.]
Now in the deployment phase (ATLAS and LHCb), fully integrated in the experiment frameworks. Benefits from other LCG projects: CORAL, SEAL/ROOT and the 3D project.
http://pool.cern.ch/CondDB/
Simulation
MC generators:
- MC generators are specialized in different physics domains and developed by different authors
- Support for the LHC experiments and collaboration with the authors need to be guaranteed
Simulation engines:
- Geant4 and Fluka are well-established products
Common additional utilities required by the experiments:
- Interoperability between MC generators and simulation engines
- Interactivity, visualization and analysis facilities
- Geometry and event data persistency
- Comparison and validation (between engines, and against real data)
http://lcgapp.cern.ch/project/simu
Simulation framework utilities
HepMC: C++ event record for Monte Carlo generators.
GDML: Geometry Description Markup Language:
- Geometry interchange format or geometry source
- GDML writers and readers exist for Geant4 and ROOT
Geant4 geometry persistency: saving/retrieving Geant4 geometries with ROOT I/O.
FLUGG: using Geant4 geometry from FLUKA:
- Framework for comparing simulations
- Example applications have been developed
Python interface to Geant4:
- Python bindings to G4 classes
- Steering Geant4 applications from Python scripts
Utilities for MC truth handling.
Simulation components
[Diagram: MC generators (e.g. Pythia) produce HepMC events; steering Python scripts drive the Geant4 and FLUKA engines (FLUKA via Flugg); geometry flows through GDML writers/readers, a text editor, TGeo and geom.root files; MC truth is written to ROOT files and the MCDB.]
Distributed data analysis
A full spectrum of different analysis applications will co-exist:
- Data analysis applications using the full functionality provided by the experiment's framework (analysis tools, databases, etc.): these require a big fraction of the available software packages, are very demanding on computing and I/O, and typically run as batch processing
- Final analysis of ntuple-like data (ROOT trees): fast turn-around (interactive), with easy migration from local to distributed processing (PROOF)
Tools to help the physicists are being made available:
- Large-scale Grid job submission (GANGA)
- Parallelization of the analysis jobs (PROOF)
Application Area Highlights - SPI
SPI is concentrating on the following areas:
- Savannah service (bug tracking, task management, etc.): >160 hosted projects, >1350 registered users (doubled in one year); web page: http://savannah.cern.ch/
- Software services (installation and distribution of software): >90 external packages installed in the external service
- Software development service: tools for development, testing, profiling, QA
- Web, HyperNews, documentation
SPI Web Page http://lcgapp.cern.ch/project/spi/
SPI - Software Configuration
An LCG configuration is a combination of packages and versions which are coherent and compatible. Configurations are given names like "LCG_40". The experiments build their application software on a given LCG configuration:
- Interfaces to the experiments' configuration systems are provided (SCRAM, CMT)
- Concurrent configurations are an everyday situation
Configurations are decided in the Architects Forum (AF).
SPI - Software Releases
The AA/experiments software stack is quite large and complex: many steps and many teams are involved.
Only 2-3 production-quality releases per year are affordable - complete documentation, complete platform set, complete regression tests, test coverage, etc.
Feedback is required before the production release is made; there is no clear solution yet on how to achieve this, and it is currently under discussion.
Bug-fix releases are made as often as needed, with quick reaction time and minimal time to release.
[Diagram: release order through the stack - non-HEP-specific software packages and Core Libraries first, then Simulation / Data Mgt. / Distrib. Analysis, then the Exp. Framework, then Applications.]
Where are we?
Individual experiments
Software Domain Decomposition
[Same software domain decomposition diagram as before, now highlighting the experiment-specific parts: the experiment frameworks and the simulation, reconstruction and analysis programs with their event, detector and calibration algorithms.]
Experiments' Software Architecture & Frameworks
Frameworks: ATLAS+LHCb (I)
ATLAS+LHCb: Athena/Gaudi
[Diagram: Gaudi architecture - the Application Manager runs Algorithms, which exchange data through the Transient Event Store, Transient Detector Store and Transient Histogram Store; the Event Data, Detector Data and Histogram services use Converters and Persistency Services to read/write data files; common services include the Message Service, JobOptions Service, Particle Properties Service and other services.]
Frameworks: Alice (II)
Framework CMS: Component Architecture (III)
CMS: a new framework in 2005.
Five types of dynamically loadable processing components:
- Source: provides the Event to be processed
- OutputModule: stores the data from the Event
- Producer: creates new data to be placed in the Event
- Filter: decides if processing should continue for an Event
- Analyzer: studies properties of the Event
Components only communicate via the Event.
Components are configured at the start of a job using a ParameterSet.
Framework CMS: Processing Model (IV)
- The Source creates the Event
- The Event is passed to execution paths
- A Path is an ordered list of Producer/Filter/Analyzer modules
- Producers add data to the Event
- The OutputModule is given the Event if certain Paths run to completion, and writes it to a POOL file
Framework CMS: Accessing Event Data (VI)
The Event class allows multiple ways to access data:

  // Ask by module label and default product label
  Handle<TrackVector> trackPtr;
  event.getByLabel( "tracker", trackPtr );

  // Ask by module and product label
  Handle<SimHitVector> simPtr;
  event.getByLabel( "detsim", "pixel", simPtr );

  // Ask by type
  vector<Handle<SimHitVector> > allPtr;
  event.getByType( allPtr );

  // Ask by Selector
  ParameterSelector<int> coneSel( "coneSize", 5 );
  Handle<JetVector> jetPtr;
  event.get( coneSel, jetPtr );
Framework CMS: Job Configuration (IX)
Job configuration is done in the configuration file. After configuration is complete, all components will have been loaded into the application.

  process RECO = {
    source = PoolSource { string filename = "test.root" }
    module tracker = TrackFinderProducer {}
    module out = PoolOutputModule { string filename = "test2.root" }
    path p = { tracker, out }
  }
Simulation and Detector Description in the experiments
Simulation (I)
Geant4: a success story, deployed by all experiments.
- Functionality essentially complete; detailed physics studies performed by all experiments
- Very reliable in production (failure rate better than 1:10^5)
- Good collaboration between the experiments and the Geant4 team
- Lots of feedback on physics (e.g. from test beams)
- LoH (Level of Happiness): very high
LHCb: ~18 million volumes; ALICE: ~3 million volumes
Simulation: ATLAS (II)
ATLAS Detector Description
Simulation: ATLAS (III)
Simulation: Alice (IV)
The FLUKA VMC implementation is completed.
Testing is well advanced:
- TGeo/FLUKA validation completed
- Good agreement with G3 and test beam
The FLUKA VMC will be used in the next ALICE physics data challenge.
Plan to use Geant4 as an alternative simulation engine (under development).
Simulation: CMS (V)
The CMS detector description system (DDD) provides an application-independent way to describe the geometry:
- Simulation, reconstruction, event display, etc. use by definition the same geometry
Geometry data are stored in a database with a hierarchical versioning system. Alignment corrections are applied with reference to a given baseline geometry.
Simulation: CMS (VI)
The event generator framework interfaces multiple packages, including the Genser distribution provided by LCG-AA.
Simulation with Geant4 since end 2003:
- >100M events fully simulated since mid-2005
- 1 crash in 10^6 events in the latest productions
Digitization tested and tuned with test beam.
Simulation chain: Generation -> Detector Simulation -> Digitization
- Hit collection: hit objects with timing, position and energy-loss info, based on Geant4
- Digi collection: digi objects which include realistic modeling of the electronic signal
- MC truth collection: info from the particle gun or physics generator about vertices and particles, stored in HepMC format
Simulation (II)
Tuning to data: ongoing; very good progress made.
[Plots: Geant4/data comparisons for e/… - CMS HCAL (brass/scintillator) and ATLAS Tilecal (Fe/scintillator).]
GEANT4 - Improvements in Geant4.8
- Improvements in the multiple scattering process, addressing issues with 'electron transport'
- Speedups for initialisation/navigation: an option to re-optimise only the parts that change with the run, and new voxelisation options being studied for regular geometries
- Overlap checks at geometry construction
- Revised implementation of particles (impacting advanced users who customize them)
- Refinements in hadronic physics
End Lecture 1